Data Fines Aren’t Justice — They’re Just High-Stakes Monopoly Money
TL;DR:
Big tech firms are being hit with record fines for data misuse and anti-competitive practices, but the people whose data was compromised rarely see a penny. While enforcement headlines look impressive, regulators pocket the penalties and leave victims to navigate identity theft, legal hurdles, and red tape alone. It’s time for a fairer system, one that offers restitution, not just retribution.
When Fines Fall, Who Benefits?
The European Union recently fined Apple €500 million and Meta €200 million under the Digital Markets Act for anti-competitive behaviour and manipulative data practices. These fines follow a record €1.2 billion penalty against Meta in 2023 for unlawful data transfers under the GDPR (https://www.theguardian.com/technology/2023/may/22/facebook-fined-mishandling-user-information-ireland-eu-meta). In the UK, British Airways was fined £20 million in 2020 after a data breach exposed over 400,000 customers’ personal and financial data (https://www.theguardian.com/business/2020/oct/16/ba-fined-record-20m-for-customer-data-breach).
But here’s the uncomfortable truth: the people whose data was mishandled rarely, if ever, see a penny of these penalties. The fines almost always end up in government budgets — not in the pockets of the victims.
Where Does the Money Actually Go?
In the EU, fines imposed by the European Commission under the Digital Markets Act flow into the EU’s general budget, while GDPR fines are levied by national data protection authorities and typically go to national treasuries. In Ireland, which hosts many big tech European HQs, the Data Protection Commission (DPC) hands its penalty revenue over to the national exchequer.
In the UK, fines issued by the Information Commissioner’s Office (ICO) go directly to the Treasury’s Consolidated Fund. The ICO doesn’t even retain the funds it generates through enforcement.
There are no legal mechanisms in either jurisdiction to ensure victims receive direct financial compensation from these fines. A company can mishandle your personal health data, your child’s biometric profile, or your entire shopping history, and regulators might act. But your reward is silence.
Contrast: US Class Action Lawsuits
In the United States, although class action settlements often deliver modest individual payouts, they at least exist. In 2021, Facebook paid $650 million to settle a biometric privacy lawsuit under Illinois’ BIPA law, with individual claimants receiving around $345 each (https://www.theguardian.com/technology/2021/feb/27/facebook-illinois-privacy-lawsuit-settlement).
In another example, Equifax’s 2017 data breach settlement set aside $425 million for victims, offering free credit monitoring or up to $125 in cash (https://www.ftc.gov/enforcement/refunds/equifax-data-breach-settlement).
Class actions aren’t perfect, but at least they include the affected public in the process. In the UK and EU, consumers are passive observers while tech giants and regulators fight multi-million-euro battles.
The Ponemon Illusion: Costly Records, No Compensation
Security professionals regularly cite Ponemon Institute and IBM’s Cost of a Data Breach Report. In the 2024 edition, IBM estimates the average cost per breached record as:
- Customer PII: $169 per record
- Employee PII: $181–189 per record
- Intellectual Property: $173 per record
https://www.ibm.com/reports/data-breach
However, these are not payouts to victims. They represent corporate costs: legal, regulatory, PR, and technical responses. The individual whose data was exposed sees none of it. These numbers are designed for boardroom risk assessments, not restitution.
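To see what those per-record figures actually measure, here is a minimal back-of-the-envelope sketch in Python. The per-record costs are the ones quoted above; the record counts are entirely hypothetical and exist only to show the scale of the modelled corporate exposure, while the final line shows what current rules route to the affected individuals.

```python
# Back-of-the-envelope estimate using the per-record figures quoted above.
# All record counts are hypothetical, chosen only to illustrate scale.
PER_RECORD_COST_USD = {
    "customer_pii": 169,            # IBM 2024 figure quoted above
    "employee_pii": 185,            # midpoint of the $181-189 range above
    "intellectual_property": 173,
}

hypothetical_breach = {             # hypothetical record counts
    "customer_pii": 400_000,
    "employee_pii": 10_000,
    "intellectual_property": 500,
}

corporate_cost = sum(
    PER_RECORD_COST_USD[kind] * count
    for kind, count in hypothetical_breach.items()
)
restitution_to_victims = 0          # no mechanism routes any of this to data subjects

print(f"Modelled corporate cost:     ${corporate_cost:,}")            # $69,536,500
print(f"Restitution paid to victims: ${restitution_to_victims:,}")
```

Even this modest hypothetical incident produces a modelled cost approaching $70 million on paper, yet not a penny of it is earmarked for the people in the dataset.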
GDPR: Theoretical Rights, Missing Restitution
GDPR Article 82 gives individuals the right to claim compensation for material and non-material damage, but in practice, how many people do? These claims are costly, opaque, and underused. Most individuals never even realise they could seek damages, let alone win them.
And even if one case succeeds, it’s usually not linked to the regulatory fine. The public sees the headlines. The regulators see the revenue. The companies write it off as a cost of doing business. And the victims? They carry on.
Many people, myself included, have exercised their right to submit Data Subject Access Requests (DSARs) to both public and private sector organisations. In a shocking number of cases, responses are delayed, incomplete, or ignored outright. When the Information Commissioner’s Office is notified, it may investigate and issue a finding of non-compliance, and then stop. The ICO does not pursue further enforcement or offer compensation; it is left to the individual to take legal action, which is costly, stressful, and inaccessible for most people. Organisations know this, and act accordingly.
Worse still, when individuals suffer actual harm, such as identity theft following a breach, they are left to clean up the mess themselves. Placing fraud markers, such as a Cifas protective registration, can introduce complications when you later need to switch service providers, for example when applying for a new mobile contract or changing broadband supplier. These added verification hurdles, while important for fraud prevention, often penalise the genuine person who is already dealing with the stress and consequences of identity theft. Victims may spend hours navigating bureaucracy, revalidating their identity, or recovering stolen funds. The emotional and practical toll is enormous. Despite this, the enforcement system remains fundamentally impersonal and passive.
Time for Change: Public Harms Deserve Public Compensation
Consider this: the UK once ran entire compensation schemes for financial mis-selling, like PPI (Payment Protection Insurance), that returned billions to consumers. These were publicly promoted, regulated, and easy to join. Yet no such scheme exists for data breaches or privacy violations. Despite personal data being exposed, stolen, or mishandled, victims have no automatic route to compensation. Why the double standard? If data is now the most valuable asset in the digital economy, shouldn’t its misuse carry a comparable consumer redress framework?
There is no digital equivalent of the Financial Services Compensation Scheme or a consumer redress fund for data violations. No opt-out infrastructure funded by fines. No community support schemes for victims of major data exposure events.
We need a rethink:
- Ringfence a portion of fines for victim compensation or public education (a rough sketch of the numbers follows this list).
- Create opt-out tools for surveillance models as part of enforcement.
- Fund digital rights groups to raise awareness and provide legal aid.
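To put some rough numbers on the first idea, here is a sketch using the British Airways case cited earlier (a £20 million fine and roughly 400,000 affected customers). The 50% ringfence share is my own arbitrary assumption, purely for illustration:

```python
# Rough illustration of the ringfencing idea, using figures cited earlier in
# this post: BA's £20m fine and the ~400,000 customers affected by the breach.
# The 50% ringfence share is an arbitrary assumption, not a proposed figure.
fine_gbp = 20_000_000
affected_people = 400_000
ringfence_share = 0.50

victim_pot = fine_gbp * ringfence_share
per_person = victim_pot / affected_people

print(f"Ringfenced pot:          £{victim_pot:,.0f}")   # £10,000,000
print(f"Flat payout per person:  £{per_person:,.2f}")   # £25.00
```

A flat £25 each would hardly undo the harm, but it would at least acknowledge it, and the far larger GDPR and DMA fines would scale the pot accordingly.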
Otherwise, What Are We Really Enforcing?
According to IBM, the global average cost of a data breach in 2024 reached $4.88 million, the largest year-over-year jump since the pandemic. Organisations that don’t use security AI and automation pay significantly more: an average of $5.72 million per breach, compared with $3.84 million for those that do, a difference of nearly $1.9 million. Meanwhile, breaches involving shadow data average $5.27 million, 16.2% more than breaches without it (https://www.ibm.com/reports/data-breach).
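For transparency, here is the arithmetic behind those comparisons spelled out; the implied shadow-data baseline on the last line is my own derivation from the quoted 16.2% premium, not a figure published in the report:

```python
# Quick arithmetic check on the IBM 2024 figures above (USD millions).
with_ai_automation = 3.84
without_ai_automation = 5.72
shadow_data_breach = 5.27

print(f"AI/automation delta per breach: ${without_ai_automation - with_ai_automation:.2f}M")  # $1.88M

# Reversing the quoted 16.2% shadow-data premium gives an implied baseline
# (my own derivation, not a number published in the report).
print(f"Implied cost without shadow data: ${shadow_data_breach / 1.162:.2f}M")  # ~$4.53M
```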
These statistics highlight the financial burden borne by companies. But what those reports leave out is just as telling: meaningful restitution to the people whose data was actually exposed.
If fines simply shuffle large sums between regulators and treasuries while victims remain empty-handed, the system will continue to feel unjust. Especially when those victims were told they had control over their data.
Right now, it’s hard to distinguish data protection enforcement from a very lucrative, high-stakes game of Monopoly. And in this game, the people landing on the wrong squares never seem to pass Go… or collect anything at all.

