Just before Christmas, the EU’s data protection authorities (DPAs), working together as the European Data Protection Board (EDPB), issued the hotly anticipated Opinion on AI models and EU privacy law. As it turned out, the excitement about the Opinion was unjustified. The Opinion, in classic EDPB style, gives a long list of things that AI developers and users can attempt in order to comply with the GDPR, while offering no guarantee that all that effort will be seen as sufficient by all EU privacy enforcers. Given the Opinion’s very limited usefulness, I decided to focus on what I see as the real issue: the structural inadequacy of the GDPR enforcement mechanism. The political economy of GDPR enforcement has proven unsuitable to deliver what Europe needs: not just the protection of one fundamental right, but a holistic approach that genuinely balances vital interests.
The Problem of Regulatory Uncertainty
Yes, the Opinion does not say that AI is illegal in the EU, but let’s be honest: even in the EU, such an explicit declaration would be politically unpalatable. Instead, the EU privacy enforcers did what they usually do. They kept as much enforcement flexibility for themselves as possible, leaving the door open for any EU national enforcer to impose fines worth billions. Of course, the other side of that coin is that those who want to use AI in the EU have no idea whether their GDPR compliance efforts will be judged good enough in a year or two. Given this, if you have a choice, why would you not simply take your AI startup idea somewhere other than the EU?
Some privacy regulators may protest that this was not their intent; after all, they did provide the list of things to try. But such an answer would show the fundamental disconnect from economic reality. Consciously or not, regulators can thwart development not only by explicitly banning it but also by creating an environment of uncertainty. Even the threat of discretionary regulatory enforcement—combined with the risk of heavy fines—can significantly chill investment decisions (and thus innovation) at the margins.
The Alternative: Clear Rules
Regulators in many areas are not afraid to publicly commit to non-enforcement if certain clear conditions are met. These commitments, which may take the form of official "safe harbors," can be revised later if circumstances change. This approach allows regulators to foster an environment conducive to development without changing existing legislation.
Providing clarity may require creativity and risk-taking. In the EU, however, much of the creativity and risk-taking in privacy enforcement runs in the opposite direction: toward tightening restrictions and maximizing what the regulators consider privacy protections. (There are some notable exceptions, like some of the work done by the French and Hamburg regulators on AI, which I discussed in an earlier newsletter. The problem with those exceptions is that they remain just that: exceptional, while the privacy-maximalist attitude cheerfully dominates and overrules dissenters.) If privacy regulators claim they have no legal authority to act otherwise, we should remember how creatively they reinterpret the GDPR to achieve their preferred outcomes. The issue is not strict adherence to the law, but which goals are deemed essential to pursue.
In my first LinkedIn reaction to the Opinion, I said: “If all this is how EU privacy regulators respond to the Draghi report, then perhaps we should indeed talk more seriously about GDPR enforcement reform, but not of the kind that they will like.”
So, let’s talk about reform.
Diagnosis: Privacy Myopia as a Structural Problem with GDPR Enforcement
Post-GDPR, the privacy regulators have immense powers to affect many issues (to the extent of significantly hampering Europeans’ economic security) without a robustly enforced responsibility to seriously care about the consequences of their actions.
DPAs appear to be driven by the belief that their sole responsibility is to maximize privacy and data protection. They treat it as the responsibility of regulated businesses to demonstrate, against an extremely high standard of proof, that any other interest might justify deviating from the most privacy-protective approach. DPAs pay lip service to the GDPR’s Recital 4, which addresses proportionality and the non-absolute nature of privacy, but they do not consider it central to their role.
One key point this approach misses is that businesses have interests that overlap with, but are not identical to, those of individuals. For instance, promoting freedom of expression and access to information may, to some extent, align with business interests. However, relying on privacy law as a moat against potential competitors may also align with business interests.
This model of privacy enforcement obviously fails to adequately represent the non-privacy rights and interests of individuals that do not happen to align with business interests. (Less obviously, it also fails to protect those interests of individuals that do align with business interests, largely due to the extreme skepticism toward any non-privacy arguments.)
Such non-privacy interests are also not adequately protected by the looser involvement of other authorities (e.g., competition authorities). (I’ll set aside the role of the courts for another day.) This is largely because some vital interests, like economic security, fall fully within the purview only of political authorities, whose involvement is seen as anathema under the current interpretation of DPAs’ “independence.” Hence, the EDPS’s idea of a Digital Clearinghouse for collaboration among privacy, competition, and consumer protection authorities does not adequately address the concern, although it may be a small, insufficient step in the right direction.
This is not a secret. Capable individuals working for DPAs are aware of it. How do they rationalize it? Publicly, they pay lip service to privacy's non-absoluteness and the need for proportionality. Privately, I suspect they would simply assert that privacy outweighs all else, and that it is not a big deal that the current enforcement system is inherently designed not to balance privacy with other values but to privilege it. This bias contradicts the letter of the law, highlighting how enforcement practices and incentives often overshadow the law's best interpretation.
Pathways for Reform
What could be done to improve this?
I’m not convinced that substantive GDPR reform is the most pressing issue. The GDPR’s text is much more flexible in accommodating genuine balancing of important interests than enforcement practice would suggest.
However, a reform of GDPR enforcement would likely be necessary.
One possibility would be to abandon the current interpretation of DPAs’ independence and recognise that balancing vital interests, including fundamental rights, is a political question, and thus that enforcement should be done by directly politically accountable officials.
Another possibility would be to retain something closer to the current interpretation of the independence of regulators, but to give strong enforcement powers only to authorities with sufficiently broad mandates and accountability: robust enough to make it credible that all significant enforcement consequences will be seriously considered, including the impacts on economic growth (read: security), innovation, freedom of speech, and so on. If we give an authority the power to fine businesses 4% (or, under the DMA, 10%) of their turnover (not profit!), the authority should be credible and accountable in weighing all the consequences.
So, perhaps the power to sanction should be transferred from privacy enforcers to a different kind of “innovation and rights protection authority”? Or perhaps privacy enforcers should only be empowered to bring lawsuits before ordinary courts, attempting to convince judges whose experience is broader than a myopic focus on privacy?
Much more thought is needed on the details of GDPR enforcement reform. One thing is clear: the currently proposed reform entirely misses the real issues.
Conclusion
As exemplified by the recent EDPB Opinion on AI models, the current privacy enforcement regime in the EU continues to prioritise maximum privacy protection at the expense of other crucial interests. While the GDPR provides flexibility for balancing competing interests, its enforcement mechanism requires significant reform. Whether through reinterpreting DPA independence or taking away direct sanctioning powers from DPAs, the path forward must involve establishing a more balanced, predictable, and holistic approach to privacy regulation.