Robin Berjon

May The Enforcement Be With You

Fixing The GDPR In Flight

Mist over the water, with a barge in the distance

Towards the end of December, news enthusiasts can relish the awe-inspiring annual spectacle of Year in Review pieces migrating majestically across the media landscape. In the privacy ecosystem, a similarly choreographed opining ballet takes place every year around May 25 as we collectively propound theories as to why the GDPR hasn’t solved the climate emergency and made us all good at karaoke just yet.

I don’t mean to catalogue everything that could be improved about Europe’s data protection apparatus, but only to suggest a handful of changes that could make today’s regime more effective. This isn’t to say that changing regulations isn’t needed — the ePrivacy Directive (ePD) could certainly use some work — and I understand the appeal of messianic claims that a child born today might well see the ePrivacy Regulation come to pass in their lifetime. But for now I simply wish to focus on more immediate concerns: what could DPAs (data protection authorities) or the EDPB/EDPS change to have a greater impact?

This means that what I have in mind has to be centred on enforcement strategies. Sure enough, countries could do more, DPAs could get bigger budgets, we could see better coordination, but let’s be realistic: they are hopelessly outspent and I don’t see that changing enough to make a dent. As I noted recently, if I read the numbers right, the CNIL has a budget of €0.35 per French citizen for all of its work whereas Google alone pays Apple $15 per user just to be the default search engine in iOS Safari. Even a 10x increase in budget would still leave the CNIL with a fraction of what just one actor is willing to spend in customer-acquisition costs for just one of its products. Multiply that by all the products at all the companies in the ecosystem and you get a sense of the task at hand: it’s a heroic miracle that enforcement is having any effect at all. DPAs should get more money and could probably coordinate better, but more money alone is not going to fix the issue.
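
To make the gap concrete, here is the back-of-envelope arithmetic behind that claim. The only number not taken from the paragraph above is the exchange rate, which is my assumption:

```ts
// Back-of-envelope check of the budget comparison above.
// Assumption: a rough USD/EUR rate of 1.05; every other figure is from the text.
const cnilPerCitizenEUR = 0.35;      // CNIL budget per French citizen
const defaultSearchPerUserUSD = 15;  // what Google pays Apple per iOS Safari user
const usdPerEUR = 1.05;              // assumed conversion rate

const tenfoldBudgetEUR = cnilPerCitizenEUR * 10;             // €3.50
const perUserSpendEUR = defaultSearchPerUserUSD / usdPerEUR; // ≈ €14.29

// Even at 10x, the regulator's per-person budget is roughly a quarter of what
// a single company spends per user on a single product placement.
console.log(tenfoldBudgetEUR / perUserSpendEUR); // ≈ 0.245
```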

To make the GDPR work, we need more structural change. A powerful way to produce structural change is to embed it into the technical architecture of the systems we use. I would like to see data regulators cooperate more with civil society and standards organisations to help weave data protection directly into technology. That’s the kind of work that the W3C’s highest technical body, the TAG (Technical Architecture Group), has been doing with its draft Privacy Principles: building on today’s privacy scholarship and an operational understanding of modern data security, it can be understood as putting some flesh on Article 25 (data protection by design and by default), documenting real-world experience with user interaction, and driving support for purpose-limited technologies or built-in data minimisation.
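
To make “built-in data minimisation” concrete, here is a minimal sketch of what a purpose-limited API could look like. None of these names come from the Privacy Principles or any W3C specification; they are purely illustrative:

```ts
// Hypothetical sketch: purpose-limited access to user records, so that data
// minimisation is enforced by the API surface rather than by policy documents.

type Purpose = "authentication" | "fraud-prevention" | "analytics";

interface UserRecord {
  userId: string;
  email: string;
  lastLogin: Date;
}

// Each purpose only ever sees the minimal projection of the record it needs.
const projections = {
  "authentication":   (r: UserRecord) => ({ userId: r.userId, email: r.email }),
  "fraud-prevention": (r: UserRecord) => ({ userId: r.userId, lastLogin: r.lastLogin }),
  "analytics":        (r: UserRecord) => ({ lastLogin: r.lastLogin }), // no identifiers
} as const;

class PurposeScopedStore {
  constructor(private records: Map<string, UserRecord>) {}

  // Callers must declare a purpose up front and only receive the matching
  // projection; no code path returns the full record.
  read(id: string, purpose: Purpose) {
    const record = this.records.get(id);
    return record === undefined ? undefined : projections[purpose](record);
  }
}

// Usage: the analytics caller physically cannot see userId or email.
const store = new PurposeScopedStore(new Map([
  ["u1", { userId: "u1", email: "u1@example.com", lastLogin: new Date() }],
]));
console.log(store.read("u1", "analytics")); // { lastLogin: ... }
```

The design choice worth noting is that minimisation lives in the API itself: a purpose has to be declared to read anything, and what can be read is bounded by that declaration, not by a promise.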

The more data protection the technical layer can offer on its own, the more DPAs can focus on specific infringements instead of having to chase an unceasing data leak across everything. Civil society and the EDPB have the means to help each other through better coordination and a better understanding of each other’s problems and solutions.

A second improvement worth working on is addressing over-reliance on consent. The GDPR obviously includes consent as a legal basis (and the ePD has its own problems there), but guidance is set by the EDPB and is open to evolution. Consent has very valid uses, but whenever any level of complexity comes into play it becomes woefully inadequate (as explained at greater length in the Consent of the Governed keynote which I presented at the excellent COnSeNT 2022 Workshop). Consent offloads privacy labour to people, and offloading privacy labour to people creates an absurd dilemma: a dialog’s design will be skewed either to make people accept it without understanding or to make people reject it without understanding. And while the latter indubitably affords greater data protection, the threshold for consent is so low and enforcement against gatekeepers so anaemic that most companies cannot make money once people reject. What happens next is predictable: DPAs start defending the view that cookiewalls are allowed to offer a choice between paying with money and paying with privacy.

Slow. Golf. Clap.

Inside of four short years we went from “Human dignity is inviolable” and “Everyone has the right to the protection of personal data”¹ to “le tarif est-il raisonnable ?” (“is the price right?”)².

DPAs can’t act on wishful thinking (“contextual will save us!”³), but they really, really shouldn’t be in the business of setting a price on human dignity and fundamental rights either. The solution is to stop obsessing over the wrong tool for the job. I know that there are consent fans out there, and it’s a nice theory of human autonomy; I only wish that theory were true. But just like any other technology, regulation must be judged on its impact, not its intentions. If it doesn’t work in practice, it doesn’t work in theory either. Consent has overwhelmingly proven itself inadequate as a human-centric legal basis for anything beyond really simple processing.

Instead, we need to develop sufficient institutional capacity to provide data protection by default on a context-by-context basis, by establishing rules per industry (as understood by data subjects). By this I don’t mean the kind of pinky-promise “self-regulation” operated through a tangle of vague and ignored contracts that has been the preferred approach of adtech over the decades. I also don’t mean the sort of industry/DPA tête-à-tête that leads to the cookiewall compromise that the CNIL consented to above. I mean clear, effective, ethical, and transparent rules that are designed with industry expertise but, most importantly, under expert civil society oversight, to produce enforced (and funded) Article 40/41 codes of conduct that put data subjects first. It’s a new frontier, but we need to be creative.

If there’s a theme here, it’s that DPAs should stop going it alone (and back channels with gatekeepers don’t count). Digital technology is everywhere; if it is to be safe, it needs to be pervasively designed to be safe. Apart from a small coterie of anti-privacy activists, everyone would consider it absurd to eliminate transport-level encryption and to rely instead on a group of small, underfunded agencies to catch the mountainous volume of security breaches that would result from universally transmitting data in the clear. That’s why we collectively enforce TLS/HTTPS as much as possible. Why would we expect data protection to be any different?

Finally, a topic that the EDPB could work on is the asymmetry of automation. It will never be fair that data controllers can automate everything about data collection while data subjects somehow have to handle everything manually. As I explain in GPC under the GDPR, we can make this work better with technology we already have: the Global Privacy Control (GPC) signal (and providing the means to enforce sole controllership, as GPC does, would strongly benefit both data subjects and publishers). More complex but potentially useful as well (as explained towards the end of Consent of the Governed) is the Advanced Data Protection Control (ADPC). There are options, and with the right cooperation between regulators, civil society, and standards organisations, we can make them work for people.
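
To give a sense of how little machinery this takes, here is a minimal sketch of a server honouring GPC, assuming an Express-style setup. The Sec-GPC request header and the navigator.globalPrivacyControl property come from the GPC proposal itself; everything else (the route wiring, the recordOptOut helper) is my own illustration:

```ts
// Minimal sketch of honouring GPC server-side. The `Sec-GPC` header (value "1")
// and the client-side `navigator.globalPrivacyControl` boolean are defined by
// the GPC proposal; Express and recordOptOut are illustrative choices.

import express from "express";

const app = express();

app.use((req, _res, next) => {
  // Per the GPC proposal, Sec-GPC: 1 signals a do-not-sell/share preference.
  if (req.header("Sec-GPC") === "1") {
    recordOptOut(req.ip); // hypothetical helper: persist the opt-out
  }
  next();
});

// Hypothetical persistence stub; a real deployment would tie this to the
// session or account and gate all downstream sharing on it.
function recordOptOut(who: string | undefined): void {
  console.log(`GPC opt-out received from ${who ?? "unknown"}`);
}

// In the browser, the same preference is exposed as a boolean:
//   if (navigator.globalPrivacyControl) { /* do not share with third parties */ }

app.listen(3000);
```

The point is not this particular stack but that the signal is machine-readable end to end: the preference is expressed once, automatically, rather than once per dialog.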

And maybe, if we start working together, in the next Year in Review we can talk about what worked out this time around.