Towards the end of December, news enthusiasts can relish the awe-inspiring annual spectacle of Year in Review pieces migrating majestically across the media landscape. In the privacy ecosystem, a similarly choreographed opining ballet takes place every year around May 25 as we collectively propound theories as to why the GDPR hasn’t solved the climate emergency and made us all good at karaoke just yet.
In this digital hellscape of ours, what is it that we talk about when we talk about privacy? We talk about power. Concentrations of data are concentrations of power, or, as the freshly minted first public draft of the W3C’s Privacy Principles states, “asymmetries of information and of automation create commanding asymmetries of power.” That’s the problem to which privacy is the solution.
The W3C is nearing its 30th year of existence, and the Consortium’s community is working on reforming it. All of the proposals made to date assume that it will remain a membership organisation — I have a different suggestion.
For the longest time, word was that the Web would free the world; now, it seems as if most of the world wants to liberate the Web. I believe that we can tackle decentralisation and antitrust more effectively by developing a pragmatic and concrete approach to capture resistance inspired by the methods of cybersecurity.
If you've spent any amount of time discussing reforms to improve privacy online, you've likely encountered the Big Knob Theory. Like Covid, it comes in variants, but its core tenet can be summarised thus: there exists (metaphorically) a Big Knob that can be turned either towards "privacy" or towards "competition" — but it's very much a zero-sum game and you can't have both. It's a popular position; but is it true?
There has been ample debate in some tech circles as to just how much of a privacy war is really being waged. My personal sense is that it's not so much a war as it is a reality check. It has become painfully obvious that the same old simple solutions don't work — and some people are up in arms that reality is being inconvenient to them.
The Global Privacy Control is making steady progress towards adoption. As a global signal supported by browsers, it's a natural question to ask what it means under regimes such as the GDPR. Here's my personal take.
Working behind the scenes in the news media sector, it has become increasingly clear to me that tech and the internet as they operate today are causing structural damage to our collective institutions that runs deeper than is generally understood. But these changes can be hard to characterise. If you take advertising revenue from high-quality contexts and use it to subsidise conspiracy theories, it's pretty obvious that nothing good will happen — but we barely have an understanding of the data economy sufficient to put a critique of this transfer on solid footing. If you move editorial decisions about what information people get to see first from tens of thousands of career editors the world over, working with methods that are highly diversified in bias, culture, skill, and politics, to a tiny number of algorithms that are everywhere uniform, what effect can you expect? Reputational and market-driven accountability will be removed, which is evidently bad, but the massive simplification of this ecosystem seems very likely to have deep-running consequences of its own — and how do you begin proving that? A new paper in PNAS leads the way forward.
The British competition regulator (the CMA) just released a draft agreement with Google relating to the "Privacy Sandbox". I take a quick look at it through the lens of enabling better standards and stronger cooperation between the world of standards and policy.
It's time to lift the coronavirus travel ban — ideally entirely but, failing that, at least for vaccinated non-immigrant visa holders.