We are traversing a (much-needed) period of increasing child privacy regulation. We have child-specific guidance for the GDPR, renewed interest in COPPA and the slew of US state student privacy laws, child-oriented provisions in the CCPA and CPRA, California’s age-appropriate design law, and forthcoming additions such as New York State’s “New York Child Privacy and Protection Act.” And that’s only those that I happen to think of off the top of my head; there are many more.
For child-directed services, the situation is often relatively clear: you have to treat your users as children, with privacy safeguards to match. But the question is somewhat more complex for services aimed at broad audiences: we would still like children to have their rights respected when using them.
The difficulty in doing this well lies in knowing that the user is a child. You have several options, none of which are much good:
- Today's system (typically in the US), in which you're on the hook primarily if you have actual knowledge that you're dealing with a child. This incentivises actors to do everything in their power never to so much as risk finding out that there is a child on their property. The moral hazard from actual knowledge standards is real. If you ever ask the child privacy expert at your outside counsel firm for the rules that you need to abide by in child privacy, you will get a thirty-minute lecture on the many ways in which you can deny knowing there was a child there. When you finally get it through to them that you want to comply with child privacy regulations, they will stare blankly for a while and then announce that they need to do more research into how that would work. (True story.)
- An age-gating system that is basically cookie banners with some kind of age prompt. Needless to say, that’s horrendous from a user perspective. And that’s before you get into issues with the secondary data uses that age-gating management platforms engage in.
- A signal at the HTTP level (or similar) indicating that the current user is a child, or some similar mechanism flagging users as children. This is terrible on many levels: that type of information should absolutely not be made available to arbitrary parties, and specifically signalling this fact will lead to a degraded experience that will encourage children to seek ways to disable the signal. (They will succeed.)
What we want is a system that automatically triggers child privacy protections, without a user interface, but that also doesn’t reveal that a person is a child. Is that even possible? Yes, we can use the Spartacus method. Basically, you want to align the privacy rights of children with the privacy rights of any person who is using the Global Privacy Control, or GPC signal (e.g. no sale of the data and no use of sensitive data, and there is no reason to believe that this couldn’t work just as well under the GDPR). Note that these privacy rights can be understood broadly, for instance to include protection against excessive engagement drivers or unfair nudging methods.
You then turn on GPC by default for children, ideally using knowledge that sits at the OS level or, more generally, at the user agent level, thereby signalling that the corresponding rights must be afforded. This creates a system in which children benefit from much broader privacy protections but are hidden in a wider group (of privacy-conscious adults) such that no one can tell that they are children.
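To make the mechanism concrete, here is a minimal sketch of what honouring the signal looks like on the service side, assuming the convention from the GPC specification that browsers send it as the `Sec-GPC: 1` request header. The function names and the defaults dictionary are illustrative, not from any particular framework; real code would also do case-insensitive header lookup.

```python
def gpc_enabled(headers: dict[str, str]) -> bool:
    """True when the request carries the Global Privacy Control signal."""
    # The GPC spec defines the header field as "Sec-GPC" with the value "1".
    return headers.get("Sec-GPC") == "1"

def privacy_defaults(headers: dict[str, str]) -> dict[str, bool]:
    """Pick data-handling defaults for this request.

    Because children's user agents would send GPC by default under this
    proposal, the service only ever sees "a GPC user", never "a child":
    the same broad protections apply to everyone in the set.
    """
    if gpc_enabled(headers):
        return {"sell_or_share_data": False, "use_sensitive_data": False}
    return {"sell_or_share_data": True, "use_sensitive_data": True}
```

The key property is that the branch taken reveals nothing beyond "this user asserted GPC", which is exactly the herd the children are hiding in.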
As part of this arrangement, services must not use the GPC for other types of age gating (e.g. porn, alcohol), as that would constitute a loss of service that would violate the rights of the adults in the set.
This approach is simple to specify and simple for businesses to implement, especially since it aligns with compliance requirements that many of them already operate under.