The U.K. could be gearing up to hit a handful of tech firms with enforcement orders (and potentially fines) related to a children’s online privacy and safety Code that has been in force for a year.
“The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes,” the data protection watchdog said in a blog post yesterday marking the one-year anniversary of the Code coming into application.
The Telegraph, which has an interview with information commissioner John Edwards — who heads up the Information Commissioner’s Office (ICO) — in today’s paper, reports that two of the four social media and tech firms under investigation are household names.
Its report says decisions by the ICO on whether to prosecute are expected to be announced within weeks.
“This code makes clear that children are not like adults online, and their data needs greater protections,” Edwards told the Telegraph. “We’ll use our enforcement powers where they are required.”
The companies in question have not been named — either by the newspaper or the ICO — but last November, the watchdog wrote to Apple and Google after concerns had been raised with it about how the pair assess apps on their respective mobile app stores to determine which age ratings they apply.
The ICO described its outreach then as an “evidence gathering process to identify conformance with the code” — although it remains to be seen whether the two tech giants are among the four firms facing possible enforcement, or if they’re just among the wider group whose compliance the watchdog has been eyeing.
“Unfortunately, we are unable to name the companies at the minute due to ongoing investigations,” a spokeswoman for the ICO confirmed when asked if it can share any more details.
The ICO first published the children’s Code back in 2020. It contains 15 standards for what’s billed as “age appropriate design” — essentially a set of design recommendations for web services that are likely to be accessed by kids, such as setting high privacy defaults and avoiding heavy-handed engagement tactics that could keep kids unhealthily hooked on a digital service.
The overarching aim is for the Code to encourage platforms to safeguard kids from accessing inappropriate content and prevent them from being commercially data-mined — although the ICO regulates personal data (rather than content); the latter responsibility will fall to Ofcom under the incoming Online Safety Bill (assuming another change of U.K. prime minister does not lead to a legislative rethink on that front).
This division of regulatory responsibilities has caused some friction with children’s safety campaigners. While supportive of the Code — and, in the case of 5Rights’ chair and life peer, Baroness Kidron, a fundamental driver of the standards’ adoption who continues to press for amendments from her seat in the House of Lords — they have complained of “gaps” as they wait for content-focused safety laws to make their way through parliament.
The ICO has therefore faced pressure to also look at adult websites — i.e., by requiring that porn sites comply with the Code — not just to audit the sorts of games and social media apps that are most obviously popular with children.
Age checks for porn sites?
The overarching push by child safety campaigners is to force adult websites to apply robust age checks to prevent children accessing online pornography — essentially a revival of the mandatory age checks for porn sites policy that U.K. lawmakers have kicked around for years. A standalone age check scheme was dropped in 2019 after criticism that it was unworkable; earlier this year the idea resurfaced as an(other) addition to the Online Safety Bill.
Campaigners may finally be scenting victory on this front, via the Online Safety Bill, as the government said in February that it will mandate the use of “age verification technologies” on adult sites to make it harder for children to access or stumble across pornography. But they’re evidently not sitting on their hands waiting for that legislation to pass — not when the Children’s Code and U.K. data protection law already exist for them to leverage…
And in what looks to be a related change to its approach, announced yesterday, the ICO has bowed to pressure to expand its interpretation of the Code to cover pornography websites — or at least those that are “likely” to be accessed by children (whatever that means) — writing in its blog post that: “We have… revised our position to clarify that adult-only services are in scope of the Children’s code if they are likely to be accessed by children.”
The ICO says this evolution in how it applies the Code follows petitions by child safety campaigners and others warning of the risk of “data protection harms” when kids access porn sites.
“We will continue to evolve our approach, listening to others to ensure the code is having the maximum impact,” it goes on. “For example, we have seen an increasing amount of research (from the NSPCC, 5Rights, Microsoft and British Board of Film Classification), that children are likely to be accessing adult-only services and that these pose data protection harms, with children losing control of their data or being manipulated to give more data, in addition to content harms.”
This change in application does not (cannot) entail an expansion of what the ICO regulates to include content itself. (“We don’t regulate content,” its spokeswoman confirmed. “We regulate how children’s personal data is used or processed in order for content to be served to children. It’s the step before children see the content.”)
However, it’s clear that porn sites’ data collection habits are not the primary concern for child safety campaigners — rather it’s, yep, the content — but if leveraging children’s privacy rules can force porn sites to implement age checks, campaigners don’t look too fussy about the route.
In a statement welcoming the ICO’s revision to include adult-only sites in scope of the Code, children’s safety campaign group, the 5Rights Foundation, said:
The UK Age Appropriate Design Code applies to all services that are likely to be accessed by under-18s, even if they are not intended for children. Through its investigative work submitted to the ICO last year, 5Rights uncovered that sites including gambling, dating and pornography sites are being accessed by children and are not complying with the Code, in particular profiling children to serve detrimental material.
“The ICO’s announcement on adult-only sites will provide much needed clarity to those companies who think they are beyond the law,” added Duncan McCann, its head of policy implementation, in another supporting statement. “They will no longer have grey lines to exploit, and we hope that this development will serve to further improve the online lives of young people.”
While the U.K. children’s Code itself is not legally binding, it is attached to the country’s wider data protection rules — which include the Data Protection Act and U.K. GDPR — and ICO guidance notes that applicable online services “need to follow” the standards in order to “ensure they are complying with their obligations under data protection law to protect children’s data online”.
Under the GDPR, the ICO has extensive powers to enforce against privacy breaches — with the ability to fine infringers up to 4% of their global annual turnover (or up to £17.5M, whichever is higher). So the subtext here is basically ‘comply with the code or risk GDPR-level enforcement’ — giving the ICO a big stick to encourage in-scope digital services to apply gold-plated rules that could end up producing an age-gated internet, since who knows which other services might be “likely” to be accessed by kids?
Asked how adult websites should assess whether children are likely to access their services, the ICO’s spokeswoman responded with this: “Services must be accountable for their decisions, and be able to provide evidence to support their views on whether they are likely to be accessed by children. To determine if they fall within the scope of the code, adult services will need to understand who their users are, and identify if children make up a significant number of those users. To do this, online service could undertake research about their users, review academic research or commission market research, consideration of the types of content and activities children are interested in and the attractiveness of their services to children; or consider if children are known to like similar services.”
The phrase “understand who their users are, and identify if children make up a significant number of those users” is doing a lot of work in that sentence — although the ICO has not explicitly suggested the use of age verification technology as a way for a service to determine whether it falls in scope of the Code. That comes next…
“If an adult only online service is likely to be accessed by children, the service needs to take measures to restrict children from accessing the service, such as by implementing age assurance measures, or it must implement the standards of the code in a proportionate, risk-based manner to protect children’s privacy online,” the ICO’s spokeswoman also told us, adding: “It’s vitally important to look after children online and not treat them in the same way adults are treated. It is a long term, transformative process to embed the Children’s code but we are seeing more and more change which is good for children, it allows the online industry to be more innovative and it’s the right thing to do.”
The ICO’s blog post also notes that the (privacy) regulator will be working with Ofcom (the incoming content regulator) and the Department for Digital, Culture, Media and Sport (DCMS) to “establish how the code works in practice in relation to adult-only services and what they should expect”. So expect further implementation ‘evolution’ as more pieces of the U.K.’s digital regulation strategy land (or, well, fall away).
The ICO is already taking credit for a number of policy tweaks applied to children’s accounts over the past year by major platforms — including Facebook, Instagram, YouTube, Google and Nintendo — such as the Meta-owned platforms limiting ad targeting to age, gender and location for under-18s; and YouTube turning off autoplay by default and turning on “take a break” and bedtime reminders by default for under-18s’ Google Accounts, to name two of the actions it flags.
The U.K. Code has also been credited with encouraging similar policy moves in other jurisdictions — reportedly inspiring a California bill that was passed by lawmakers just this week (and will, if it gets signed into law, apply a similar set of protections for under-18s in the state), among a number of other moves by regulators and policymakers focused on safeguarding kids online.