ICO design code prioritises children’s privacy online

[Image: School children drawing rainbow and cloud on the chalkboard. Credit: Tom Wang - Dreamstime]

Standards that would force tech companies to prioritise children’s privacy online have been published by the UK’s data regulator.

The final Age Appropriate Design Code has been published by the Information Commissioner’s Office (ICO), which hopes it will come into effect by the autumn of 2021, pending approval from Parliament.

All sectors of the tech industry, from apps and connected toys to social media platforms, online games, and even educational websites and streaming services, will be expected to make young people’s data protection a primary consideration, beginning at the design stage.

The 15 provisions outlined by the ICO have been “clarified and simplified” since a draft was first revealed in April 2019; following an industry consultation, the code was submitted to the Government that November.

The measures outlined cover areas such as transparency, default settings and data minimisation. The code states that privacy settings should be set to high by default and that nudge techniques (indirect suggestions) should not be used to encourage children to weaken them. Location settings that allow the world to see where a child is should also be switched off by default. Furthermore, data collection and sharing should be minimised, and profiling that can serve children targeted content should be switched off by default too.

“I believe it will be transformational,” said information commissioner Elizabeth Denham. “I think in a generation from now when my grandchildren have children they will be astonished to think that we ever didn’t protect kids online. I think it will be as ordinary as keeping children safe by putting on a seat belt.”

Denham noted that while the gaming industry and some other tech companies had expressed concerns about their business models, the move was broadly supported across the sector.

She added: “We have an existing law, GDPR, that requires special treatment of children and I think these 15 standards will bring about greater consistency and a base level of protection in the design and implementation of games and apps and websites and social media.”

The code drawn up by the ICO comes at a time of increased pressure on the tech industry to take responsibility for its possible impact on people’s mental health.

Ian Russell, who believes the promotion of suicide-related content on social media helped his teenage daughter Molly take her own life in 2017, has also welcomed the code: “It is shocking that in failing to make the necessary changes quickly enough, the tech companies have allowed unnecessary suffering to continue,” he said. “Although small steps have been taken by some social media platforms, there seems little significant investment and a lack of commitment to a meaningful change, both essential steps required to create a safer world wide web. The Age Appropriate Design Code demonstrates how technology companies might have responded effectively and immediately.”

Andy Burrows, head of child safety online policy at the NSPCC, said the code would force social networks to “finally take online harm seriously and they will suffer tough consequences if they fail to do so”.

He added: “For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and no longer serve up harmful self-harm and pro-suicide content. It is now key that these measures are enforced in a proportionate and targeted way.”

Facebook, which has been under extreme pressure over its approach towards the safety of its users, said: “We welcome the considerations raised by the UK Government and the Information Commissioner on how to protect young people online. The safety of young people is central to our decision-making, and we’ve spent over a decade introducing new features and tools to help everyone have a positive and safe experience on our platforms, including recent updates such as increased direct message privacy settings on Instagram. We are actively working on developing more features in this space and are committed to working with governments and the tech industry on appropriate solutions around topics such as preventing underage use of our platforms.”

The 15 measures outlined by the ICO are:

1) Best interests of the child: The best interests of the child should be a primary consideration when designing and developing online services likely to be accessed by a child.

2) Data protection impact assessments: Firms should “assess and mitigate risks to the rights and freedoms of children” who are likely to access an online service.

3) Age-appropriate application: A “risk-based approach to recognising the age of individual users” should be taken.

4) Transparency: Privacy information provided to users “must be concise, prominent and in clear language suited to the age of the child”.

5) Detrimental use of data: Children’s personal data must not be used in ways that have been “shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice”.

6) Policies and community standards: Uphold published terms, policies and community standards.

7) Default settings: Settings must be set to “high privacy” by default.

8) Data minimisation: Collect and retain “only the minimum amount of personal data” needed to provide the elements of the service in which a child is actively and knowingly engaged.

9) Data sharing: Children’s data must not be disclosed unless a compelling reason to do so can be shown.

10) Geolocation: Geolocation tracking features should be switched off by default.

11) Parental controls: Children should be provided age-appropriate information about parental controls.

12) Profiling: Switch options which use profiling off by default.

Profiling should only be allowed if there are “appropriate measures” in place to protect the child from any harmful effects, such as content that is detrimental to their health or wellbeing.

13) Nudge techniques: Do not use nudge techniques to “lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”.

14) Connected toys and devices: Connected toys and devices should include effective tools to ensure they conform to the code.

15) Online tools: Children should be provided with prominent and accessible tools to exercise their data protection rights and report concerns.
