Tech firms given deadline to meet new child privacy protection rules
A UK watchdog has told tech firms that they have 12 months to ensure their platforms adhere to new child privacy protection measures or face enforcement action, including hefty fines.
The Age Appropriate Design Code sets out 15 standards that companies must build into any online service likely to be used by children, making data protection for young people a priority by design. The code says these services can range from apps and connected toys to social media sites and online games, and even educational websites and streaming services.
Organisations that fail to follow the code after the transition period ends on 2 September 2021 could face enforcement action from the Information Commissioner’s Office (ICO), which includes compulsory audits, orders to stop processing data, and fines of up to 4 per cent of global turnover.
Under the rules, privacy settings must be set to high by default, and nudge techniques should not be used to encourage children to weaken their settings. The code also states that location settings that allow the world to see where a child is should be switched off by default. Furthermore, data collection and sharing should be minimised, and profiling that can allow children to be served up targeted content should also be switched off by default.
“A generation from now we will all be astonished that there was ever a time when there wasn’t specific regulation to protect kids online,” said Information Commissioner Elizabeth Denham. “This code makes it clear that kids are not like adults online, and their data needs greater protections. We want children to be online, learning, and playing and experiencing the world, but with the right protections in place.”
Denham added: “We do understand that companies, particularly small businesses, will need support to comply with the code, and that’s why we have taken the decision to give businesses a year to prepare, and why we’re offering help and support.”
Andy Burrows, head of child safety online policy at the NSPCC, said the move will force tech firms to “take online harms seriously” so there can be “no more excuses for putting children at risk”.
“For the first time, high-risk social networks will have a legal duty to assess their sites for sexual abuse risks and no longer serve up harmful self-harm and suicide content to children,” he said. “The government must also press ahead with its Online Harms Bill to go hand in hand with the Code. This must be enforced by an independent regulator with the teeth it needs to hold platforms to account for safety failings.”
The safety of children online has been on the radar of data watchdogs and child-focused charities over the last few years. Journalist Helena Pozniak has explored whether tech firms are doing enough to protect children on their sites, especially given the growing consumption of social media by young children during the months of the coronavirus lockdown.
In June, a number of tech firms, including the likes of Facebook, Google and Twitter, announced a new joint venture, Project Protect, designed to better tackle child sexual abuse content online. Meanwhile, back in March, a group of governments and major technology firms agreed on a set of actions that firms must take to safeguard children on the internet.