Child-friendly changes coming to Google Search and YouTube
Google has responded to pressure from governments and campaigners for a safer online environment for children by announcing a suite of wellbeing and privacy changes for young users of Google Search and YouTube.
Among the measures, videos uploaded to YouTube by users under the age of 18 will be set to private by default, meaning they can be viewed only by the uploader and anyone with whom they share a link. Uploaders can still adjust the settings to make their content available to the general public if they wish.
YouTube will also switch existing settings to disable autoplay and provide young users with break and bedtime reminders to discourage compulsive binge-watching.
Google is also planning changes to Google Search. New rules for Google Images will enable anyone under the age of 18, or their parent or guardian, to request that their photos be removed from results.
Regarding data protection, Google will switch off location history – without the option to turn it back on – for all underage accounts. Previously, this restriction had been limited to children with supervised accounts. Elsewhere, Google will expand its safeguards to prevent age-sensitive ad categories from being displayed to teenagers. It will also block ad targeting based on the age, gender or interests of underage users.
These changes, which aim to give children “more control over their digital footprint”, will be rolled out to all Google products over the coming months, the company said.
The features have been announced ahead of new rules coming into force in the UK in September. The Age Appropriate Design Code, developed by the Information Commissioner’s Office, will require technology companies to build child safety into their services. It aims to tackle intrusive data-gathering practices and manipulative attention-holding tactics: for instance, it requires geolocation to be switched off by default and bans the use of nudges that encourage children to provide unnecessary personal data.
Baroness Kidron, who chairs the child safety group the 5Rights Foundation, said: “These steps are only part of what is expected and is necessary, but they establish beyond doubt that it is possible to build the digital world that young people deserve and that when government takes action, the tech sector can and will change.”
Last week, Apple announced a new set of child safety tools, including a technology intended to limit the spread of child sexual abuse material. 'NeuralHash' hashes images that users wish to upload to cloud storage and compares each hash against a locally stored database of hashes of known child abuse material, flagging matches for human review.
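The general hash-and-compare flow can be illustrated with a short sketch. Note that NeuralHash itself is a perceptual, neural-network-based hash designed to survive resizing and re-encoding; the sketch below substitutes an ordinary cryptographic hash purely to show the matching logic, and all names and data in it are hypothetical, not Apple's implementation.

```python
import hashlib

# Hypothetical local database of hashes of known prohibited images
# (illustrative byte strings only; a real system would ship opaque hashes).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def flag_for_review(image_bytes: bytes) -> bool:
    """Hash an image destined for cloud upload and check the digest
    against the locally stored database of known hashes."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(flag_for_review(b"example-known-image-bytes"))  # matches the database -> True
print(flag_for_review(b"a-different-photo"))          # no match -> False
```

A cryptographic hash only catches byte-identical copies, which is exactly why a perceptual hash is used in practice: it maps visually similar images to similar hashes, at the cost of requiring a threshold-based comparison rather than exact set membership.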