Advertisers boycott YouTube amid child exploitation controversy
Major brands such as Disney, Nestlé and Epic Games have pulled advertisements on YouTube after evidence showed the site is being used as a medium to facilitate the activities of a paedophile ring, raising questions over its ability to block predatory behaviour on content featuring young children.
The companies acted after a YouTube user named Matt Watson posted a video to the site this week exposing the behaviour: individuals finding videos of young children and using the comment sections to talk about the children’s bodies.
The video highlighting the comments has been viewed over 1.9 million times since it was uploaded on 17 February 2019. Watson has accused YouTube of “facilitating the sexual exploitation” of children, adding that its recommendation system has also guided predators to other similar videos of minors – many of which carry advertisements for major brands.
Many of the advertisers identified in the video – including Fortnite creator Epic Games, GNC and Nestlé companies in the US – said they had suspended advertising on the video-streaming site, with Bloomberg News reporting that the Walt Disney Company has also halted ads.
“When we learned of this issue, we were – and still are – absolutely horrified and reached out to YouTube to rectify this immediately,” said Senka Hadzimuratovic, a spokeswoman for the online grammar tool Grammarly. “We have a strict policy against advertising alongside harmful or offensive content and would never knowingly associate ourselves with channels like this. It goes against everything our company stands for.”
According to Watson, the videos targeted by paedophiles – featuring young girls doing gymnastics, playing Twister or stretching – did not, for the most part, violate YouTube’s rules. However, these videos became overrun with suggestive remarks directed at the children.
Furthermore, commenters left timestamps for moments in the videos that can appear compromising when paused – for example, a young girl’s backside or bare legs. They also posted remarks praising the children, asked whether they were wearing underwear, or simply left strings of sexually suggestive emojis.
Chi Hea Cho, a spokeswoman for YouTube’s parent company, Google, said it had deleted more than 400 accounts and channels belonging to people leaving the disturbing comments, removed comments that violated its policies and reported illegal activity to the authorities, including the US National Center for Missing and Exploited Children.
“Any content – including comments – that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” said Cho. “There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
A BBC Radio 4 investigation published on 20 February found that some criminals are trading child sex abuse images using encrypted apps such as Telegram, with some of these criminals using a method of promoting their groups in the public comment sections of YouTube videos.