Ofcom powers up to police video-sharing platforms
Ofcom has issued new, stricter guidelines for video-sharing platforms (VSPs) in a bid to protect users from harmful content.
Under the new rules, VSPs including TikTok, Snapchat, Vimeo and Twitch are required by law to take measures to protect under-18s from potentially harmful video content. All users must be protected from videos likely to incite violence or hatred, as well as from certain types of criminal content.
Past research from the regulator had found that a third of users said they had been exposed to hateful content, while a quarter said they had seen unwanted violent or disturbing content on the platforms. One in five said they had seen videos or content that encouraged racism.
Ofcom said it had already begun discussing with the VSPs what their responsibilities are and how they should comply with them. While the body will not be monitoring content itself like it does with TV broadcasts, the laws lay out measures that providers must take to protect their users.
The Internet Watch Foundation reported a 77 per cent increase in the amount of “self-generated” abuse content in 2020. Porn sites carry a heightened risk of hosting child sexual abuse material, and the rise of direct-to-fan subscription sites could make this risk more pronounced.
“Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them,” said Dame Melanie Dawes, Ofcom’s chief executive. “The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”
If Ofcom finds that a VSP has breached its obligations, it has the power to investigate and take action against it, including issuing fines, requiring the provider to take specific steps or, in the most severe cases, suspending the service.
The regulator's remit currently covers only platforms established in Britain, initially numbering 18. Platforms established in other countries, such as YouTube and Facebook, are excluded.