
Facebook to employ 3,000 extra staff to monitor platform for violent videos

Following criticism for allowing grisly scenes to be broadcast live on the platform, Facebook will employ thousands more people to speed up the removal of violent content.

Since the launch of Facebook Live last year – a feature that allows users to stream live video from anywhere in the world – some users have exploited it to share distressing content. The Wall Street Journal has reported that at least 50 criminal or violent incidents have already been broadcast on Facebook Live.

In April, the murders of a pensioner in Cleveland, Ohio, and of a child in Thailand were broadcast and left on Facebook. Last week, the NSPCC released the results of a study suggesting that social media companies could do more to keep distressing content away from children.

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook – either live or in video posted later,” Mark Zuckerberg, Facebook’s co-founder, chairman and CEO, wrote in a blog post published yesterday. “It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.”

“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”

Facebook already employs 4,500 people to review posts that could violate its terms of service. According to Zuckerberg, Facebook receives millions of user-submitted reports every week.

The 3,000 new employees will monitor all Facebook content – not just live videos – to speed up the removal of content showing assault, murder and suicide, as well as hate speech and child exploitation.

According to Dr Sarah Roberts, assistant professor of information studies at the University of California, Los Angeles, employees who monitor content face difficult working conditions due to the hours they spend sifting through distressing material. Exposure to violent videos could lead to secondary trauma, a type of post-traumatic stress disorder.

While Zuckerberg has previously said that Facebook is working on using artificial intelligence to scan the platform for pornography, violence and other inappropriate content, he said yesterday that the technology would take years to reach a satisfactory standard. Increasing the number of Facebook staff monitoring content is a temporary solution to the problem.

While Facebook and other social media platforms have come under fire recently for the quantity and nature of unmoderated user-generated content being shared, a recent study suggests that much concern may be merely “alarmist panic”.

The Michigan State University study, a survey of 14,000 internet users focused on issues such as fake news online, suggested that citizens were capable of accessing quality information from a variety of sources.

“These findings should caution governments, business and the public from over-reacting to alarmist panics,” said Professor William Dutton, an internet studies expert at Michigan State University.
