Facebook moderators accused of turning blind eye to abusive and racist content
According to a Channel 4 investigation, Facebook moderators have been treating abusive, violent and racist content with leniency, even when that content explicitly violates the company's own guidelines.
For its Dispatches investigation, Channel 4 sent an undercover reporter to work for six weeks at CPL Resources – Facebook's largest centre for British content moderation – during which they attended training sessions and filmed conversations.
In one key incident, a moderator reviewed a video of an adult punching and stamping on a distressed toddler but did not remove it, instead retaining it as an example of "acceptable content". In another, trainees were shown an image of a Caucasian woman drowning a child, captioned "When your daughter's first crush is a little negro boy", and were advised that this type of content should be allowed to remain online.
“If you start censoring too much, then people stop using the platform. It’s all about money at the end of the day,” a moderator caught on tape commented.
Facebook told Channel 4 that the video and image violated its guidelines and should both have been removed. The company said that these incidents were "mistakes" which do not reflect its policies or values.
The weeks-long investigation concluded that while nudity is almost always removed, content associated with racial hatred, violence and self-mutilation was treated more leniently, even when uploaded by underage users. In one incident featured in the programme, a moderator chose to ignore a video stating that "Muslim immigrants should f**k off". The investigation also found that extremist pages with large followings (such as that of far-right figurehead Tommy Robinson) are treated with special consideration because of the considerable revenue they generate for Facebook.
“These revelations about Facebook’s content moderation are alarming, but not surprising,” said Julian Knight, a member of the Digital, Culture, Media and Sport Select Committee, in a statement. “Facebook has recently committed to reducing fake news and improving privacy on its platform, which is welcome, but they don’t seem as committed to sacrificing profits made from extreme content, as is demonstrated by Channel 4’s investigation.”
Lord Allan, Facebook's director of policy in Europe, said in a statement: "It's clear that some of what is shown in the programme does not reflect Facebook's policies or values and falls short of the high standards we expect. We take these mistakes in some of our training processes and enforcement incredibly seriously and are grateful to the journalists who brought them to our attention."
“Where we know we have made mistakes, we have taken action immediately. We are providing additional training and are working to understand exactly what happened so we can rectify it.”