
Facebook agrees to pay out $52m to traumatised content moderators

Facebook has agreed to pay $52m (£42m) to current and former content moderators in a preliminary settlement. It has also agreed to provide more counselling opportunities, to screen candidates for emotional resiliency and to introduce tools to lessen the psychological impact of viewing graphic violence.

Social media companies have been under pressure to stamp out harmful content on their platforms, including state-backed disinformation, murder and torture videos and images of child exploitation.

Facebook has attempted to stem the proliferation of this content with a combination of automated detection systems and thousands of human moderators (employees and contractors), who view content and decide whether it should remain on the platform.

In recent years, some content moderators have spoken out about the mental and emotional strain caused by viewing hundreds or thousands of posts featuring explicit acts such as murder, paedophilia, rape, torture, bestiality, executions and suicide every day.

A Facebook content moderator told The Guardian: “You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off.”

The Verge reported last year that some people hired through Cognizant to moderate content for Facebook can be fired for making a small number of errors; are micro-managed, with just minutes a day of “wellness time”; develop PTSD-like symptoms; routinely use drugs to numb their emotions; and begin to embrace the extremist viewpoints of some of the content they moderate, such as Holocaust denial. Cognizant announced afterwards that it would wind down its content moderation business.

In September 2018, former content moderator Selena Scola filed a class-action lawsuit against Facebook, alleging that regularly viewing images and videos of rape, murder and suicide caused her to develop post-traumatic stress disorder (PTSD) symptoms after nine months. She was joined by a number of other former Facebook content moderators who alleged that the social media giant had failed to provide them with a safe workplace.

The complaint alleged that Facebook had ignored the very workplace safety standards that it had helped to draft, requiring its content moderators to work under “conditions known to cause and exacerbate psychological trauma”.

In a preliminary settlement filed in San Mateo Superior Court, Facebook agreed to pay damages to US moderators. Each of the 11,250 moderators covered by the settlement will receive at least $1,000. Those diagnosed with PTSD or other conditions related to their work at Facebook, such as depression or addiction, will receive an additional $1,500, rising to as much as $6,000 for multiple concurrent diagnoses, with the possibility of up to $50,000 in additional damages.

It is estimated that half of the moderators covered by the settlement could be eligible for this extra compensation, which could be put towards the cost of seeking a diagnosis within the US healthcare system.

The preliminary settlement covers moderators in California, Arizona, Texas and Florida from 2015 to the present. These moderators have the opportunity to request changes to the proposed settlement before it receives final approval.

Steve Williams, who represented the plaintiffs, said in a statement: “We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago. The harm that can be suffered from this work is real and severe.”

Facebook admitted no wrongdoing as part of the settlement.

However, it has agreed to changes to its content moderation tools that could reduce the psychological impact of viewing graphic images and videos, such as muting audio by default and displaying videos in black and white. These tools will be available to 80 per cent of moderators by the end of 2020 and to all moderators some time in 2021.

Facebook has also committed to offering one-on-one coaching sessions with a qualified mental health professional every week, plus monthly group therapy and counselling for workers experiencing mental health crises. Other changes include screening applicants for emotional resiliency and informing moderators about how to report violations of Facebook’s workplace standards.

“We are grateful to the people who do this important work to make Facebook a safe environment for everyone,” Facebook said in a statement. “We’re committed to providing them additional support through this settlement and in the future.”

Pressure exerted on Facebook by lawmakers, legal experts, campaigners and academics since the 2016 US presidential election – in which the influence of disinformation and other inappropriate content on social media became apparent – has forced the social media giant to make a number of other changes to its practices. As well as increasing its content moderation efforts, it has committed to publishing regular Community Standards Enforcement Reports.

The latest report, published this week, revealed a sharp increase in the amount of violent and hateful content removed from the platform. Facebook removed 4.7m posts related to hate organisations and 9.6m posts containing hate speech in the first quarter of 2020, compared with 1.6m and 5.7m respectively in the fourth quarter of 2019. It attributed the rise to improved technology for detecting harmful text and images.

Facebook also said that it has placed warning labels on more than 50m pieces of coronavirus misinformation and directed more than two billion people to authoritative coronavirus information from health experts.
