Facebook has agreed to pay $52 million to a group of workers who claimed they received little psychological support after they were exposed to distressing content as part of their role.
In what is thought to be the first case of its kind, employees who worked as “content moderators” for companies contracted by the social media giant claimed they had suffered serious mental health consequences as a result of having to review posts depicting suicide, child abuse, beheadings, animal abuse, murder and other disturbing content.
Facebook did not admit any wrongdoing, but has agreed to pay content moderators a minimum of $1,000 each, with the potential for additional compensation if they have been diagnosed with a mental health disorder, such as post-traumatic stress disorder, as a result of their work for the firm. It is thought some staff could receive up to $6,000.
The settlement covers any US-based content moderator who has ever worked for a third-party company providing services to Facebook and its subsidiaries WhatsApp and Instagram.
Around 11,250 moderators are eligible for compensation, and their lawyers believe that as many as half of them may qualify for extra pay related to the mental health issues they suffered, including depression and addiction.
“No one had ever seen a case like this, and the jobs that people do were in some ways beyond description,” said Steve Williams, a partner at Joseph Saveri Law Firm in San Francisco, one of the law firms involved in the class action.
“We are so pleased that Facebook worked with us to create an unprecedented programme to help people performing work that was unimaginable even a few years ago. The harm that can be suffered from this work is real and severe.”
Facebook said it would be providing additional support to staff who undertake this role at third-party firms.
It said in a statement: “We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We’re committed to providing them additional support through this settlement and in the future.”
The case began in September 2018, when former Facebook moderator Selena Scola broke a confidentiality agreement and launched legal action against Facebook, alleging that she developed PTSD after having to review disturbing content. Scola and other content moderators argued that Facebook failed to provide a safe workplace or compensate them for the psychological harms they suffered.
According to tech site The Verge, which first reported the settlement, Facebook said it would make changes to its content moderation tools to help reduce the impact of viewing harmful images and videos. These changes will include muting audio by default and displaying videos in black and white.
Moderators who view disturbing content on a daily basis will get access to weekly one-on-one coaching sessions, while those who experience a mental health crisis will be able to access a licensed counsellor within 24 hours. Monthly group therapy sessions will also be introduced.
Third parties contracted by Facebook for this kind of work will also need to ensure that job applicants are screened for emotional resiliency, that information about psychological support is displayed at each workstation, and that moderators are informed about how they can report violations of the workplace standards set by Facebook.