Facebook has agreed to pay $52m (£42m) to content moderators who sued the social networking giant over post-traumatic stress disorder (PTSD).
The current and former moderators claimed they had developed mental health issues on the job because of the nature of the material they were reviewing.
In a preliminary settlement filed on Friday in San Mateo Superior Court, the social network agreed to pay damages to American moderators and to provide more counselling to them while they work, The Verge reported.
Each moderator will receive a minimum of $1,000, and will be eligible for additional compensation if they are diagnosed with PTSD or related conditions.
The settlement covers 11,250 moderators, according to The Verge, and the legal teams involved in the case say that as many as half of them may be eligible for extra pay related to mental health conditions associated with their time working for Facebook, including depression and addiction.
“We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago,” Steve Williams, a lawyer for the plaintiffs, was reported as saying in a statement. “The harm that can be suffered from this work is real and severe.”
The case began in September 2018, when former Facebook moderator Selena Scola sued the company, alleging that she had been placed in a role that required her to regularly view images of rape, murder, and suicide.
Scola said she developed symptoms of PTSD after nine months on the job.
Her lawsuit was joined by several other former Facebook moderators in four states, who alleged that Facebook had failed to provide them with a safe workplace.
These moderators were hired as contractors via IT services giants such as Accenture, Cognizant, Genpact, and ProUnlimited.
The Verge reported last year that moderators hired through Cognizant were working in ‘dire conditions’ in Phoenix and Tampa.
Several moderators told The Verge that they had been diagnosed with PTSD after working for Facebook. Later that year, Cognizant announced it would exit the content moderation business; it shut down its sites earlier this year.
Facebook, meanwhile, has pledged changes to its content moderation tools designed to reduce the impact of viewing harmful images and videos. For example, audio will be muted by default, and videos will be displayed in black and white.
Moderators who view graphic and disturbing content on a daily basis will also get access to weekly one-on-one coaching sessions with a licensed mental health professional. Moderators experiencing a mental health crisis will get access to a licensed counsellor within 24 hours, and Facebook will also make monthly group therapy sessions available to moderators.
“We are grateful to the people who do this important work to make Facebook a safe environment for everyone,” Facebook was quoted as saying in a statement. “We’re committed to providing them additional support through this settlement and in the future.”
During the coronavirus pandemic, social networking giants including YouTube, Twitter and Facebook have increasingly relied on artificial intelligence and automated tools to police material posted to their platforms.