Meta Faces Lawsuit: Kenyan Moderators Allege “Severe” PTSD from Graphic Content
An important legal battle is brewing against Meta, the parent company of Facebook. More than 140 Kenyan content moderators are suing the tech giant, alleging severe psychological trauma stemming from their work reviewing graphic content. The lawsuit, filed in Nairobi, claims that prolonged exposure to violent and disturbing material led to widespread diagnoses of post-traumatic stress disorder (PTSD), anxiety, and depression.
The moderators, employed by Samasource Kenya (now Sama), a third-party contractor for Meta, were tasked with filtering harmful content from the Facebook platform. Their work, according to medical reports filed with the court, involved daily exposure to “extremely graphic content,” as described by Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital. Dr. Kanyanya’s assessment revealed that 81% of the 144 moderators who volunteered for psychological evaluations suffered from “severe” PTSD.
“Extremely graphic content on a daily basis which included videos of gruesome murders, self-harm, suicides, attempted suicides, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions just to name a few,” Dr. Kanyanya stated in his report. The sheer volume and intensity of this material, the lawsuit argues, caused significant and lasting mental health damage.
The lawsuit, filed on December 4, 2023, is a class action representing 185 moderators. It highlights concerns about the ethical implications of outsourcing such emotionally taxing work to developing countries, often with less stringent worker protections. While Meta declined to comment directly on the medical reports due to the ongoing litigation, the company stated that it takes the well-being of its moderators seriously and that its contracts with third-party firms include provisions for counseling, training, and fair compensation. It also noted that moderators have tools to customize their content review experience, such as blurring or desaturating graphic images.
This case raises critical questions about corporate responsibility and the mental health impacts of moderating online content. The high prevalence of severe PTSD among these moderators underscores the need for greater protections and support for individuals performing this essential, yet emotionally demanding, work. The outcome of this lawsuit could have significant implications for the tech industry and its approach to content moderation globally.
Former Facebook Moderators Sue, Claiming Trauma and Unjust Dismissal
A new lawsuit filed by former Facebook content moderators in Kenya alleges severe psychological trauma and unlawful dismissal after they protested unsafe working conditions. The case, supported by the UK-based non-profit Foxglove, highlights the devastating impact of content moderation on mental health and raises serious questions about corporate responsibility.
The lawsuit, launched in 2022, centers on the experiences of moderators employed by Samasource Kenya between 2019 and 2023. According to Foxglove, all 260 moderators at Samasource Kenya’s Nairobi hub were made redundant last year, a move it describes as “punishment” for raising concerns about pay and working conditions.
Court documents detail the harrowing experiences of these moderators. One medical record, obtained by CNN, describes a moderator waking up in “cold sweats from frequent nightmares” directly related to the graphic content they reviewed. This led to “frequent breakdowns, vivid flashbacks, and paranoia.”
Another former moderator recounted developing trypophobia, a fear of clusters of small holes, after viewing an image of maggots on a decomposing hand. These accounts paint a stark picture of the psychological toll exacted by the job.
“Moderating Facebook is dangerous, even deadly, work that inflicts lifelong PTSD on almost everyone who moderates it,” said Martha Dark, co-executive director of Foxglove. “In Kenya, it traumatized 100% of hundreds of former moderators tested for PTSD… Facebook is responsible for the potentially lifelong trauma of hundreds of people, usually young people who have only just finished their education.”
Dark further argued that if these diagnoses occurred in any other industry, those responsible would face “legal consequences for mass violations of people’s rights.” This lawsuit is not an isolated incident; similar legal actions have been filed against other social media giants.
In 2021, a TikTok content moderator sued the platform for psychological trauma, followed by another lawsuit in 2022 involving multiple former moderators. These cases underscore a growing concern about the mental health consequences faced by individuals tasked with policing online content.
This lawsuit serves as a critical reminder of the human cost behind the curated online experience. It highlights the urgent need for social media companies to prioritize the well-being of their content moderators and implement robust support systems to mitigate the risks associated with this demanding work.