Hundreds of former Facebook content moderators in Kenya have accused Meta of causing “potentially lifelong trauma,” with more than 140 diagnosed with post-traumatic stress disorder (PTSD) and other mental health conditions. The diagnoses were filed with Nairobi’s Employment and Labour Relations Court on December 4 as part of a lawsuit against Meta and its outsourcing partner, Samasource Kenya (now Sama).

The Allegations

  • Graphic Exposure: Moderators reported exposure to disturbing content, including violent murders, suicides, child abuse, and explicit sexual material. This led to severe psychological distress, according to Dr. Ian Kanyanya of Kenyatta National Hospital, who assessed the moderators.
    • Findings: Of 144 moderators assessed, 81% were classified as having “severe” PTSD.
  • Testimonies: Moderators described vivid nightmares, flashbacks, and conditions such as trypophobia (an aversion to clusters of small holes or repeated patterns) triggered by graphic imagery.
  • Mass Redundancy: In 2022, 260 moderators at Sama’s Nairobi hub were laid off after raising concerns about working conditions and pay.

Meta and Sama’s Responses

  • Meta: While declining to comment on the case, the company said it takes the well-being of moderators seriously and that its contracts with outsourcing firms include provisions for counseling, training, and fair pay. Moderators also have access to tools that blur or desaturate graphic content.
  • Sama: The outsourcing firm did not respond to requests for comment.

Legal Proceedings and Historical Context

  • The lawsuit follows a 2022 case brought by a former moderator who alleged unlawful dismissal after organizing protests over working conditions.
  • Nzili and Sumbi Associates, the law firm representing the moderators, has argued that Meta and Sama failed to protect workers from the mental health risks of the job.

Broader Implications

This case highlights ongoing concerns about the mental toll of content moderation, a role essential to maintaining social media platforms but often outsourced to low-cost labor markets. Foxglove, a UK-based nonprofit supporting the case, criticized Meta for outsourcing harmful work while ignoring its human impact.
Foxglove’s Martha Dark stated, “If this level of trauma occurred in any other industry, accountability measures would be swift and severe.”

Context of Similar Cases

  • TikTok: The platform faced lawsuits in 2021 and 2022 over claims of psychological trauma experienced by moderators.
  • Industry-Wide Issue: Content moderation roles across tech companies are increasingly under scrutiny for their impact on mental health.

As this legal battle unfolds, it underscores the ethical and legal responsibilities of tech giants to safeguard their workers’ mental health. The court’s decision could set a precedent for accountability in the content moderation industry.