Campaigners have accused Meta, the parent company of Facebook, of causing “potentially lifelong trauma” to hundreds of content moderators in Kenya. Over 140 moderators have been diagnosed with PTSD and other mental health conditions.
The diagnoses were made by Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital in Nairobi, and filed with the city's Employment and Labour Relations Court on December 4.
The law firm Nzili and Sumbi Associates submitted the medical reports as part of an ongoing lawsuit against Meta and Samasource Kenya, an outsourcing company contracted to review content for Meta.
The Role of Content Moderators
Content moderators are tasked with removing distressing content from tech platforms. They are often employed by third-party firms in developing countries, and the psychological toll of their work is a growing concern.
Meta’s Response
Meta declined to comment on the medical reports, citing the ongoing litigation, but said it takes moderator support seriously. Its contracts with third-party firms include provisions for counseling, training, and fair pay.
A Meta spokesperson added that moderators can customize the content review tool to reduce their exposure to graphic material, for example by blurring images or displaying them in black and white.
Samasource’s Silence
Samasource, now known as Sama, did not respond to requests for comment on the allegations.
The Extent of the Problem
Dr. Kanyanya described extremely graphic content that moderators encountered daily, including videos of murders, self-harm, suicides, sexual violence, explicit sexual content, and child abuse.
Among the 144 moderators who underwent psychological assessments, 81% were diagnosed with severe PTSD, according to Dr. Kanyanya’s findings.
Retaliatory Dismissals
In 2022, all 260 content moderators at Samasource Kenya's Nairobi hub were dismissed after advocating for better pay and working conditions, according to a report by Foxglove, a legal campaign group backing the moderators' case.
The moderators involved in the current lawsuit worked for Samasource Kenya between 2019 and 2023, according to court documents.
Individual Impact
One moderator reported developing trypophobia, an aversion to clustered patterns of small holes or bumps, after viewing an image of maggots emerging from a decomposing human hand.
The Call for Accountability
Martha Dark, co-executive director of Foxglove, stated that “moderating Facebook is dangerous, even deadly, work that causes lifelong PTSD for nearly everyone who does it.”
Dark contends that if these diagnoses were documented in another industry, those responsible would be compelled to step down and face legal consequences for severe human rights abuses.
A Broader Pattern
This is not an isolated case. Content moderators across social media platforms have taken legal action against their employers, claiming that their jobs caused psychological trauma.
In 2021, a content moderator for TikTok filed a lawsuit alleging that she developed psychological trauma from her work.
A year later, TikTok faced a second lawsuit from former content moderators making similar claims.
Conclusion
Mental health among content moderators remains a critical concern. As the volume of content on social media platforms continues to grow, so does the responsibility to protect the well-being of the people tasked with upholding community standards.
What steps should Meta and other tech giants take to address this issue?