
Ex-Facebook content moderator in Kenya sues Meta over poor working conditions

Photo taken on October 18, 2021 in Moscow shows the US online social media and social networking service Facebook. Copyright © africanews / KIRILL KUDRYAVTSEV/AFP


A former Facebook content moderator in Kenya on Tuesday filed a lawsuit against Facebook’s parent company, Meta.

The former employee has accused Meta of poor working conditions, including irregular pay and insufficient mental health support for contracted content moderators.

He worked for the company through Sama, a local outsourcing agency.

The complainant, Daniel Motaung, claims that Sama fired him shortly after he tried to form an employee union.

According to Guardian news reports, the first video Motaung moderated was of a beheading.

Motaung says he suffers from severe post-traumatic stress disorder and that his pay is not enough to cover his mental healthcare.

“I have been diagnosed with severe PTSD (post-traumatic stress disorder),” Motaung told Reuters. “I am living ... a horror movie.”

Motaung’s lawyers said that Meta and Sama created a dangerous and degrading environment where workers were not given the same protections as employees in other countries.

The lawsuit, filed on behalf of a group of moderators, seeks financial compensation.

In response, a Meta spokesperson told Reuters: “We take our responsibility to the people who review content for Meta seriously and require our partners to provide industry-leading pay, benefits and support. We also encourage content reviewers to raise issues when they become aware of them and regularly conduct independent audits to ensure our partners are meeting the high standards we expect.”

Sama, for its part, said it would not comment until it had seen the lawsuit. The company has previously rejected claims that its employees were paid unfairly, that its recruitment process was opaque, or that its mental health benefits were inadequate.

Globally, thousands of moderators review social media posts that could depict violence, nudity, racism or other offensive content. Many work for third-party contractors rather than for the tech companies themselves.

This is not the first time Meta has faced such a suit.

Last year, a California judge approved an $85 million settlement between Facebook and more than 10,000 content moderators who had accused the company of failing to protect them from psychological injuries resulting from their exposure to graphic and violent imagery.

Facebook did not admit wrongdoing in the California case but agreed to take measures to provide its content moderators, who are employed by third-party vendors, with safer work environments.
