Fined at the end of May in Europe over data protection violations, the Meta group is under attack on another front in Kenya: the work of content moderators, the behind-the-scenes employees responsible for removing violent and hateful posts from Facebook.
- Three complaints -
In Kenya, three complaints have been lodged against Meta and the Californian company Sama, to which the group that owns Facebook, WhatsApp and Instagram subcontracted content moderation for sub-Saharan Africa on its social network between 2019 and 2023.
Two were filed by content moderators employed by Sama in Nairobi. Their job was to review posts and remove from Facebook those that were violent, incited hatred or spread misinformation.
When contacted by AFP, neither Sama nor Meta would comment on the current cases.
An initial complaint was lodged in May 2022 with the Employment and Industrial Relations Tribunal by a South African, Daniel Motaung.
He complained of "inhumane" working conditions, misleading hiring methods, irregular and inadequate pay, and a lack of psychological support in the face of the trauma caused by the work. He also claimed to have been dismissed after trying to form a trade union.
The case has not yet gone to trial.
In March, a second complaint was filed by 184 other employees claiming to have been wrongfully dismissed by Sama, which has announced that it will cease its content moderation activities. They are seeking compensation for salaries "insufficient (for) the risk to which they were exposed" and for "damage to their mental health".
Pending a ruling on the merits, the dismissals were suspended on June 2 by the Employment Tribunal, which ordered Meta and Sama to "provide appropriate psychological and medical care to the plaintiffs".
Meta and Sama have announced their intention to appeal.
Another complaint, in December 2022, accuses them of inaction in the face of hate speech, which, according to the plaintiffs, led to the 2021 murder of a university professor in Ethiopia.
AFP is a Meta partner, providing fact-checking services in Asia-Pacific, Europe, the Middle East, Latin America and Africa.
- Meta's responsibility -
These cases are the biggest on the topic of content moderation since a class action launched in 2018 in the United States. In May 2020, Facebook agreed to pay $52 million to moderators as compensation for the effects of their work on their mental health.
The complaints filed in Nairobi take aim at an outsourcing system which, according to its critics, Meta uses to try to evade responsibility.
As it did with Sama in Nairobi, Meta outsources content moderation on Facebook to companies operating in more than 20 locations worldwide, which together process more than two million items daily.