Kenya: Former Meta content moderators share their experiences ahead of trial

The Meta logo is seen at the Vivatech show in Paris, France, Wednesday, June 14, 2023. (AP Photo)

Trevin Brownie hasn't forgotten his first day as a Facebook content moderator, on the premises of a subcontracting company based in the Kenyan capital, Nairobi. Hundreds of moderators from various African countries worked at this continental hub, recruited for their knowledge of local languages.

"My first video was of a man committing suicide (...) There was a two- or three-year-old child playing nearby. After the man hanged himself, after about two minutes, he realized something was wrong," says the 30-year-old South African, before describing how the child tried to save the man, his father.

"It made me sick (...) It was like nausea and vomiting. But I carried on doing my job," he continues.

Between 2020 and 2023, he watched hundreds of violent and hate-filled videos every day, blocking them so they would never reach the eyes of Facebook users.

He worked in Nairobi for Sama, the Californian company to which Meta, the parent company of Facebook, Instagram and WhatsApp, subcontracted the moderation of Facebook content for sub-Saharan Africa from 2019 to 2023.

Trevin Brownie says he has seen "hundreds of beheadings", "organs ripped out of bodies", "rape and child pornography of the worst kind", "child soldiers preparing for war"...

"Humans do things to other humans that I would never have imagined," he says: "People have no idea of the unhealthy videos (...) they are escaping" .

Legal battle

Trevin Brownie is involved in one of three cases brought against Meta and Sama, formerly known as Samasource, in Kenya.

Along with 183 other former employees, he is contesting his dismissal by Sama, which has announced that it will cease its content moderation activities. They are seeking compensation for salaries they call "insufficient and out of all proportion (...) to the risk to which they were exposed", as well as for the "damage caused to their mental health".

This legal offensive began when another former Sama content moderator, Daniel Motaung, filed a complaint in a Nairobi court in May 2022, alleging "inhumane" working conditions, misleading recruitment methods, inadequate pay and a lack of psychological support.

Meta, which declined to comment on the details of the cases, said it requires its subcontractors to make psychological assistance available around the clock.

When contacted, Sama said it was "not in a position" to comment on ongoing cases.

Call centers

Testimonies gathered at the end of April from former Sama content moderators, who are among the 184 plaintiffs contesting their dismissal, confirm the allegations made by Daniel Motaung.

Two of them, Amin and Tigist (their first names have been changed), hired by Sama in 2019, said they had responded to offers to work in call centers passed on to them by acquaintances or recruitment companies.

Only once they had signed their contracts, which included confidentiality clauses, did they discover that they would be working as content moderators.

Amin and Tigist didn't object, or even think about leaving. "I had no idea what a content moderator was, I'd never heard of one," says Tigist, an Ethiopian recruited for her knowledge of the Amharic language.

"Most of us didn't know the difference between a call center and a content moderation center", confirms Amin, who worked on the Somali "market". But for "the group recruited after us, the job offers clearly mentioned content moderation", he points out.

"On the first day of training, before showing us the images, they (the trainers) reminded us that we had signed confidentiality clauses" , he recounts.

"During the training, they played down the content. What they showed us was nothing compared to what we were going to see," he adds: "That's when the problems started".

Trauma

On their screens, for eight hours a day, came a stream of content, each item more shocking than the last.

"You don't choose what you see, it's random: suicide, violence, sexual exploitation of children, nudity, incitement to violence..." says Amin.

An "average processing time" of 55 to 65 seconds per video is imposed on them, they claim, representing between 387 and 458 "tickets" viewed per day. If they worked too slowly, they were liable to be reprimanded or even dismissed.

For its part, Meta said in an email that content moderators "are not required to evaluate a set number of publications, do not have quotas and are not obliged to make hasty decisions". "We authorize and encourage the companies we work with to give their employees the time they need to make a decision," it added.

None of the three content moderators interviewed had imagined the effects this work would have on them.

None had consulted a psychologist or psychiatrist, for lack of money, but all said they were suffering from symptoms of post-traumatic stress disorder and from new difficulties in their social interactions and with their loved ones.

Trevin Brownie says he is "scared of children because of the child soldiers, the brutality I've seen children commit", and of crowded places "because of all the videos of bombings I've seen". "I was a party nut," he says. "I haven't been to a club in three years now. I can't, I'm scared."

The lanky Amin, for his part, says he has seen the toll on his body, which has gone from 96 kg when he started the job to "69-70 kg" today.

All say they have become desensitized to death and horror. "My heart has turned to stone," says Tigist.

Need for money

Meta said it has "clear contracts with each of our partners that detail our expectations in a number of areas, including the availability of individual counselling and extra support for those exposed to more difficult content".

"We require all companies we work with to provide 24/7 on-site support with trained practitioners, on-call service and access to private healthcare from day one of employment," the company assured.

According to the content moderators, the support offered by Sama through its "well-being advisors" was not up to scratch: they described vague sessions with no real follow-up and questioned the confidentiality of the exchanges.

"It was of no use. I'm not saying they weren't qualified, but I don't think they were qualified enough to manage people who moderate content", says Amin.

Despite their trauma, they stayed because they "needed the money".

They were paid a salary of 40,000 shillings (285.92 US dollars), with a further 20,000 for non-Kenyans: almost three times the Kenyan minimum wage (15,200 shillings).

"From 2019 until today, I never had the opportunity to get another job elsewhere, even though I applied a lot. I had no other choice. That's why I stayed for so long," explains Amin.

Sacrifice

To keep going, the moderators had to find "defense mechanisms", explains Trevin Brownie.

Some used drugs, particularly cannabis, according to the moderators interviewed.

Trevin Brownie, formerly a comedy fan, immersed himself in horror films. "It was a way of blurring my reality. It allowed me to imagine that what I was dealing with (at work, ed.) wasn't real, even though it was," he reflects, explaining that he also developed an "addiction" to violent images.

"But one of our main defense mechanisms was that we were convinced of the importance of the job," he adds: "I felt like I was hurting myself but for the right reasons, (...) that the sacrifice was worth it for the good of society."

"We are the first line of defense for Facebook, (...) like the social network police," he explains, saying in particular that we remove ads for drug sales or "remove targets" on people targeted by death threats or harassment.

"Without us, social networks can't exist," he adds: "Nobody would open Facebook if it was full of offensive content, selling drugs, blackmail, harassment..."

"We deserve better"

"It does damage, and we sacrifice ourselves for our community, for the world. We deserve to be treated better" , agrees Tigist.

None of them would take this job again. "My personal opinion is that no human should do this. It's not a job for humans," says Trevin Brownie. "Honestly, I wish artificial intelligence could do this job."

Despite the technology's enormous progress, however, he doubts that will be possible anytime soon.

"Technology plays and will continue to play a central role in our content verification operations", Meta assured.

Until now, none of them had spoken about this work, even to their families: partly because of the confidentiality clauses, but also because "no one can understand what we go through".

"If people find out, for example, that I've seen pornography, they'll judge me", explains Tigist.

With her husband, she remained vague about her work. She has kept everything from her children: "I don't want them to know what I've done. I don't even want them to imagine what I've seen."
