Nearly 200 former content moderators for Facebook are suing the company and a local contractor in a court case in Kenya that could have implications for the work worldwide.

Meta, Facebook’s parent company, is facing a $1.6 billion lawsuit from content moderators in Kenya over poor working conditions, insufficient mental health support and low pay for moderators.

The content moderators told The Associated Press that their job required them to watch horrific content for eight hours a day, work that overwhelmed many of them, for pay of 60,000 Kenyan shillings, or roughly $414, a month.

They accused Sama, a San Francisco-based subcontractor that describes itself as an ethical AI company, of doing little to ensure that professional post-trauma counseling was offered.

The suit was brought by 184 moderators from several African countries.


The job entailed screening user content in 12 African languages and removing any uploads deemed to breach Facebook’s community standards and terms of service.

Sama laid the moderators off earlier this year when it exited the content moderation business.

The lawsuit in Kenya is the first known court challenge of its kind outside the United States. In 2020, Facebook agreed to pay $52 million to U.S. content moderators who filed a class action lawsuit after they were repeatedly exposed to beheadings, child sexual abuse, animal cruelty, terrorism and other disturbing content.

Under the terms of the deal, more than 10,000 content moderators who worked for Facebook from sites in four states will each be eligible for $1,000 in cash. In addition, those diagnosed with psychological conditions related to their work as Facebook moderators can have medical treatment covered, as well as additional damages of up to $50,000 per person.

The African moderators expressed despair as money and work permits run out and they wrestle with the traumatic images that haunt them.

“If you feel comfortable browsing and going through the Facebook page, it is because there’s someone like me who has been there on that screen, checking, ‘Is this okay to be here?’” Nkunzimana, a father of three from Burundi, told The Associated Press in Nairobi.

The 33-year-old likened content moderation to soldiers taking a bullet for Facebook users: workers watch harmful content showing killing, suicide and sexual assault and make sure it is taken down.

For Nkunzimana and others, the job began with a sense of pride, feeling like they were “heroes to the community,” he said.

But as the exposure to alarming content reignited past traumas for some like him who had fled political or ethnic violence back home, the moderators found little support and a culture of secrecy.

They were asked to sign nondisclosure agreements, and personal items like phones were not allowed at work. Facebook and Sama defended their employment practices in June.

Meta said its contractors are contractually obliged to pay their employees above industry-standard rates in the markets where they operate and to provide on-site support from trained practitioners. A company spokesman said Meta could not comment on the Kenya case.

In an email to the AP, Sama said the salaries offered in Kenya were four times the local minimum wage and that all employees had unlimited access to one-on-one counseling.

In countries like Kenya, where cheap labor is plentiful, the outsourcing of such sensitive work is "a story of an exploitative industry predicated on using global economic inequity to its advantage, doing harm and then taking no responsibility because the firms can say, 'Well, we never employed so-and-so, that was, you know, the third party,'" Sarah Roberts, an expert in content moderation at the University of California, Los Angeles, told the AP.

The Employment and Labor Relations Court said the parties have 21 days to reach a settlement and that former Kenyan chief justice Willy Mutunga and labor commissioner Hellen Apiyo would co-mediate the dispute. (GIN)