African tech workers press global social media giants for better conditions

Mophat Okinyi, once a content moderator for ChatGPT, is seeking to unionize his former colleagues to improve their pay and conditions.

Carlos Mureithi

July 12, 2023

Mophat Okinyi hates remembering the job he used to do for ChatGPT.

For about $1.50 an hour, he read hundreds of descriptions of pedophilia and incest for the artificial intelligence platform every day. As a quality analyst, his job was to confirm that his subordinates had read and classified potentially harmful content correctly.

"Some of these things are very shocking," says Mr. Okinyi. "We shouldn’t even talk about some of these texts."

Why We Wrote This

Tech giants like Facebook and YouTube have been accused of taking advantage of weaker labor laws in Africa. Now content moderators are using legal avenues to win fair pay and support.

The work left him so traumatized, he says, that he drifted apart from his family and eventually separated from his wife. “If you put so much dirty content in your mind, it changes you,” he says.

Now, Mr. Okinyi and some 150 other content moderators working for tech powerhouses, including Facebook and TikTok, hope to form a union to improve their pay and working conditions.


“We're trying to make this job safe for those who will do it in future and those who are doing it right now,” says Mr. Okinyi.

Their decision to unionize shines a spotlight on the way tech giants use human labor in Africa, where, through outsourcing, they hire hundreds of people to remove harmful content from their platforms. African content moderators hope to force tech corporations to provide adequate mental health care and fair pay for everyone who works for them – including nontraditional employees such as themselves.

Current and former African content moderators working for global tech companies vote to form a trade union, May 1, 2023.
Courtesy of Foxglove

“Unionization signals that gig work rights are labor rights, and workers deserve the protections provided by law in this field,” says Nanjira Sambuli, a Nairobi-based tech and international affairs fellow at the Carnegie Endowment for International Peace.

Since last year, Facebook’s parent company, Meta, has been facing lawsuits brought by content moderators in Kenya accusing it of union busting, wrongful terminations, and insufficient psychological support, among other infringements. In one case, Meta claimed it was not the moderators’ employer, and was therefore not liable. The court ruled Meta was the “true employer” and the “owner of the digital work of content moderation.”

“That’s the most significant labour rights decision about content moderation I have seen from any court anywhere,” says Cori Crider, co-founder and director of Foxglove, a London-based non-profit that’s providing legal advice to the moderators. “If Facebook is held the true employer of these workers, then the days of hiding behind outsourcing to avoid responsibility for your critical safety workers are over.”


But the decision isn’t set in stone yet, as Meta awaits the outcome of an appeal.

Outsourcing responsibility 

For years, tech platforms have faced intense criticism around the world for failing to filter divisive content. In Africa, Facebook came under fire last year for alleged inaction over hateful material that eventually incited violence during the war in northern Ethiopia. A study last month by Global Witness found that extreme and hate-filled ads were approved by YouTube, Facebook, and TikTok in South Africa, where xenophobic violence has flared up in recent years.

As a result, tech powerhouses have invested heavily in removing material including hate speech, misinformation and incitement to violence from their platforms. Many of the workers who undertake this vital but gruelling task are hired through outsourcing companies and are based in countries like Kenya, India and the Philippines, which supply quality labor at cheap prices.

In Nairobi, a regional tech hub, outsourcing companies bring talent from numerous African countries to moderate content in different African languages. They include Sama, a San Francisco-headquartered company, which has contracted workers for Facebook and ChatGPT. Majorel, headquartered in Luxembourg, hires labor for Facebook and TikTok.

Signs and sticky notes, at the meeting when current and former content moderators voted to form a union, suggest names for the new organization.
Courtesy of Foxglove

Global tech companies believe that by outsourcing, they can escape responsibility, says Odanga Madung, a senior researcher at the Mozilla Foundation in Nairobi, whose work focuses on the impact of tech platforms in Africa.

“Irresponsibility has always been good business in the capitalist contexts,” he says. “Taking care of people is expensive, more so if you’re exposing them to graphic content on behalf of your users.”

Accusations of exploitation of content moderators are not unique to Africa. Moderators in the US and Ireland have in the past sued Facebook for mental health issues related to their work. In Germany, a Berlin-based trade union called Verdi has recently been helping content moderators for TikTok and Facebook to unionize. 

But the move by African workers to form a union is a novel approach outside the West. At the top of members’ list is having regular, professional mental health checkups, and having their pay standardized with that of their peers across the world.

After graduating from university, Mr. Okinyi joined Sama in Nairobi in 2019 for his first job. He worked on various projects for different foreign tech companies, doing data labeling, product classification and other tasks. But it was his content moderation work for ChatGPT, starting in 2021, that affected him in unforeseen ways.

ChatGPT is a chatbot that takes in a user’s question and, using a language model built from text gathered across the web, provides an answer. Critics say its reliance on mining the internet makes it vulnerable to toxic material.

For the six months that he worked on ChatGPT, Mr. Okinyi’s work began early, ended late, and left him emotionally drained. 

Every day at work, he read some 700 texts about child sexual abuse and flagged them according to their severity. Over eight-hour shifts of reading and labeling this material, he enabled ChatGPT to filter out harmful requests.

Although Sama provided counselors, Mr. Okinyi says, productivity demands at work meant he and other workers barely had time to see them.

OpenAI, ChatGPT’s developer, didn’t respond to a request for comment.

Kauna Ibrahim, a former content moderator for Facebook, poses for a photo in Nairobi, Kenya, June 13, 2023.
Carlos Mureithi

The situation was equally appalling for Facebook content moderators at Sama. Kauna Ibrahim, a Nigerian, spent four years watching hundreds of horrific videos every day at work, including sexual abuse and beheadings. For roughly $3 an hour, she assessed whether the videos were in violation of Facebook’s policies.

During her first year of work, she began suffering panic attacks.

“Some of the images never leave you. You find yourself unable to sleep. Sometimes you dream of what you have seen,” says Ms. Ibrahim, who was a graduate student in clinical psychology at the time. “But because you do it every day, you just survive.”

Sama’s therapists weren’t qualified and didn’t provide enough psychological support, Ms. Ibrahim says. So she resorted to seeking her own therapist.

Sama says it provides “qualified and licensed” professionals to provide therapy for its workers, and that it uses an “internationally-recognized” methodology to set wages for its workers, making its pay “internationally comparable and locally specific.”

Meta declined to comment because of the ongoing lawsuits.

Ms. Ibrahim was among 260 workers whose contracts were terminated in March 2023 after Sama stopped doing work for Facebook. Sama’s work for ChatGPT ended in March 2022, and Mr. Okinyi later moved to Majorel to do customer service work for a European e-commerce company.

On Labor Day this year, both Mr. Okinyi and Ms. Ibrahim sat alongside about 150 other content moderators for Facebook, TikTok and ChatGPT in a Nairobi hotel, and voted to unionize. Mr. Okinyi understands the fight may not be over, but he’s willing to keep pushing with his colleagues to ensure their voices are heard.

“We want to be united because if we're united, we become strong,” he says.