
Billy Perrigo on investigating Facebook's 'ethical' outsourced content moderation in Kenya

Covering: outsourced content moderation in Africa and the challenges of unionising
Billy Perrigo, TIME journalist

'Viewpoints' is a space for people working in and adjacent to content moderation and online safety to share their knowledge and experience on a specific and timely topic.

Support articles like this by becoming a member of Everything in Moderation for less than $2 a week.


Content moderation rarely makes it into the headlines of major Western media outlets, especially when it comes to how platforms review content from the African continent. But recently, a story about exactly that was on the front cover of TIME magazine.

Yes, the story was about Facebook — a US-headquartered company with a $490 billion market cap and three billion monthly active users globally — but it centred on Daniel Motaung, a 27-year-old South African university graduate who moved to Nairobi to work for one of Facebook's moderation partners, 'ethical' outsourcing company Sama. I don't know for sure but I'd bet that it's the first time that a content moderator has been on the front cover of a global magazine.

The reporter who spoke to Daniel and more than a dozen other Sama employees, past and present, was TIME's Billy Perrigo. As he explains in this recent Tech Empire podcast, he initially set out to find out who was doing content moderation of Ethiopian content (EiM #79) before stumbling across the secretive moderation office in Kenya's capital. It was there he found a workplace full of fear and intimidation, and staff who exhibited evidence of post-traumatic stress disorder from viewing graphic and disturbing content. Hearing from Daniel on a recent panel discussion only brought that home.

I reached out to Billy to find out more about how the story developed and what it means for moderators, both in Africa and more widely. He had some interesting things to say, particularly about the reasons Facebook, and its parent company Meta, decided to use third-party outsourcing companies for moderation.

The interview has been lightly edited for clarity.


Can you explain the story in a nutshell and why it matters?

Yeah, sure. So the story, in a nutshell, is that through a so-called ethical AI company, Facebook was operating a content moderation facility in Kenya that nobody knew about. In this facility, which was dealing with content from all over sub-Saharan Africa, employees were being paid as little as $1.50 an hour to moderate some of the worst content on Facebook. My investigation found that not only were these some of the lowest-paid workers for Facebook anywhere in the world, but also that there was an attempt by those workers to unionise in 2019, which Sama, the so-called ethical AI company, took measures to prevent. In doing so, they fired Daniel Motaung, the leader of the attempted strike, and said in his dismissal letter that he had put Sama's relationship with Facebook at great risk. They say they fired him for bullying, harassing and coercing his colleagues, but he maintains that they dismissed him because he was a risk to their business.

Things have moved on since then. What's the latest?

Daniel has filed a case against both Meta and Sama in the Kenyan courts, accusing them of multiple violations of the Kenyan constitution, including claims that he was unfairly dismissed, that the workers were employed under practices that amounted to forced labour and that, for the workers who travelled or were flown by Sama from other countries to Kenya to do this work, this amounted to human trafficking.

There are many other allegations in the lawsuit as well, including poor working conditions and claims that Sama and Facebook have not provided adequate mental health care. Sama denies the allegations and so does Meta. Meta is seeking to have its name struck entirely from the suit by arguing that Daniel was employed by Sama, not by Meta.

You spoke to Daniel extensively for the piece. What is your impression of the effect that working for Sama on behalf of Facebook has had on him?

Since the publication of the first story in February, Inside Facebook's African Sweatshop, Daniel has been connected to a therapist through The Signals Network, which is the whistleblower protection NGO that I collaborated with for the story. He has since been diagnosed with severe PTSD and is receiving therapy for that.

I think it speaks volumes that he had not been able to receive such a diagnosis before, even though he was clearly struggling. That was very clear from my conversations with him from the get-go. He is still trying to rebuild his life. He hasn't had another job since being fired from Sama in 2019. Sama was his first job out of university and he took it in the hopes that it would be the first step for him to build a better life for his family and to bring them out of poverty and he says that that now looks less possible than ever before. He is living in pretty dire circumstances.

I mean, honestly, I think people can answer that question themselves after reading the stories.

BECOME A MEMBER
Viewpoints are about sharing the wisdom of the smartest people working in online safety and content moderation so that you can stay ahead of the curve.

They will always be free to read thanks to the generous support of EiM members, who pay less than $2 a week to ensure insightful Q&As like this one and the weekly newsletter are accessible for everyone.

Join today as a monthly or yearly member and you'll also get regular analysis from me about how content moderation is changing the world — BW

The story also gets to the heart of the debate about the role of artificial intelligence in content moderation. What have you learned from reporting it out?

Fantastic question. It's clear from how secretive this office was, and from how much Facebook talks about AI in its quarterly transparency reports, that Facebook's PR puts a huge emphasis on what its AI tools are doing. And it does do lots of automated removals, but what is very clear is that it also relies extensively on human content moderation. In fact, it's scaling up its human content moderation, not scaling it down. It's incredibly inconvenient for Facebook's narrative that it's one of the world leaders in AI for the very human nature of this labour to be front and centre in the public conversation.

That may explain why it hires its human content moderators by and large through third-party outsourcing companies rather than in-house, and it may explain why all of those content moderators have to sign very, very restrictive NDAs that make it incredibly risky for them to speak out and tell their stories.

What's the alternative to outsourcing content moderation? If you worked for a platform, what would you do?

Well, to the first part of that question, the alternative to outsourcing content moderation is insourcing it. In fact, one of the requests made in the lawsuit Daniel has brought against Sama and Meta is to stop outsourcing the work, bring it in-house and give the content moderators the same pay that Facebook's own staff receive for moderating such content. Facebook's in-house staff do occasionally need to look at this content, and Facebook recognises that that is an incredibly traumatic thing to do, so it provides not only higher pay but also, allegedly, far higher quality mental health care. Daniel and the organisations representing him, Foxglove and Nzili and Sumbi Advocates, argue that insourcing content moderation is a far superior, but more costly, alternative to outsourcing.

It's difficult because the workforce is splintered and there are NDAs everywhere you look. I think you're correct to say that we're seeing a wave of tech unionisation in the US, but I think it's less likely that you will see moderators going out on their own and organising because of the precarious position that they're in. Where we might see change is if we get these kinds of increasingly broad-based coalitions, where richer, more privileged tech workers, in software engineering for example, realise that building a broad coalition with their less privileged colleagues is actually a more effective way of forcing the companies to act than simply unionising by themselves.


Want to share learnings from your work or research with 1000+ people working in online safety and content moderation?

Get in touch to put yourself forward for a Viewpoint or recommend someone that you'd like to hear directly from.