
📌 Put content moderators on staff, now

The week in content moderation - edition #68

Hello everyone and thanks for finding your way into another Everything in Moderation.

There are no new subscribers to welcome this week, but please do share this newsletter with others so a) they can sign up and b) I can give them a shout-out here next week. Your support would certainly mean a lot.

If you need a soundtrack for your Friday afternoon, have a listen to Eileen Donahoe, the executive director of the Global Digital Policy Incubator at Stanford University, on Lawfare’s very good Arbiters of Truth podcast.

Stay safe and thanks for reading — BW

PS Do you have 30-45 minutes for an online video call about why you read Everything in Moderation and how it can be improved? Hit reply or book some time in my calendar. A special thanks to Stephanie and Samuel for agreeing to chat with me this week.


🧭 Hire mods, elevate them: NYU report summary

Recent days have been tough for Facebook, but this Monday was particularly rough.

First, an exclusive Guardian article quoted third-party moderators as having witnessed “an increase in hate speech and racism, not only in our queues but also amongst ourselves” as a result of the platform's decision not to act on Donald Trump's violent speech. Then a new report out of New York University criticised the platform's decision to underfund and outsource its content moderation and suggested it double the number of moderators to 30,000, whatever the cost. Ouch.

Having attracted widespread media coverage throughout the week, the report — ‘Who Moderates the Social Media Giants?’, authored by NYU’s Paul M. Barrett — is likely to put pressure on all social networks to reconsider their subcontracting of content moderation to the likes of CPL, Genpact, Accenture and others. It also touches on many of the issues discussed in this newsletter over the last 18 months.

I recommend reading the 32-page report in full but, for those who don’t have time, I’ve tried to summarise the main points below (with links to past EiMs where relevant):

  • Barrett's central thrust — and one that I agree with — is that outsourcing leads to worker exploitation and, where content moderation is concerned, ‘jeopardises optimal performance of a critical function’ (EiM #41). He argues convincingly, in the case of Facebook, that its $70.7 billion in 2019 revenue is "robust enough to allow the company to achieve better, more humane content moderation”.
  • He focuses on Facebook’s moderation efforts because it is ‘the largest competitor in its segment of the industry’ and ‘a trend-setter in content moderation’. However, like other platforms, Facebook refused to provide access to any of its moderation sites for the report.
  • The solution to a rise in hate speech, violence and fake accounts is threefold, according to Barrett (who is also the assistant managing editor at Bloomberg):
  1. Bring content review closer to the core of their corporate activities
  2. Increase the number of human moderators (even while continuing to refine AI screening software)
  3. Elevate moderators’ status to match the significance of their work
  • Barrett also includes eight practical recommendations for Facebook to follow, including hiring a moderator-in-chief to report to Mark Zuckerberg and expanding the number of countries that have local moderators.
  • Over 32 pages, the author also touches on the Oversight Board (EiM #63), Arun Chandra’s role (EiM #27), the failings of supply-chain moderation (EiM #57), the prevalence model discussed in Facebook’s February white paper (EiM #52), and why the kind of content regulation legislation seen in Germany (EiM #13) and France (EiM #29) is not the route to go down in the US.

Have you read the report? I'm interested to hear from you about whether ending the outsourcing of content review is a) the right way to go and b) realistic. Reply and let's continue the conversation.

💲 You can take your money

If Facebook's number crunchers are worried about the costs of Barrett’s plan, they need not fear — US non-profits are beginning to decline its cash because of its policy on content moderation.

This week, the Open Technology Institute — the technology program of the New America think tank — said it would no longer take funding from Big Blue as part of a commitment 'to hold companies and ourselves accountable'.

What dent will that make? Well, in 2019, Facebook contributed $300,000 to OTI, according to the charity’s website, including $50,000 to its Wireless Future project. 27 organisations contributed more than $1 million to New America during the same period.

Public Knowledge, the US non-profit focused on freedom of expression and internet policy, followed suit. It received ‘$25,000+’ from Facebook in 2018/2019, according to its records.

These organisations, which shape US policy and have a stake in deciding how social networks are regulated, will miss the money, but the loss of their influence will hurt Facebook too. Maybe that will lead to a change in its content guidelines?

⌛️ Not forgetting...

Over 500 subreddits have written an open letter to Reddit CEO Steve Huffman with a list of demands to address hate speech on the platform.

500+ Reddit Communities Demand the Site Take Real Action to Fight Racism | PCMag

'These continued statements that you hear us, that this is a priority, or that you are working on it are not enough,' moderators write in an open letter to Reddit CEO Steve Huffman.

See also: Casey Newton over at The Verge on why recent weeks have been taxing for volunteer moderators. I’ll hopefully have a special Q&A on this next week.

Filing this under ‘we’re all moderators now’: A popular NZ Facebook group with 150,000 members has been criticised for 'censoring content related to racism’.

The Vic Deals community is imploding over claims of racism and hypocrisy | The Spinoff

With more than 151,000 members, Wellington’s Vic Deals is one of the largest community Facebook groups in New Zealand. But in the last few days, the group’s team of administrators has landed itself in hot water after being accused of censoring content related to racism, colonisation, and Black Lives Matter.

Joe Biden doesn’t let knowing zilch about moderation (see EiM #48) stop him calling for changes to Facebook’s moderation policies.

Joe Biden Takes Aim at Facebook's Moderation Policies - Bestgamingpro

Twitter took down over 30,000 accounts linked to three countries (China, Russia and Turkey). The data was shared with the Stanford Internet Observatory, which has written some interesting analysis.

Analysis of June 2020 Twitter takedowns linked to China, Russia, and Turkey

Finally, you might want to think twice about shaving your hair off, even if your lockdown barnet is bad. Facebook banned over 200 skinhead users, reportedly because it believed they were linked to far-right groups. A bald move, indeed.

Facebook Deplatforms Hundreds of Anti-Racist Skinheads and Musicians

'They wanted to see my ID before they would give me my account back.'


Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.