3 min read

📌 300 pages of rules, Reddit mods go dark and Keller on transparency

The week in content moderation - edition #105

Welcome to Everything in Moderation, your weekly newsletter about content moderation turned around in good time by me, Ben Whitelaw.

This week's newsletter contains themes that will be familiar to longstanding EiM readers — transparency, outsourced labour and the power of moderators — and reminds me that the big online speech questions have barely begun to be answered.

I've steered clear of covering yesterday's big tech CEO hearing (how many Congress appearances is that now?) and instead linked to the best responses in the 'Tweets of note' section of today's newsletter.

Without further ado, here's your content moderation news from this week — BW


📜 Policies - company guidelines and speech regulation

61 rights groups have written a joint letter criticising an EU law designed to force platforms to remove terrorist content within one hour. Human Rights Watch, Amnesty International and others claim that removing such content is impossible using automated systems and will "ultimately result in the removal of legal content about discriminatory treatment of minorities." The legislation will be voted on next month.

I've probably covered transparency and platform accountability more than any other topic over the last 2.5 years (EiM #5, #55, #71). And yet what transparency means in practice is still vague. So I'm very glad for this honest piece by Daphne Keller, Director of the Program on Platform Regulation at Stanford, about the need to "get a handle on what transparency reporting is." A must-read.

💡 Products - features and functionality

Facebook Groups have "functioned less as communities than as megaphones for partisan publishers and purveyors of 'hate bait'," according to internal reports produced in the wake of January's Capitol riots and seen by the WSJ. As a result, Facebook will remove civic and political groups from its recommendations in markets outside the United States. So much for being the 'digital equivalent of your living room'.

Slack's new Connect feature, which allows any user to message across workspaces, was quickly rolled back after experts criticised the company for not providing a means to block or report harassment. Not ideal, but credit to Slack for its swift response.

💬 Platforms - dominant digital platforms

The leak of Facebook's 300-page content guidelines this week demonstrates once again how difficult it is to be one of its outsourced moderators. Here's a flavour of the detailed rules and recommendations published by The Guardian:

  • Photoshopped Hitler-style moustache? Fine, social commentary.
  • The 👌🏽 emoji? Always constitutes "praise", apparently.
  • An image of a person from a banned organisation without a caption or any praise? Remove.
  • Calling for the death of a minor local celebrity, so long as the user does not tag them in the post? Keep up.

The patience, breadth of knowledge and cultural references needed to do the job are breathtaking. Moderators are heroes.

Substack yesterday published a blog post that further elaborates on how it will moderate content, following the publication of its moderation 'philosophy' in December (EiM #95) and ongoing chatter about whether it is a platform or a publisher. My take is that the post feels defensive and the authors wounded by all the criticism, but I'm interested in what you think. Hit reply and let me know.

YouTube, meanwhile, has announced that videos of staged animal rescues will be banned following a sharp rise in such videos over the last two years. Proof that animals, too, suffer at the hands of badly-thought-through content moderation.

👥 People - those shaping the future of content moderation

The power of Reddit's moderators seemingly knows no bounds. After forcing Steve Huffman to stand up to white supremacy (EiM #69), mods this week forced the company to reverse its hiring of a controversial former UK MP (the story behind it is a maze so, for space reasons, I won't go into it here).

While the debate about volunteer moderation and free labour rumbles on, we can safely say this: Reddit's volunteer mods are holding the company to a standard that would be very difficult to enforce if they were employees or paid via an outsourcing company.

🐦 Tweets of note

  • "If content moderation is worth endless Congressional hearings, it's work worth regulating. Time to clean up social media's factory floor" - Foxglove Legal rolls up its sleeves during the big tech hearing yesterday.
  • "Content moderation can't save us" - former FB civic integrity Sahar Massachi points out that talking about takedowns is actually the easy way out.
  • "How bout you start by getting bianca's live-posted murder off your platform bro. And give her fam back her IG pages." - Online abuse and revenge porn lawyer Carrie Goldberg's thread accompanying yesterday's big tech hearing is gold.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.