3 min read

Volunteer mods, auto-deleting #EndSARS and Eli on digital spaces

The week in content moderation - edition #85

Welcome to Everything in Moderation, your weekly newsletter about content moderation on the web, handcrafted by me, Ben Whitelaw.

If you were playing moderation bingo this week, you would’ve got a full house (it’s also why I’m postponing revealing more about my ‘Meet the moderators’ series). Here’s a recap of the last seven days — BW

📜 Policies - company guidelines and speech regulation

Section 230 — the US law that affords internet companies specific protections — was at the heart of Wednesday's US Senate committee hearing attended by the CEOs of Twitter, Facebook and Google.

Just like the last big tech hearing in July, all three took a hammering from across the political divide about their respective approaches to content moderation, without a great deal of agreement about what comes next (Repeal of Section 230? Reform? Some kind of conditional cover?). It gave the sense, as Shira Ovide at the New York Times noted, of being ‘not the hallmark of a serious exercise in policy-making’. If nothing else, just enjoy the beard-off between Jack Dorsey and Ted Cruz.

Props to The Chicago Maroon, the wonderfully named student newspaper at the University of Chicago, which has expanded its comment moderation policy to give readers ’the opportunity to discuss our work’. All comments will now be reviewed by a human moderator (it can be done) and threads will close after 48 hours as standard. Sensible changes, I’d say.

💡 Products - features and functionality

A 'technical system error’ in Facebook and Instagram’s content matching service meant that a photo of a Nigerian flag drenched with blood, shared as part of the #EndSARS protests, was marked as false and removed.

I’ve written before about the ’technical glitch’ excuse (EiM #67) and this is just another example. But, more than that, it's a reminder of how little we know about the state of Facebook's moderation efforts in Africa, where it has a reported 200 million active users.

💬 Platforms - dominant digital platforms

Vice went undercover at ‘RClub’, the group of volunteer reviewers working for porn site xHamster, and found a lax system of moderation where underage content is waved through and voyeurism is permitted. Somehow not at all shocking.

Two former moderators who worked on behalf of Facebook have come forward to allege that the platform underpays and mistreats staff. In doing so, Viana Ferguson and Alison Trebacz join a host of other former mods — including Selena Scola and Chris Gray — who have spoken out about Facebook's failure to protect them and their colleagues from viewing distressing content. Not that the platform has learnt from these lessons - just this week, it signed a four-year deal with third-party contractor Covalen to moderate content out of its Dublin office.

Finally for this section: will the last platform that bans QAnon please turn out the light? This week, it was Patreon, which banned ‘a small number’ of creators who continue to push the conspiracy theory.

👥 People - those shaping the future of content moderation

Eli Pariser has been thinking deeply about inclusive digital spaces for a long time and this recent piece in Wired — in which he uses Fort Greene Park in Brooklyn to reflect on online communities — reminded me of his work. I wrote about his co-founding of Civic Signals — a community of researchers and designers — in this piece about public vs non-public space (EiM #43). With the US election just days away, it’s more relevant than ever.

Finally, Ankhi Das, the Facebook India public policy director at the centre of the Wall Street Journal story about preferential treatment of India's ruling politicians (EiM #78), has this week quit to ‘pursue her interest in the public service’. Not exactly surprising after the backlash she and the company received, but it will put pressure on Ajit Mohan, her boss and the head of Facebook India, who I featured in a recent EiM.

🐦 Tweets of note

  • "I want to take a moment to say that understanding the tribulations of content moderation takes a library, so read down the thread" - Jillian C. York marks the release of her book with a useful thread of resources.
  • "I am SO FASCINATED in the line between ‘editing’ and ‘moderation’" - journalism professor Emily Bell reflects on how Substack might deal with content moderation issues.
  • "While quarantine/bans can disrupt recruitment, they just displace the core group elsewhere" - From last week but still good: J. Nathan Matias explains, through the lens of r/The_Donald, why banning isn't a panacea.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.