
📌 Procedural justice, bulk deleting and problems moderating Israel/Palestine

The week in content moderation - edition #113

Welcome to Everything in Moderation, your weekly newsletter about the policies, products, platforms and people shaping content moderation. It’s curated and produced by me, Ben Whitelaw.

A special hello to new subscribers from Spectrum Labs, Coadec and the Humboldt Institute for Internet and Society. I hope you all enjoy this week's roundup of moderation news and related reading — BW


📜 Policy - company guidelines and speech regulation

The UK's Online Safety Bill (covered in last week's EiM) continues to win less-than-rave reviews in the aftermath of its draft publication. Mike Masnick on Techdirt says the bill "creates tremendous incentives for excessive censorship and suppression of all sorts of speech to avoid falling on the wrong line" while TechMonitor notes that businesses will face a "huge regulatory burden" to abide by it. Glitch founder Seyi Akiwowo writes in Grazia that there is "still a long way to go". Ten points to any EiM subscriber who finds anything positive about the bill online from someone who isn't a government minister.

Civic Signals has a great Q&A with Tracey Meares, a Yale Law School professor and co-founder of The Justice Collaboratory, which works to transform justice systems. Meares has recently called for the reform of Facebook's Oversight Board, citing its lack of due process and highlighting the importance of "internalised rule following", which we see when drivers stop at red lights late at night:

"You do it because you think that the law itself was adopted by a process that you deem legitimate. You think that the person who is telling you — this is a law that ought to be obeyed — is legitimate so you obey them and it."

Talking of the Oversight Board, there's been another overturning of Facebook's original moderation decision, this time about a Turkish/Armenian genocide meme. Anyone keeping score?

💡 Product - features and functionality

Clubhouse created "a blocking tool more potent, more consequential, and ultimately more contentious than that of any other social platform", according to this piece by Will Oremus for The Atlantic. The system, which gives undue power to users with large followings and has no appeal process, has "created its own array of opportunities for abuse, tactical silencing, and intimidation". I imagine that's not what Clubhouse's product team was going for.

On the topic of managing unruly users, TikTok has rolled out tools to bulk report or delete up to 100 comments at once, as well as to block multiple accounts from the same window. Use it wisely, friends.

💬 Platforms - dominant digital platforms

Facebook announced on Wednesday that it had set up a "special operations centre" staffed with native Arabic and Hebrew speakers to "remove content that violates our community standards faster". It comes after it was accused of censoring content (EiM #112) and following revelations via The Intercept that Facebook's rules on the use of the word "Zionist" make it difficult to criticise the state of Israel. Evidence, if ever it was needed, that moderation is an act of diplomacy.

A bit of positive platform news now: TikTok has partnered with the Trust and Safety Professional Association (TSPA) to ensure its moderators have access to resources, workshops and events. TSPA, which launched in June last year, is funded for three years by 12 platforms, including Facebook, Pinterest and Automattic, as well as a handful of other annual supporters.

👥 People - those shaping the future of content moderation

You might remember that, a few weeks ago, I profiled Zev Burton in this slot for his one-man campaign calling for the release of TikTok's content guidelines (EiM #109). Well, it turns out there are others like him.

In a piece for the Daily Dot, journalist and EiM subscriber Viola Stefanello highlights the work of Abbie Richards (@tofology - 202k followers) and Latinx trans creator Rose Montoya (@rosalynnemontoya - 553k followers) in trying to force the video-sharing platform to make clear how content decisions are made. Work like this by proactive users who want a better online environment for themselves and others often goes unnoticed, and I'm glad Viola has brought it to light.

🐦 Tweets of note

  • "We do know that the Board has a minority that would have liked to say more in the enforceable part of the decision" - Law prof and Yale fellow Chinmayi Arun has penned a good thread on the Oversight Board in the aftermath of the Trump decision.
  • "I think there are some really great elements" - Carly Miller, Research Analyst at the Stanford Internet Observatory, has a useful thread on Facebook's new Transparency Centre.
  • "it looks like TikTok really is amping up its censorship of pole - potentially through an “implied nudity” ban" - Dr Carolina Are (previously featured in EiM #101) has a new paper out on  Instagram's shadowban cycle.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.