
📌 Money vs moderation, Facebook's 'emergency' and #releasetheguidelines

The week in content moderation - edition #109

Welcome to Everything in Moderation, your weekly newsletter about content moderation by me, Ben Whitelaw. New subscribers from Active Fence, Khoros, DCMS and elsewhere - I hope you enjoy your first newsletter.

Last week (EiM #108), I asked you whether a twice-weekly update would help you keep up with the maelstrom of content moderation news. The response was clear: we're already overwhelmed, please keep it as it is. I will do just that.

So here it is, your regular weekly roundup of content moderation news. Hit share or support if you enjoy today's edition - BW


📜 Policy - guidelines, research and regulations

The number of experienced community moderators involved in Twitch streams is a good predictor of positivity and inclusivity, according to a new piece of research from the Anti-Defamation League's Center for Technology and Society (CTS). Its researchers reviewed four high-profile live streaming events occurring between October 2020 and February 2021, including the famously brilliant AOC playing Among Us, expecting to find examples of harassment and antisemitism. But, thanks to mods, abuse was in short supply.

💡 Products - features and functionality

Roblox, the wildly popular game-building metaverse, will introduce a content rating system and clearer ways to find parental controls following concerns about sexual content that surfaced back in November 2020. It comes just a month after the platform went public with a $45 billion valuation.

"Dollars flow to live audio as moderation problems loom": this Axios headline says it all for me. We are walking straight into a full-blown audio moderation crisis (see Tweets of note below), fuelled by VC money and a collective memory loss. Pretty grim.

💬 Platforms - dominant digital platforms

TikTok has banned the #dontsearchthisup hashtag after it became a way for users to point others towards accounts that were posting porn and violent videos via their profile pictures. According to the BBC, some accounts racked up thousands of followers and were even featured on the app's For You page. Repeat after me: people will always find ways to break the rules.

Facebook's announcement of special measures for Derek Chauvin's trial was interesting, particularly the classification of Chauvin as a public figure and plans to limit the spread of content in "emergency situations". But it raises a bunch of additional questions: How long will this special treatment last? What are the criteria for these measures? And, as the LA Times asks, why not always?

👥 People - those shaping the future of content moderation

Zev Burton is a man on a mission: he wants TikTok to release its full content moderation guidelines. The Georgetown University student and soon-to-be-published author has been posting on the platform for 89 days and counting with the hashtag #releasetheguidelines. In doing so, he has amassed over 120k followers.

His videos are light-hearted (yesterday he wore a crop top) but the message is a serious one and his Change.org petition is fast approaching 10,000 signatures. But will the platform take notice?

PS: If you want to be someone with a big say in the future of the trust and safety space, The Digital Trust & Safety Partnership (DTSP) is looking for a founding executive director. US-based, sadly, but an exciting role with a lot of scope.

๐Ÿฆ Tweets of note

  • "How much longer does Clubhouse get away with its current level of content moderation?" - political science professor Brendan Nyhan rightly baulks at the anti-semitism swirling around everyone's sexiest new audio app (Clubhouse announced that it had shut down the rooms after it was discovered).
  • "The Big Name Substackers Discord (Sidechannel) might be about to discover that itโ€™s very challenging to manage communities at the scale some of them have." - media strategist (and EiM subscriber) Adam Tinworth notes how the poachers, in a way, have become gamekeepers.
  • "why i think most people & platforms are approaching content moderation all wrong" - Tracy Chou, Block Party app founder, has written a great piece for Wired on her vision for tackling online toxicity.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.