
📌 'The tonsils of the internet', Twitch's off-platform push and Zhang interview

The week in content moderation - edition #108

Welcome to Everything in Moderation, your weekly newsletter about content moderation by me, Ben Whitelaw.

The avalanche of moderation news these past few weeks means that today's newsletter is another long one. I've made it as digestible as possible but I'm also considering reverting to a twice-weekly update to make what I send more relevant and timely - what do you think? Is once a week enough?

Drop me an email if you have any specific thoughts, or, if you're a regular opener, consider sharing a link on social media or supporting via Ko-Fi.

For now, here's your single dose of moderation news from the last seven days — BW


📜 Policy - company guidelines and speech regulation

A public policy director on Facebook's global elections team decided not to look into fake engagement allegedly orchestrated by the office of the President of Honduras because she didn't "feel super strongly" about it, according to a Guardian interview with the former Facebook data scientist Sophie Zhang. Other attempts to tackle 'co-ordinated inauthentic behaviour' were stymied in over 20 countries, Zhang claims, especially where Facebook saw limited PR benefits. The piece and the wider series give fresh insight into the policy bureaucracy that hinders any well-meaning Facebook initiative and are essential reading.

Mexico looks like it will join other countries governed by centre-right and right-wing parties (EiM #100) by introducing a new bill to regulate social media networks in a way that could severely restrict free speech. I'm starting to lose track.

💡 Products - features and functionality

Social networks have "whole layers of mechanics that enable abuse", so the answer to the current online speech crisis is to remove those mechanics, just as Microsoft did in the early 2000s when it faced an explosion of malware. That's the gist of Benedict Evans' latest essay, published this week, on whether content moderation is a dead end. I'm not sure the city metaphor holds, though; have a read and let me know what you think.

Adam Tinworth, a good friend of mine and an EiM subscriber, uses Ben's post, and a chance search for a Bill Gates mug, to ask "Who is analysing the role of Etsy in misinformation spread?". The answer is probably "no-one".

💬 Platforms - dominant digital platforms

Now for some important news that I missed last week: Twitch will now ban users who harass members of its community, even when that harassment takes place outside the platform. As many folks have pointed out, Patreon enacted a similar policy in the past (see Carl Benjamin, YouTube, 2018), but this is a big step forward in the way users are managed.

Some interestingly timed news out of Facebook this week: the independent but Facebook-funded Oversight Board will now take appeals over content that has been left up following a review by moderators. It comes just after the advocacy group Muslim Advocates brought a case against the platform for misleading Congress about its moderation of anti-Islam hate, and just before the Oversight Board is due to announce whether Donald Trump will be able to return to the platform.

Elsewhere:

👥 People - those shaping the future of content moderation

Mental health burnout; poorly trained, unsympathetic managers; confusing and controlling NDAs: this week's public anonymous statement by an outgoing Accenture content analyst based in Texas is not surprising. But it is shocking, to me at least, that this is still happening at Facebook two-and-a-half years after Selena Scola brought a case against the platform for exactly the same thing.

This quote, sourced and shared by BuzzFeed's Ryan Mac, stood out in particular for its bleakness:

"We're the tonsils of the internet, a constantly bombarded first line of defense against potential trauma to the userbase."

There are several constructive suggestions in the statement for easing the stress on content analysts and making proper use of their skills. Will Facebook listen? Unlikely.

🐦 Tweets of note

  • "The next time a policymaker tells me that tech wizards can surely wizard their way to automated content moderation, I'm going to reply with "Bitche." - Heather Burns takes something positive from the story of the French town caught in Facebook's moderation system.
  • "Well I guess this is one way to do transparency reporting of your content moderation" - David Greene notes Twitter's decision to publicise its banning of conservative 'journalist' James O'Keefe via, er, its Trending topics.
  • "The operations of my whole university (and probably yours) were subject to content regulation by a private company. Today they largely aren't." - Law prof Brian Soucek responds to news that Zoom has handed off content moderation to universities following several recent controversies.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.