Welcome to Everything in Moderation, your weekly newsletter about content moderation on the web, picked and packed by me, Ben Whitelaw.
Today, I’m excited to launch ‘Meet the Moderators’, a series of in-depth interviews designed to give content moderators a voice in the future of online speech. It’s an idea I’ve had for over a year — but I need the support of you, EiM subscribers, to make it happen. Read on to find out more.
I promised last week that I’d round up the content moderation incidents and implications from the US election — let me know if there’s anything I’ve missed by replying to this email — BW
📜 Policies - company guidelines and speech regulation
Joe Biden’s election has led to a lot of speculation about what will happen next in terms of moderation and speech regulation in the US.
The most interesting tidbit to have emerged is the online abuse task force, reported by Fortune, that President Biden will launch to better understand the connection between digital threats and real-life harm. Staffed by government officials, legal experts and tech company employees, it will be expected to come up with recommendations to address an issue that was a recurring story during the Trump administration.
Elsewhere, Wired predicts that dis- and misinformation will seem less urgent because it ‘isn’t so tightly tied to the sitting US president' while Biden's view on Section 230 — the law that provides web services with immunity if a user posts something illegal — is neatly summed up in this LA Times piece as "replace the rule of near-categorical immunity with something more fact-sensitive.”
Meanwhile, Joan Donovan of Harvard Kennedy’s Shorenstein Center plugs the need for greater transparency in this good piece for MIT Technology Review. I’m considering getting one of her quotes — 'Fair content moderation decisions are key to public accountability’ — as a tattoo.
💡 Products - features and functionality
It’s all a bit vague but the New York Times reported last week that Facebook and Twitter were planning to add more friction to their products to avoid false information spreading about the election. No specifics were mentioned (just the euphemistic ‘an additional click or two’) and no timelines for when this might happen (or if it would conclude when Trump eventually concedes). Nonetheless, 'friction tech' is something I’ve talked about here before (#76 and #79) and will be an important way platforms can ensure the right content goes up and stays up.
💬 Platforms - dominant digital platforms
The dominant narrative since election day, apart from the labelling of dozens of Trump tweets, has been how YouTube failed to stop voter fraud claims from spreading. Videos racked up millions of views, earned thousands of dollars in revenue for their channels and were only pulled down when journalists alerted the platform to what was happening. Just a normal day then. Remarkably, YouTube told Bloomberg the site was ‘generally working as intended’. Casey Newton’s Platformer has the full details.
Facebook has created a new 60-day probation status for Groups found to be sharing misinformation. It comes after 'Stop the Steal’, a group of 325,000 mainly Republican users that was calling for violence in the hours after election day, was pulled down due to its delegitimisation of the election. Facebook can put Groups on probation at any time, the decision is not open to appeal and the status forces admins to approve every post. Draconian measures, to say the least.
👥 People - those shaping the future of content moderation
Law.com has a good profile of Cris Armenta, a San Diego lawyer who is working with 15 conservative YouTubers who claim the platform violated their contractual and constitutional rights when it removed their content as part of an initiative to clear the site of QAnon videos and other conspiracy theories. Definitely one to watch.
🐦 Tweets of note
- "Alex Jones does the same shit” - Carlos Maza, who clashed with conservative YouTuber Steven Crowder back in 2019, notes how Steve Bannon was able to spout mistruths this week by appearing on other people's channels.
- "Sadly social media regulations in Africa are not fighting hate speech & disinformation" - Freelance journalist Samira Salwani on the growing problem of social media regulation in Africa.
- “Pseudonymity is really popular, and ‘dark’ visual aesthetics dominate depression-related hashtags” - Ysabel Gerrard gives an overview of her new paper (and a piece for Wired) about depression on Instagram.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.