Welcome to Everything in Moderation, your weekly newsletter curated and produced by me, Ben Whitelaw.
A warm welcome to folks from EU Disinfo Lab, Carnegie Endowment for International Peace and others doing important work in the online speech space. The newsletter hits your inbox every Friday with all the latest news and must-read analysis so keep an eye out for it.
Onto this week's edition... — BW
📜 Policies - company guidelines and speech regulation
A story that has only been picked up in the last 24 hours or so: Twitter deleted a tweet from Nigerian President Muhammadu Buhari calling for those who fought in the Civil War to treat "misbehaving" young people "in the language they understand". The 12-hour posting suspension angered government ministers, who have accused the platform of "double standards". Shades, here, of what is going on between Twitter and the Indian government right now.
Platformer's Casey Newton pulls together several threads that I've highlighted in EiM over the last few weeks to suggest that censorship of minority groups — see Gaza and India — has become the new crisis for social networks. There's a little irony in journalists like him calling for fewer takedowns, he notes, but:
I still see a mostly coherent story: one of platforms that helped enable the rise of authoritarians, only to see those authoritarians use their newfound power to crack down on dissent.
Along the same lines, Harvard law lecturer evelyn douek warns in a new piece for Wired that "more content moderation isn’t always better moderation, and there are trade-offs at every step". Covid-19 labels, the de-platforming of Holocaust deniers, the downranking of media outlets and imminent regulation across the globe mean "the internet is sitting at a crossroads, and it’s worth being thoughtful about the path we choose for it." It certainly feels notable to me.
💡 Products - features and functionality
The burden on volunteer moderators looks set to increase as platforms and products (rightly or wrongly) pass ownership of online spaces down to users. EiM subscriber and cofounder of Crowdstack Rosemary O'Neill recently wrote a piece for Hackernoon about the move to "more manageable, human-friendly independent communities."
I've been keeping an eye out for case studies of volunteer-led spaces and came across this BBC profile of The Motherload, a 107,000-member Facebook group founded in 2015. The article details the support it provided for mums during the pandemic as well as the influx of posts — 800 a day, plus several hundred inbox messages — that its 14 volunteer moderators had to deal with. Founder Kate Dyson also notes that she spends 10 hours a day (!) managing the group for little financial reward. That will have to change.
💬 Platforms - dominant digital platforms
- Almost 200 employees signed an open letter calling for Facebook to launch an internal task force to “investigate and address potential biases” in both its human and automated content moderation systems, according to The Financial Times.
- Facebook-owned Instagram announced changes to the way it ranks and displays original vs reshared content after activists claimed the platform was suppressing certain perspectives and topics.
Elsewhere, in news that may not be all that surprising, TikTok still has a boogaloo problem, according to a report by education platform Shout Out UK. You'll remember Facebook banned 200+ boogaloo accounts in July last year (EiM #71).
👥 People - those shaping the future of content moderation
Sophie Zhang has done as much as anyone in recent years to shine a light on how Facebook deals with speech issues outside the United States. The data scientist, you might remember, wrote an excoriating 6,600-word memo last year noting how the platform was slow to react to coordinated harassment in Azerbaijan and Honduras, and then followed that up with more details in a must-read Guardian series in April (recapped in EiM #108).
Sophie spoke to podcast host and longtime EiM friend Patrick O'Keefe in a recent edition of the Community Signal podcast about what she found on the platform and how Facebook's actions are driven by pressure from civil society, NGOs and media. A podcast definitely worth 40 minutes of your time.
🐦 Tweets of note
- "Yesterday however, 'lumpfish' made a mistake" - this wild real-life story from gal-dem's Moya Lothian-Mclean about a forum user give a fascinating insight into how people exhibit troll-like behaviours.
- "Content moderation is hard, Stanford Law School edition" - Daphne Keller notes how a satirical email sent by a student back in January sent the US college into a spin this week.
- "Matt Gaetz's comments last night will be taken by some as encouragement to shoot community, moderation, trust, and safety pros" - the aforementioned Patrick O'Keefe, like many, is right to be worried about the US congressman's recent comments.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.