Hello everyone and welcome to the 50th edition of Everything in Moderation (🎉).
I had no idea what to expect when I started this little experiment back in 2018 but I’m chuffed to have almost 200 of you subscribed and reading each week (as well as a few supporters). Thanks for allowing me into your inbox.
Before you dive into this week's edition, I have a question: which people doing or thinking about content moderation would you most like to hear more from? I’m thinking about doing a Q&A series but would love your thoughts on where to start and what to ask them. Drop me an email.
Thanks for reading — BW
I’m all for radical ideas when it comes to content moderation (EiM #32). So I was glad to see EFF's Jillian C. York ask this big, bold question this week.
The replies are worth sifting through but here are a few I liked:
- Moderators are chosen by and accountable to the entire group (@elplatt)
- Public participation in drafting guidelines (@leilzahra)
- Moderators as real employees with real benefits (@Adamconner)
- Users retain the power of what they see (@Josephseering)
Check out the thread and wade in with your ideas.
PS: Do check out Jillian’s ongoing interview series on free speech. There's a great quote from this week’s interviewee, Evan Greer (deputy director of non-profit advocacy group Fight for the Future):
"I don’t get jumping to “let’s do more of the thing they’re doing a bad job at” when we haven’t even fixed the fact that they’re doing a bad job of it."
Hard to argue with that.
Giant and growing
Every year, Ben Evans, Andreessen Horowitz analyst and consultant, does a big macro tech trends presentation. This week, he published the 2020 edition, titled ‘Standing on the shoulders of giants’, which he first gave at Davos in January.
It contained a few slides on content moderation in which he made the following points:
- There are many publishing forms (advertising, groups, 1:1 messaging), which makes the singular notion of "taking something down" difficult, and removal doesn't necessarily solve the problem you set out to solve.
- Democracies have different approaches to free speech (the difference between neighbours Spain and France is very stark), so good luck creating rules for everyone.
Although small, the inclusion is notable: moderation hasn’t appeared in any of Ben’s 200+ slide decks going back to 2013. Which means something, even if I’m not yet sure if it’s good or bad. Anyway, watch the video of Ben's Q&A.
I meant to include this in last week’s EiM on Facebook's Oversight Board. A good read from Harvard Business Review.
The company is pitching self-governance as a way to fix problems like hate speech and disinformation.
Moderating comments for a short period of time can have a negative effect on how mods think about their own organisation, according to a new study from the University of Texas School of Journalism.
Twitter will show warning labels on images designed to mislead users, it announced this week.
Under the new measures, users who retweet or like doctored and manipulated content will be shown a warning before doing so.
A publishing engagement platform I didn’t know existed has bought a commenting system I’ve never heard of, but maybe I don't have my finger on the pulse like I used to.
Insticator, a startup helping publishers add elements like polls, quizzes and suggested story widgets to their content, has made its first acquisition.
This Bloomberg piece outlines how Republican Senator Lindsey Graham plans to force social platforms to remove child exploitation content by making Section 230 protection contingent upon it.
The proposed legislation goes after tech platforms in multiple ways.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.