Welcome to Everything in Moderation, your weekly newsletter about content moderation by me, Ben Whitelaw.
A special welcome to new subscribers from Hanover Communications, Automattic, the London School of Economics and elsewhere.
A notable development this week, particularly for the media watchers among you: content moderation is now its own beat. Cristiano is certainly worth following, and I look forward to including his work here in EiM.
Onto this week's news round-up — BW
📜 Policies - company guidelines and speech regulation
Justice Clarence Thomas was not a name I knew until this week (I'm not very clued up on the US Supreme Court, you see) but his comments on the recent Trump-blocking-Twitter-critics case — in which he argued that social networks could be viewed as public utilities and thus "regulated in this manner" — mean I, and many others, know him now. Changes to Section 230 are unlikely at this stage, in part because no other justices joined Thomas in his statement. That didn't stop right-minded people in my timeline expressing mixed reactions at the prospect, however.
Elsewhere, Russia's state communications regulator announced it would continue to throttle Twitter's speed until May 15 for failing to remove content it believes encourages illegal activity. In March, Roskomnadzor had requested that Twitter remove 3,168 posts, dating back to 2017, relating to drug abuse and child abuse. The splinternet is real, people.
💡 Products - features and functionality
Pinterest this week launched a handful of new features that it hopes will make it a 'positive and inspiring place'. For creators, it has added the ability to filter and highlight comments (like Instagram in July 2020) and a new Creator Code will act as a kindness oath (although there's no mention of how either will change moderation). For users, popups imploring the use of civil language will also roll out in the coming weeks.
In a straight copy of Twitter's reply controls (launched August last year), Facebook now allows users to manage who can comment on published posts. As someone who has had long discussions with senior editorial lawyers about the liability of comments on court stories posted to Facebook, I'm very glad about this development.
💬 Platforms - dominant digital platforms
How well are platforms moderating content on their sites? How do they know if their efforts are working? Well, we now know how YouTube thinks about this challenge. This week, it published a blog post about "violative view rate" — the number of views of problematic content per 10,000 views — and how it tries to get that number as close to zero as possible. Of course, it's all part of YouTube's effort to be seen to be tackling the problem of problematic content, but that didn't stop Guy Rosen, Facebook's VP of Integrity, being very kind about it.
Meanwhile, there were two other transparency reports published this week that are worth noting:
- Discord's growth in the second half of last year caused a large rise in user reports and server deletions, according to its latest transparency report. Interestingly for those working in trust and safety operations, it has broken out three new categories — Child Sexual Abuse Material (CSAM), Extremist or Violent Content, and Self-Harm Concerns — to be able to respond to them more quickly. Sensible move.
- Pornhub released its first report following the overhaul of its moderation capabilities after the New York Times reported how it monetised rape and revenge porn (EiM #91). Not much we didn't know before.
👥 People - those shaping the future of content moderation
Jillian C. York really should've appeared in this slot before now. The Electronic Frontier Foundation's director for international freedom of expression is one of the leading voices on the impact of moderation on free speech and also co-convenes, with Dia Kayyali, a very useful mailing list for advocacy and campaigns related to online speech.
I've included her work and tweets in EiM countless times, and now she has published a new book, Silicon Values, looking at online speech through the lens of her own research experience and her life outside the US. Jillian talks to Mike Masnick on Techdirt's recent podcast about how their views have evolved over the years. I can't help thinking it's like listening to Pele and Maradona talking about football. A must-listen.
🐦 Tweets of note
- "Life is just one big content moderation debate" - Nu Wexler on the row between Swedish retailer H&M and Chinese authorities in Xinjiang over, wait for it, a map.
- "Suddenly have no idea what it is or how to explain it" - Dr Ysabel Gerrard on what happens when you're asked to condense all of your work into a book chapter.
- " you can/should eat the kiwi like an apple (with skin on)" - Sociologist and scholar Francesca Tripodi on the wild things you can learn in the comments section.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.