3 min read

'Moderation in infrastructure', NYT's Groups fiasco and talking governance

The week in content moderation - edition #104

Welcome to Everything in Moderation, your weekly newsletter about content moderation written by me, Ben Whitelaw.

A big welcome to new subscribers from Workwell, Snap, Vistage and Wired. Pleased to have you all on the list.

I'm sending this week's newsletter a day early because I'll be firmly out of range tomorrow. But there's plenty in here for you to get stuck into.

Here's what you need to know — BW


📜 Policies - company guidelines and speech regulation

Facebook this week launched its long-overdue human rights policy, which it will apply to its "apps and products, policies, programming, and overall approach". As part of its commitment, it will report critical issues to its Board of Directors (I'd have hoped that happened anyway), publish an annual report and create a fund to support human rights defenders.

Ben Thompson, of Stratechery newsletter fame, has interviewed four CEOs in charge of 'infrastructure' services — Stripe, Microsoft Azure, Google Cloud and Cloudflare — about how their companies are thinking about their takedown responsibilities. It's a good read with a consensus forming around the idea that these 'middle-stack companies' are "better off preparing for geographically-distinct policies in the long run, even as they deliver on their commitment to predictability and process in the meantime". Sounds like a sensible model for all moderation services, wherever you are in the stack.

Remember Ireland's Online Safety and Media Regulation Bill, designed to protect children online and launched in draft form in February? Well, the Irish Human Rights and Equality Commission (IHREC) has said it "lacks legal certainty". Which sounds like the worst thing you could say about a piece of legislation, in all honesty.

💡 Products - features and functionality

If you've worked in an audience-facing role in a newsroom over the past decade (as I have), you will have created a Facebook Group or three. It was, unfortunately, always the best (and quickest) way to set up a 'community' around a topic and to validate assumptions about your readers. Not that I ever actually wanted to set one up.

So when I hear that the New York Times' Cooking group (membership: 77,000) has turned into a cesspit and that its editors and moderators are pulling out, I think once again: when will news product people build (or bring in) the tools their teams need to host great conversations?

💬 Platforms - dominant digital platforms

Facebook's oversight board is "dangerous" and "will force regression in Facebook’s already lax moderation policies" according to a second-year Harvard law student in one of the best critiques of Facebook's oversight board I've read. Jeremy Lewin also notes that scientists and economists are conspicuously absent because "Facebook wants the benefit of speech-protective legal doctrines, not a quantification of the externalities of harmful speech."

This slipped through the cracks last week: YouTube announced that it has removed over 30,000 videos that made misleading or false claims about COVID-19 vaccines over the last six months. Over 800,000 videos have been removed since February 2020. Is it me or does that not feel like a lot?

👥 People - those shaping the future of content moderation

Not one person this week but the group of people who have organised the First Annual Conference of The Platform Governance Research Network, a free online event taking place from March 24-26 about, you guessed it, platform governance.

25 event organisers from 15 collaborating institutions, from Berlin to Bengaluru, have put together 12 sessions on everything from under-studied platforms to governing toxicity. Check the full programme here (or in threaded form) and consider signing up.

The most exciting part, for me, is the time set aside on day 3 to discuss creating a platform governance research network (which apparently doesn't exist right now). Better connections between scholars and civil society professionals can only be a good thing.

🐦 Tweets of note

  • "Thousands of young people are really, really worried about this" - Writer and freelancer research Jay Owens notes the thousands of #SaveAnonymity tweets at the start of this week (I hope to come back to this next week).
  • "Important piece that highlights why online abuse is not just a problem for social media companies to solve - it's for all of us, including employers." - Internews CEO Jodie Ginsberg reacts to a Guardian story about the horrendous targeting of sports journalist on social media.
  • "Journalists are now starting to refer to their incessant censorship demands as "content moderation challenges." - Glenn Greenwald racking up the retweets by turning against his own.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.