
A moderation 'art project', Reddit's pledge and Casey goes solo

The week in content moderation - edition #81

Welcome to Everything in Moderation, your weekly newsletter about content moderation on the web, pulled together by me, Ben Whitelaw.

If you're a recent subscriber, you may not know that I have a regularly updated Twitter list of 120+ moderation/online speech experts that I use to help me write the newsletter each week. It's public and you're welcome to follow it too.

Let’s get onto this week’s update — BW


📜 Policies - company guidelines and speech regulation

In last week’s newsletter (EiM #80) I noted that Facebook's Oversight Board would launch before the US election and speculated about the reasons for that timing. Well, it now seems that part of the rush was that Big Blue didn’t want a group of critics and activists beating them to it.

On Wednesday, the group of 25 academics, journalists, politicians and former tech employees launched their own board under the could-be-catchier moniker of the ‘Real Facebook Oversight Board’. NBC News has all the gory details (including some beef with the philanthropic arm of eBay founder Pierre Omidyar’s empire) but the group's aim is to scrutinise everything from bot activity to political advertising to algorithmic amplification via weekly public Zoom calls.

Some called it an ‘art project’ and, from a purely content moderation perspective, it feels like a less focused version of the Transatlantic High-Level Working Group on Content Moderation Online and Freedom of Expression (I wrote about their findings in July). But my take is that, a month from an election, more scrutiny cannot be a bad thing.

💡 Products - features and functionality

It’s easy to herald a new social network as an improvement on what’s already out there, especially when it comes to content moderation. However, Telepath — a new, soft-launched platform — has all the makings of a big step forward: a commitment to in-house moderation, strongly worded user guidelines and a host of ex-Quora employees on staff. I’m waiting for my invite so I’ll let you know more when I’ve had a look around.

One community I have spent some time in of late is sports subscription outlet The Athletic. It recently updated its app to put a greater focus on ‘snippets from experts’: essentially tweet-length updates of 30-70 words from its high-profile, well-respected sports journalists. These mini-reports act as the starting point for discussions, behave from a product perspective like Instagram/Facebook/LinkedIn Stories and feel like mini rooms in a sports club.

The Athletic is pretty vague about how it moderates user contributions but, because topics are localised to teams, most contributions (so far and in my own experience) have been civil. If you're interested in more, I wrote a thread this week about how these Reaction circles work and how they fit into the site's business model.

💬 Platforms - dominant digital platforms

Content moderation on any online platform is undeniably tied to its business model, which usually means attention-driving algorithms and the collection of large swathes of data for content targeting. I, like many, have beaten up on Facebook et al for prioritising scale over safety, and takes of that kind (including this strong one from Gilad Edelman at Wired this week) are not hard to find.

Credit must be given, then, to networks like Reddit when they appear to be thinking about their commercial offering differently. The social news site announced this week that it was opening a new sales office in London, which gave Chief Operating Officer Jen Wong the chance to emphasise the various ways the oft-maligned platform differs from others: ‘We moderate and filter content’; it’s about ‘sharing the responsibility of safety’; ‘We hold (users) responsible’ were just some of the quotes given to The Drum.

We should be sceptical about this positioning. After all, it wasn’t long ago that 800 subreddits sent an open letter to CEO Steve Huffman complaining about the company's hands-off approach to moderation (EiM #69). But maybe that letter has led Reddit to turn a corner.

Talking of sometimes toxic communities, Clubhouse, the audio app currently in invite-only beta, has once again proven itself unable to moderate content on its platform. This week, a discussion with 300+ folks on the topic of anti-Semitism in black communities allegedly descended into a succession of anti-Semitic tropes and off-topic remarks that the moderator was unable to stop. Yesterday, Clubhouse published a blog post clarifying its stance on abuse and listing how it would improve. Another case of 'don't hold your breath'.

👥 People - those shaping the future of content moderation

As far as shaping content moderation goes, few have done more than Casey Newton in recent years. The Verge reporter has broken several big stories about outsourced moderation and trauma, and I’ve included his stories (and links to his weekly newsletter) here countless times over the last two years.

As of next week, he’ll be going out on his own on Substack with Platformer, which aims to cover 'social networks and their relationship with the world’. Like most Substack newsletters, Platformer will give you one free article a week in return for your email address; any more and you’ll need to pay $10 a month (if you're interested, I’m still mulling it over). He signed off from The Verge with a typically Casey scoop on Accenture mods being forced to go back to work for Facebook. If there’s much more of that, I’ll have no choice but to subscribe.

🐦 Tweets of note

  • 'Really smart meditation on why effective content moderation is chimerical - and doomed to be thus’ — USC Annenberg fellow and journalist Marc Ambinder recommending cybersecurity fellow Adam Elkus’ new take on Dick Costolo’s comments this week.
  • 'Let's recap the whole situation in this thread' — Turkish academic Yaman Akdeniz reviews the process that led to Facebook, Twitter et al having to appoint in-country reps in Turkey ahead of this week's government deadline.
  • 'Our results indicate that negative impacts of content moderation are uniquely severe for sex working activists and organizers’ — Em from research collective Hacking Hustling on its report on the shadow banning and no-platforming of sex workers that I previewed back in July (EiM #73). Worth a look.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.