
📌 Avoiding heated discussions, Facebook's fightback and calling Collins

The week in content moderation - edition #129

Welcome to Everything in Moderation, your roundup of the best content moderation news and analysis from the last seven days, curated and produced by me, Ben Whitelaw. Thank you to everyone who sent comments and congratulations on the new site.

The theme of speech regulation runs right through today's newsletter, perhaps providing a taste of what the next 18 months will look like. I don't know about you but I'm certainly gearing myself up for the legislative onslaught.

As always, whether you're an online safety insider or an interested onlooker, there's something for you — BW

📜 Policies - emerging speech regulation and legislation

The Oversight Board has concerns about the company that pays its wages following last week's explosive Facebook Files reports published by the Wall Street Journal (EiM #128). According to the Financial Times, Board members have requested a briefing with Facebook execs to clarify the platform's use of Xcheck, the internal system that ensured violating posts by celebrities and notable users were not immediately taken down. The Board will publish a report in October.

If you can look past the dubious headline, this Lawfare piece by Mattias C Ketteman and Torben Klausa does a good job of explaining the significance of two recent German High Court judgements on content moderation procedures. It also explains a key German concept that unsurprisingly didn't come up during my A-Level classes: Drittwirkung, which translates roughly as "third-party effect". Worth your time.

The Texan law that prohibits platforms with more than 50 million US users from carrying out some forms of content moderation is subject to a complaint just a few weeks after it was passed. Two industry bodies argued this week that House Bill 20 violates the US constitution on multiple levels and should be scrapped.

💡 Products - the features and functionality shaping speech

Twitter will introduce a way for users to leave a conversation that they have been tagged in and will add a "heads up" message when users join a "potentially heated discussion". The new safety features were announced by the company's Product Lead for Conversational Safety, Christine Su, in a call with reporters this week and will evolve to allow users to "set their own tones" of conversation. No mention of a rollout schedule or beta, but it will be interesting to see where this goes.

Not a product per se, but a new campaign has called for Facebook to provide better transparency reporting on content moderation decisions as one of its four key demands. The Facebook Logout, organised by the Kairos Fellowship, encourages people to log out of their accounts in protest against "the real-world consequences to Facebook's irresponsibility". Its wishlist also includes an improved appeal process to a neutral decision-maker, a feature that appears in a number of online speech bills and draft legislation.

💬 Platforms - efforts to enforce company guidelines

Twitch might have said that it is working to end the "hate raids" that make black, female and LGBTQ+ users' lives a misery (EiM #123), but that work has yet to filter down to the streamers themselves, according to interviews with the BBC. "The only thing I've seen is they've acknowledged it's a problem and people talking about it. But it doesn't feel like it's done anything to stop it happening," said one.

A lot is riding on the recent case brought by Twitch against two of its own users (EiM #128). Issie Lapowsky at Protocol writes that, if nothing else, it represents a "signalling exercise" which demonstrates that the Amazon-owned platform is ready to dedicate resources to ridding its service of violating users. My read of the week.

The fallout from last week's Facebook Files continued this week as the company responded with an unbecoming blog post alleging that the WSJ's reporting contained "deliberate mischaracterisations" and "confers egregiously false motives" on its leadership. Nick Clegg, VP of Global Affairs, finally came out of hiding too after almost a week of stonewall silence. There's an interesting piece to be written about content moderation and its relationship with the media; hopefully, I can get around to putting something down on paper next week.

Finally in this section, gaming platforms like Roblox and Minecraft continue to be used by right-wing extremists to spread hate, according to a new report. Although the content is not pervasive, researchers found custom maps in which users invited players to "become a racist" and simulate running over black and ethnic minority people. Which is more than enough to be concerned about, in my opinion.

👥 People - folks changing the future of moderation

Few UK members of parliament are as deeply invested in online safety as Damian Collins. The MP for Folkestone and Hythe has been a long-term critic of social media and even created his own fact-checking service and podcast on the topic, Infotagion, last year.

This week, Collins chaired the joint committee on the draft Online Safety Bill, asking probing questions about what makes Wikipedia different from other platforms when it comes to misinformation and highlighting the importance of regulatory audits of platforms. You can read a thread of quoted highlights here.

Taking inspiration from the Facebook Files revelations, he also called for fines for platforms that are found to withhold information from Ofcom, the proposed regulator under the Online Safety Bill.

His comments, however, haven't always given the sense that he knows as much as he thinks he does. Just a few weeks back, Collins called for the end of user anonymity on platforms, a proposal that policymakers and rights experts recognise has all kinds of issues attached. One to watch.

🐦 Tweets of note

  • "Content moderation at the platform level is broken; infrastructure moderation is likely worse" - EFF's Corynne McSherry on the problems coming down the track with macro moderators.
  • "Since networked activism is also coordinated activity, policies in this area have often been very vague. No clear line." - Renee DiResta on Facebook's latest policy change.
  • "838 comments later, 7 people have been blocked, 2 mods quit and the group has been made β€˜private’" - Some light relief from comedian Sofie Hagen, who sees the funny side of a paint row in a DIY enthusiasts group.