6 min read

The new AI governance plan, YouTube's policy walkback and inside American Sweatshop

The week in content moderation - edition #306

Hello and welcome to Everything in Moderation's Week in Review, your need-to-know news and analysis about platform policy, content moderation and internet regulation. It's written by me, Ben Whitelaw and supported by members like you.

A big thanks to everyone who showed up for the Marked As Urgent x EiM London meetup last night. I had a great time and — based on feedback I received — so did everyone else. In fact, I had such a fun evening that I forgot to take a photo. You'll just have to believe me.

We’d love to host more events for people working in and interested in tech policy and internet regulation, both in London and elsewhere. If you’re interested in working together or sponsoring an event, email me at ben@everythinginmoderation.co. You can follow Marked As Urgent for future events and updates.

There's the usual mix of regulatory updates, platform u-turns, and real-world policy consequences in today's newsletter. Let's get into it — BW


In partnership with Safer by Thorn, purpose-built CSAM and CSE detection solutions for Trust and Safety.

Does your platform have messaging, search, or generative prompt functionality? Thorn has developed a resource containing 37,000+ child sexual abuse material (CSAM) terms and phrases in multiple languages to use in your child safety mitigations.

The resource can be used:

  • To kickstart the training of machine learning models
  • To block CSAM prompts
  • To block harmful user searches
  • To assess the scope of this issue on your platform 

Apply today to get access to our free CSAM keyword hub.

APPLY FOR ACCESS

Policies

New and emerging internet policy and online speech regulation

This week’s 80th UN General Assembly in New York has drawn together a cast of global tech leaders, diplomats, and platform execs, all keen to shape what comes next. It might be dry and the conversations often wonky, but it's the only forum where all UN members can raise diplomatic issues — issues that increasingly intersect with internet governance and regulation.

Here’s what caught my eye:

In the UK, civil society organisations have warned that Ofcom is coming off as too timid in enforcing the UK’s Online Safety Act, suggesting that major platforms are far from “quaking in their boots.” In an interesting shift, 5Rights Foundation and the Molly Rose Foundation also emphasised the importance of privacy in age verification tech which, as we know, isn’t always straightforward.

Get access to the rest of this edition of EiM and 200+ others by becoming a paying member