KOSA returns, abortion speech under threat, and OpenAI’s safety trick
Hello and welcome to Everything in Moderation's Week in Review, your need-to-know news and analysis about platform policy, content moderation and internet regulation. It's written by me, Ben Whitelaw and supported by members like you.
One of the challenges that Week in Review tries to solve is the constant moving of goalposts in the T&S space; things move so quickly that keeping up is a 9-5 job in itself.
This week is no exception with several stories emerging in the last 24 hours. If you find EiM helpful to your work or just interesting, remember that you can become a member for less than the price of a Jim Jordan stamped addressed envelope heading to Brussels.
A big welcome to news subscribers from IFTAS, Der Standard, ActiveFence, The Bureau of Investigative Journalism, The Alan Turing Institute, Viking (ok, I'll take a free cruise), Université Libre de Bruxelles, FullFact and others.
Here's everything in moderation this week — BW
Does your platform have messaging, search, or generative prompt functionality? Thorn has developed a resource containing 37,000+ child sexual abuse material (CSAM) terms and phrases in multiple languages for use in your child safety mitigations.
The resource can be used:
- To kickstart the training of machine learning models
- To block CSAM prompts
- To block harmful user searches
- To assess the scope of this issue on your platform
Apply today to get access to our free CSAM keyword hub.
Policies
New and emerging internet policy and online speech regulation
The European Commission yesterday found TikTok in breach of the Digital Services Act for failing to provide a public repository of its advertisements. The finding is a small part of a wider investigation announced in February 2024 (EiM #235).
Henna Virkkunen, the Commission's lead for tech sovereignty, security and democracy, said that the video platform's ad library implementation prevented "the full inspection of the risks brought about by its advertising and targeting systems" and that "citizens have a right to know who is behind the messages they see". TikTok may appeal but could be fined up to 6% of its global revenue.
Just weeks after I noted the eerie quiet surrounding it (EiM #291), the Kids Online Safety Act (KOSA) has returned, nearly unchanged, in a new bipartisan push. As The Verge reports, the bill now contains language stating that "KOSA would not censor, limit or remove content from the internet", a direct response to speech rights concerns raised by US civil society groups.
Various outlets noted that Apple is now a supporter of the bill, although the Computer & Communications Industry Association, a trade group of which Apple is a member, said in a statement that there are outstanding "serious First Amendment concerns".
Also in US speech legislation news: a revealing piece in User Mag this week outlines how US states are experimenting with indirect speech regulation as a way to further limit abortion access. With Texans still finding ways to terminate their pregnancies, legislators are targeting not only individuals but also the online platforms that carry or amplify abortion-related content. And the incentives are wild.