The new African mod whistleblowers, KOSA goes quiet and Dunn deal
Hello and welcome to Everything in Moderation's Week in Review, your need-to-know news and analysis about platform policy, content moderation and internet regulation. It's written by me, Ben Whitelaw and supported by members like you.
African moderators whistleblowing about working conditions; the Kids Online Safety Act in the headlines; explosive reporting on Meta’s approach to safety — today’s edition might feel a little like déjà vu from 2022. It’s not, but the mix of political hesitation and patchy corporate responsibility remains all too familiar.
The new episode of Ctrl-Alt-Speech just dropped and it might be one of my favourite titles yet. I won’t spoil it — go have a listen wherever you get your podcasts.
If, like Modulate, you want to get your company in front of T&S professionals in a brand-safe way, head to EiM's brand-new Sponsorship page.
On a personal note, I’m due to become a dad in the coming weeks so EiM may be a little less regular than usual. If you have any nappy-changing advice or sleep hygiene tips (for me, naturally), drop me a line — ben@everythinginmoderation.co.
From sunny London, here’s your online safety and content moderation week in review - BW
ToxMod Protects Customer and Worker Safety
ToxMod helps trust & safety operations teams in delivery services, marketplaces, and beyond to understand voice interactions in real time—flagging harassment, threats, and harm before they escalate.
• Real-Time Monitoring: Analyse live voice chats between users and agents
• Scalable & Secure: Built for high-volume environments with strict privacy controls
• Actionable Insights: Prioritise the most serious incidents with contextual signals and reports found in one simple dashboard
Policies
New and emerging internet policy and online speech regulation
Lawmakers in Washington are poised to pass the most meaningful legislation to date targeting deepfake abuse after the Take It Down Act cleared the House with bipartisan support. The Act criminalises the sharing of non-consensual intimate images, including AI-generated content, and directs platforms to remove such content within 48 hours or face legal action. Civil liberties groups, including EFF, have warned it could lead to misuse of takedown systems and push platforms away from encryption in order to comply.
On the subject of much-debated US safety legislation, The Verge has written a good explainer on the Kids Online Safety Act (KOSA), which has gone quiet in Donald Trump’s first 100 days. Despite support from child safety groups and Big Tech-sceptical Republicans, the bill is yet to be reintroduced to Congress and now faces “long odds of passage”. Critics will be pleased: KOSA’s vague provisions led to concerns that it could harm marginalised kids and be used to remove LGBTQ+ content, as well as anything else the White House doesn’t like (T&S Insider).