'No free pass' for platforms, pro-AI party vibes in India and Spiegel’s pushback
Hello and welcome to Everything in Moderation's Week in Review, your need-to-know news and analysis about platform policy, content moderation and internet regulation. It's written by me, Ben Whitelaw and supported by members like you.
This week is all about the politics of platform power. From AI summits that sidestep safety to CEOs dabbling in digital sovereignty, there’s plenty of positioning going on.
Last week's Ctrl-Alt-Speech inspired me to write a longer piece on how the rise of creators may shift how internet governance works — plus what platforms and regulators need to do to adjust. If you like badly drawn pyramid frameworks and/or irate TikTokers, you'll want to give this a read.
Welcome to new subscribers from Google, Electronic Arts, EFF, Persona, Duck Duck Go, Sony, the Oversight Board and a steady trickle of other well-intentioned internet shapers. I hope you find EiM useful, challenging and — occasionally — mildly entertaining. If you do, you know what to do.
Here's your seventh Week in Review of 2026. Enjoy — BW
What’s the real cost of “good enough” moderation?
Online platforms face growing scrutiny as harmful content continues to cause real-world harm. Corporate Complacency vs. Human Cost exposes how the gap between policy and practice — revealed through Checkstep's analysis of EU DSA transparency data — creates serious ethical, legal and reputational risks.
In the whitepaper, we uncover the hidden crisis behind inconsistent reporting, manual moderation burnout, and misleading safety metrics. We also explore how companies must align their content moderation policies with action to protect users, staff, and brand trust.
Policies
New and emerging internet policy and online speech regulation
UK prime minister Keir Starmer means business. Or at least that's what he'd have you think with his double platform regulation announcement this week:
- On Monday, he announced that all AI chatbot providers would abide by illegal content duties in the Online Safety Act as part of a plan to keep children safer. “No platform gets a free pass” was the motto, even if that doesn't reflect how companies are categorised under the Act. Hey, they're just minor details.
- On Thursday, the Labour leader backed that up by announcing that the UK will require tech firms to remove abusive images within 48 hours or face having their services blocked in the UK. It indicates — to me, at least — a move to a more prescriptive, Australian model.
No brainer: This is a politically low-risk move from the UK government. Tech regulation has broad cross-party support, child safety polls well and “tough on platforms” is an easy message to land. Coming just weeks after Starmer's favourability rating sank to its lowest score to date, I'm not surprised at all.
I've not spent as much time as I'd like reading what's coming out of the Los Angeles addictive design trial, in which Meta and Google are defendants. But here are a few pieces I'll be diving into over the weekend:
- The FT's Hannah Murphy reports on Mark Zuckerberg's claim that “utility” and “value” — not engagement — are the priority for users over the longer term.
- Casey Newton over on Platformer called the case a "novel and potent challenge to Section 230" that could force "significant changes to social app design".
- Sky News offers the strange mental image of six lawyers unveiling comically large social media posts for Zuckerberg to review while on the stand.
Hit reply and share anything I've missed.
