A reader asks: What should be on my ‘red line’ list?
Most T&S professionals — whether they admit it or not — have a line they won't cross for their company. But when you're in the middle of a major, public failure, it can be hard to know what to do. Here's my take on what to consider before quitting.
Brussels to go after X, Meta to face Kenyan courts and Substack's subtle shift
The week in content moderation - edition #289
Is it prosocial design’s time to shine?
With some platforms retreating from a reactive, enforcement-driven approach to Trust & Safety, there’s a stronger case than ever to lean into proactive and prosocial practices that prevent toxicity from happening in the first place. Here's where to start.
US raises OSA concerns, OpenAI's 'permissive approach' and moderation on stage
The week in content moderation - edition #288
What I heard at the T&S Summit in London
My first time attending a big T&S event outside of the US was a lot of fun. But I left without as deep an understanding of British and European attitudes to online safety as I'd have liked.
New bill to kill Section 230, defederation report and Mohan clarifies Covid-19 rules
The week in content moderation - edition #287
Making the best of a T&S incident (part two)
A Trust & Safety crisis doesn't just risk reputational damage; it grabs leadership's attention too. That's your moment to make the case for investment in your team. Here's how I would turn a T&S incident into a strategic win.
The OSA is finally here, my notes on Community Notes and Terrorgram explained
The week in content moderation - edition #286
Are T&S professionals part of the problem?
A new report argues that, without industry-wide standards or codes of practice, T&S professionals are vulnerable to corporate pressures and destined to remain reactive to their companies' conflicting priorities. The answer? Greater independence.
Disputed decisions, an LLM for spotting foreign ops and the book Meta didn't want you to read
The week in content moderation - edition #285