User reporting isn't the magic fix some people think it is
Despite their ubiquitous use, user reports don't always drive effective moderation or meaningful change in platform policy. Is there a better approach?
The new(ish) job roles in T&S
As the Trust & Safety industry matures, we're seeing new types of role emerge that didn't exist five years ago. For each of them, a working knowledge of AI is the bare minimum.
Are we getting moderator well-being all wrong?
New research on wellness programs for moderators shows we’re still far from ensuring that the people doing this emotionally demanding work are truly supported.
Ten things no one tells you about working in T&S
Newcomers to the Trust & Safety world often ask me what it's like to work in the industry and what I wish I'd known before I started. So here are my ten hard-won lessons for the next generation of online safety professionals.
Social media use is changing, but why, and what does it mean for T&S?
Fewer users doesn’t mean fewer risks — bad actors thrive when harm is concentrated among smaller, more active audiences. Platforms must move beyond user reports to stay ahead.
Fair moderation is hard, but fair, scalable moderation is harder
Throughout my career, I’ve struggled with the problem of how to enforce policies in a fair, accurate, and scalable way. A new research paper reminds us just how difficult that is.
A reader asks: What should be on my ‘red line’ list?
Most T&S professionals—whether they admit it or not—have a line they won’t cross for their company. But when you're in the middle of a major, public failure, it can be hard to know what to do. Here’s my take on what to consider before quitting.
What I heard at the T&S Summit in London
My first time attending a big T&S event outside the US was a lot of fun. But I left without as deep an understanding of British or European attitudes to online safety as I'd have liked.