
How T&S safety jobs got technical

The rapid rise of AI-focused T&S roles is hard to ignore. In the second part of the Safe for Work? series, we look at insights and hard data that show why ML and AI fluency is becoming core to the job — and what you can do to keep up.

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job.

It was great to see the interest in last week’s edition on the rise of compliance jobs. If you haven’t read the comments on it yet, please do — lots of smart people with first-hand experience in compliance shared their thoughts. Many shared my view that compliance roles benefit from T&S professionals, who understand platform risks and operational realities better than traditional lawyers or compliance experts.

This week, we cover the technical side of T&S and how fast it’s changing. Become an EiM member to get early access to this job series and full access to all T&S Insider editions forever!

I’d love to know your thoughts – hit reply to send an email, comment underneath if you're an EiM member or drop me a note on LinkedIn. Here we go! — Alice


in partnership with Resolver Trust & Safety

Online safety isn't always about speed. Sometimes, the safest thing you can do is slow people down. 

In Resolver's latest blog from our “20 Years in Online Safety” series, Dr Paula Bradbury, Ollie Clements, and Dr Sören Henrich from Manchester Metropolitan University explore how behavioural design and friction can stop harm before it spreads — by giving users just enough space to pause, think, and choose differently.

Because in a world obsessed with frictionless design, protection often begins with a moment of hesitation.

READ MORE HERE

From moderation to ML

Last week, I wrote about compliance roles as the stable future of T&S careers. But the other big shift is the transformation of T&S into a technical discipline. If compliance is one pillar holding up the future of this field, AI and technical expertise is the other.

What’s driving this change is that safety has become productised. Companies — and increasingly regulators — want systems that are scalable, explainable and proactive, not just reactive. That’s a big departure from the “bums in seats” model that long defined content moderation.

Get access to the rest of this edition of EiM and 200+ others by becoming a paying member