6 min read

It's time to take user education seriously

If most platforms are pretty good at safety, they're terrible at educating users about it. That needs to change.

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job.

This week, a special edition about user education co-written with my T&S bff, Rachel Kowert, who you may know from her Psychgeist newsletter or YouTube channel of the same name. We're co-hosting a workshop at the Trust & Safety Summit in London soon if you want to see us in person. I'll also be speaking on a panel on the last day of the summit. If you're there, come and say hi!

As always, get in touch if you have thoughts or questions about today's edition. Here we go! — Alice


SPONSORED BY THE TRUST & SAFETY SUMMIT. Europe's Premier Event for Trust & Safety Leaders

The Trust & Safety Summit is just one week away, bringing the global T&S community together at Novotel London West, and this is your last opportunity to secure a pass before registration closes.

As the premier event for Trust & Safety leaders, this year's program dives into the modernisation of T&S: from AI adoption and evolving team structures to political pressures, regulatory enforcement, and the new realities shaping online safety. You'll hear from leaders at Uber, OpenAI, 2K, Match Group, and many more.

Use code EiM20 for 20% off Diamond–Bronze passes (vendor passes excluded). Secure your seat now, before it’s too late.

REGISTER ONLINE HERE

Safety without user education is pointless

By Alice Hunsberger and Dr. Rachel Kowert

Ask any T&S professional working at a platform to describe their programme and they'll talk about policy (what the rules are) and tooling/operations (how the rules are enforced). These two pillars of the T&S Triforce are most legible to leadership because they're quantifiable; for example, violations actioned, appeals resolved, and response times logged. This also means they get the budget, headcount, and attention.

T&S user education rarely gets the same treatment. It shows up in the form of a help centre nobody reads or a "digital literacy initiative" that lives perpetually on the backlog. It's not that T&S leaders think education is unimportant; it's that it's genuinely hard to measure and easy to deprioritise — the classic "important but not urgent" project.

But the research actually shows that community guidelines that users understand can be a stronger predictor of healthier community behaviour than moderation efforts alone. Education isn’t less important than policy or tooling, so why is it given less time and resources?

"Digital literacy" has become meaningless

"Digital literacy" has become one of those phrases that sounds important enough that nobody pushes back on it, but vague enough that nobody actually does anything about it either. People mention it in grant proposals or policy briefs and it signals that they care about users, but it often doesn't include enough detail about what digital literacy actually means in practice.

The problem is that digital literacy looks completely different depending on who you're trying to reach, and in what context. A parent asking how much screen time is too much needs something different than a teenager navigating their first experience with online harassment, which is different again from a community moderator trying to figure out how to handle a mental health crisis in their Discord server. 

There are, of course, examples of digital literacy done well; for example Discord’s Safety Library, Tinder’s Safety Center, and Google’s Be Internet Awesome program. But these are the exception rather than the rule.

We need to get specific about which users, what knowledge, how we're delivering it, and who is responsible. Most importantly, about what it looks like when done well and what, specifically, success looks like. If we don’t, "digital literacy" is just a way of feeling like we're solving the problem without actually solving it. Which brings us to the next point...

Nobody owns it

The next story might sound familiar to anyone who’s worked at a mid- to large-sized platform.

I (Alice) previously worked at a platform that had hardly any proactive education around the community guidelines. Because I recognised the importance of users understanding our guidelines, I wanted to create a campaign that explained our rules in a relatable way before users had a bad experience. But it required resources from product and engineering, and those teams more or less said "sounds like a comms or policy issue, can't you do it outside of product?" Communications had their own priorities and wouldn’t spearhead it, and told me to come back when product would prioritise it. Legal wanted sign-off but not ownership. And so it sat in the middle of the org chart, belonging to everyone and no one, and never getting prioritised.

This kind of problem isn't unique to user education. Plenty of cross-functional work falls into this trap. But education is particularly vulnerable because it's hard to measure, doesn't have a natural home in most T&S org structures, and rarely has a dedicated budget line.

The result is that user education at most platforms is piecemeal, inconsistent, and almost entirely reactive. For example, a help centre article written in response to a spike in tickets, or an app tooltip added after a major policy change. In my example, I ended up assigning my T&S team to write help centre articles and link to them in our customer support emails and signatures to try to get the word out proactively, but it never got prioritised in the product in the way I would have hoped.

Proactive, intentional education programs are the exception, not the rule. There are very few programmes out there, and they’re generally scattered and rarely kept up to date. But there’s no reason why they can’t work when given the right resources.

The business case is stronger than you think

Education stays on the backlog because it's hard to measure, and it's often seen as important but rarely urgent. But this doesn't mean the return on investment (ROI) isn't there.

Think about what poor user education actually costs. Users who don't understand community guidelines generate more reports, more appeals, and more support tickets, all of which require human review and cost real budget to handle. Users who have a bad experience because they didn't understand the rules, or didn't know a tool existed to help them, don't just leave, they tell other people about their experience. If you ban a user for a minor rule they didn't know existed, you lose MAU and revenue because of it. Finally, platforms that react to safety crises with emergency help centre updates and rushed product changes spend far more than they would have on a proactive education programme in the first place.

There's also the trust dimension. Users who feel informed and supported are more likely to use reporting tools correctly, more likely to stay on the platform, and more likely to give it the benefit of the doubt when something goes wrong. This is even more the case when community guidelines reiterate company values that the user shares. This isn't a soft metric — it ties directly to retention, brand reputation, and long-term sustainability.

The business case for user education is there. It just rarely gets made, because it's hard. 

The internal and external problems are the same skill

Whether you're trying to get a CFO to fund a digital literacy programme, convince a product team to prioritise user education, or actually reach a user before they have a bad experience, you're doing the same thing. You're taking something complex and making someone who has never had to think about it before actually care about it. That's a communication problem, and it's one that most T&S professionals were never trained for.

The same information has to look completely different depending on who's receiving it. A business case for leadership leads with risk and ROI. A pitch to a cross-functional partner leads with shared ownership and reduced burden. A user-facing explainer leads with reassurance, not policy language. Same underlying belief, completely different frame. Getting good at recognising which frame you need, and actually executing it, is one of the most underleveraged skills in T&S.

We’ll be exploring exactly this at our upcoming workshop at the Trust & Safety Summit in London, but you don't need to be in the room to start practising. Next time you're making the case for something education-related, try writing the one-sentence version for three different audiences before you write anything else. It's a small exercise, but it has a way of clarifying what you actually believe, and what you're actually asking for.

You ask, I answer

Send me your questions — or things you need help to think through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*

Get in touch

Also worth reading

Do All Jobs Suck Right Now? (Culture Study)
Why? Pretty much... yes, yes they do.

Human-in-the-Loop Is Not Enough: Rethinking AI Safety for Autonomous Systems (Camille Stewart Gloster)
Why? "Much of today’s conversation focuses on how to supervise AI systems. A more durable approach focuses on how authority is delegated, constrained, and enforced across technical and organizational systems."

Tracking Efforts To Restrict Or Ban Teens from Social Media Across the Globe (Tech Policy Press)
Why? A useful resource to bookmark for tracking how many countries have internet age restriction legislation in progress.

Building AI-ready Trust & Safety Teams (Musubi, by me)
Why? I'm thinking about developing this into a longer course with exercises and resources (for both individuals and leaders)... would love to know if you think this might be helpful!