What American Sweatshop got wrong (and right)
I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job.
Little-known fact: I studied film at college and began my career as a film and TV editor. This week I dusted off those dormant skills to watch American Sweatshop, a new thriller about a content moderator turned crime solver. I’m left with one question: are we happy with how T&S work shows up in popular culture?
Thanks to everyone who reached out about last week’s T&S Insider on the rising importance of validating real-world identity. It clearly struck a nerve — and not always in ways people felt comfortable with.
I’d love to hear what you think about the film, or about any identity-validation or fraud-prevention work you’re involved in. Here we go! — Alice
How do we measure progress in preventing and detecting online child sexual exploitation and abuse?
Without strong data, it’s hard to know the true scale of the issue, what’s changing, and which interventions are most effective. Alongside the UN General Assembly (UNGA), the Tech Coalition brought together leaders from tech, government, and civil society to tackle this challenge. Explore the insights shared, and see how they are shaping next steps on transparency, measurement, and collective impact.
An insider's take on American Sweatshop
Ben won't stop talking about it (EiM #306). And reviews have been all over the place. But what should we make of American Sweatshop?
Credit where it’s due: the film brings content moderation into the cultural conversation. Yet its execution left me torn — part appreciation for the attempt, part frustration at the missed opportunities and clunky clichés. Here’s my review.
Visuals and dialogue
The movie opens with the camera slowly moving down an eerie tunnel. Suddenly the scene cuts to an office full of computers, and you understand that the tunnel is a metaphor for the internet itself, funnelling content to the moderators in the office. It's dark, you don't know what lurks there, and you don't know what will come out of it.
Later, we learn that an alligator lurks beside the tunnel outside the office. Daisy, the film’s main character and a content moderator, says of it: “It’s an ambush predator. They’re more dangerous when you can’t see them. That’s how they get you.” As director Uta Briesewitz explains, the alligator represents the danger that content moderators learn to live with, based on a real story she heard:
"Workers were talking about how they could look out the window and they would see this alligator that had moved in into a little body of water near the parking lot and nobody would really fully acknowledge the danger of him; everybody would just go back to their work."
It’s an apt metaphor for content on the internet, as well as that foreboding feeling that moderators have while doing their work. But the metaphor stretches a little too far when a newer moderator, Paul, asks Daisy what to do if the alligator attacks. “Fight,” she says — and of course, that’s what her character ends up doing.
What they got right
Several elements in the film rang true to my experience as a moderator and showed that someone had done their homework: the high quotas for video review "tickets", the rigid lunch schedules, the flashbacks to the worst of the worst in the middle of the night.
The inadequate wellness solutions felt particularly authentic: nine minutes of allocated wellness time, an ineffective counsellor, therapeutic apps as band-aids, insufficient healthcare coverage. The characters self-medicating with marijuana and alcohol reflects a common reality when proper mental health support isn't available. When I was a frontline moderator, I didn't have health insurance or a wellness program, and I'd self-medicate by searching for love in all the wrong places. At one point, Daisy says:
"This is one of the very few times where people are more reliable than computers. [...] Computers can't feel grief. Which is technically our only real job requirement, if you think about it. They're paying us for our pain."
Other characters, too, were compelling and had flashes of insight that resonated with me. Paul is the new guy in the office who runs home to hug his dog after seeing animal abuse. Bob is a volatile character who copes through outbursts and office betting pools. He delivers the film's most insightful line about the work during a therapy session:
"The ones who aren't yelling, the ones who are just taking it, that's who you need to worry about."
This is painfully true and made me think about my work in a way that I hadn't before. The moderators who internalise everything are often the ones who are traumatised the most. And in the film, this is Ava, who felt like the most accurate character out of all of them. “It all kind of blends together,” she says about porn being stacked up against murder videos. The more you see, the more you become numb to it all.
Where they missed the mark
The film doesn't get everything right, though.
Most notably, real moderators don't trade war stories the way the film depicts. When veteran T&S workers share what they've seen, it's almost always through black humour, not a serious recounting that may traumatise others. There's a protective bravado in Trust & Safety: "Things don't bother me, I'm fine, everything's fine."
The character Ava came closest to this reality, but even she felt incomplete. If the filmmakers had followed through on Bob's insight — that the ones who seem okay are the ones most in trouble — we would likely have seen her unravel. That never happens.
The film also can't decide if it wants to be a tense character study or a thriller, and that identity crisis undermines its impact. The "twist" ending felt forced and overly clever, just when a quieter, more mundane conclusion might have been more powerful.
An outsider's interpretation of insider trauma
This brings me to the central question: who is this film really for, and does it do what it set out to?
Firstly, I understand the commercial pressure to make a film accessible. As producer Jason Sosnoff noted, the goal was to make a movie for a younger audience and use its themes to make them — and their parents — question their social media use. There’s a scene where Daisy is shocked that her neighbour’s kid has a Reddit account, and the kid is very nonchalant about it. That felt a little too true to life.
It’s also clear that everyone involved in the film wants people to understand how difficult and traumatising the work of content moderation is. Lili Reinhart, the actor who plays Daisy, said:
“It’s genuinely such a horrifying job. And you would think that this is the one job AI could do, but apparently it can’t do it properly. So we have to torture humans by making them watch this content instead. You question why these videos are even on the internet and how humans could possibly make content like this, but they certainly do. So maybe [the film] will help advocate for more mental health support for people who actually do this job, for sure.”
These are worthy intentions, but I'm skeptical about the execution. The film includes cheesy moral statements like Daisy's "freedom of speech, but not freedom of reach" moment that feel more like screenwriting than authentic dialogue. When Ava observes that disturbing content is "closer than you think," it's the kind of heavy-handed messaging that might turn off exactly the young audience they're trying to reach.
And, most frustratingly, though the film hints at some systemic issues — moderator wellness, children being exposed to disturbing content, the tension between the commercialism of online platforms and the horror of violence going viral — no one proposes any systemic solutions at all. Instead, a young woman tackles just one piece of content by taking matters into her own hands in a dramatised and unrealistic way.
There was a much more powerful film hiding underneath the thriller elements. A straight character study that stayed realistic and built to a heartbreaking and authentic conclusion could have been emotionally devastating. Instead of revenge fantasies, what if we'd simply watched someone slowly breaking apart despite all their coping mechanisms? What if the "twist" was just the quiet realisation that the job had fundamentally changed someone, and there was no going back?
That film might have been harder to get funding for, but it would have honoured the real experience of content moderators instead of sensationalising it. As it turns out — and as Ben pointed out on Friday — the depiction feels like an outsider's interpretation of insider trauma.
Will young people see this and question their social media consumption? Maybe. But what we really need right now is authentic storytelling and systemic solutions, not thriller mechanics and a few kids spending less time scrolling online.
You ask, I answer
Send me your questions — or things you need help to think through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*
Get in touch

Also worth reading
Is Trust & Safety Dead, or Just Evolving? (Tech Policy Press)
Why? A reflection on a paper, The Evolution of Trust & Safety (also worth reading!), that asks where T&S should (or could) go next.
Trump Declares Everyone Who Doesn’t Kiss His Ass Is A Terrorist (Techdirt)
Why? “"Extremism on migration, race, and gender” and “hostility towards traditional American views” are now markers of terrorism?!? Supporting comprehensive immigration reform? Terrorist. Advocating for LGBTQ+ rights? Terrorist. Criticizing discriminatory policies? Terrorist."
This is really terrifying, given that all major platforms are obligated to remove terrorist content. I wonder if they will comply with this memorandum or stick to more internationally recognized definitions of terrorism. If you're working in these areas for a platform, I'd love to know how you're thinking about it.
Online search crackdown fuels Russia's LGBTQ+ censorship (Thomson Reuters Foundation)
Why? A peek into the future of the US, perhaps?