6 min read

It's complicated: moderating nudity online

Everyone should have the right to express themselves joyfully and openly, and that includes the expression of sexuality. But moderating nudity is more complicated than it looks

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job. This week, I'm thinking about:

  • The complicated ecosystem of online nudity policy and product features.
  • Resources to learn more about AI Red Teaming.

Get in touch if you'd like your questions answered or just want to share your feedback. Here we go! — Alice


It's complicated: moderating nudity online

Why this matters: Everyone should have the right to express themselves joyfully and openly, and that includes the expression of sexuality, but this is more complicated than it looks.

Our struggles with defining pornography and obscenity predate the internet.

In 1964, U.S. Supreme Court Justice Potter Stewart famously stated "I know it when I see it" when asked to describe his test for obscenity. But, it’s not as easy as Justice Stewart claimed, and over the years, tech policy experts have battled with describing these concepts in a fair and unbiased way. 

Rules, enforced evenly?

In order to make a platform’s policy operational, the policy must be able to be enforced fairly. This means the policy must describe specifically what poses, states of undress and amount of skin are allowed or not allowed in an unbiased way. No matter what someone’s gender, sexuality, race, size, or age, these rules must be enforced evenly. 

In practice, this is almost impossible. Our perception of what is obscene is a cultural and social construct. We are used to seeing pop stars — usually thin, young, white women — in skimpy clothing posing in provocative ways. But because this is a culturally normal display of sexuality, it’s not seen as obscene. Compare this to a drag queen (or a fat person, or a gay man) posing in a similar way with a similar outfit, and it often is seen as obscene.

Kissing is the same: in India, it is seen as an extremely sexual act, but that’s not a big deal in the United States. And according to almost every platform policy, men can freely show their nipples, but women can’t.

That's before we even consider nonbinary people. Unfortunately, the tech world has completely left out those who are agender, pangender, genderqueer, genderfluid, or gender-nonconforming and pretty much acts like they don’t exist when it comes to sexual content policy around nipples and nudity. Trans people also aren’t taken into account properly, as I wrote in this whitepaper that I co-authored at Grindr.

Where does this policy pinch come from? Well, many platforms are very conservative about removing sexual content because they don’t want to be removed from the Apple App Store or Google Play Store. The app stores themselves have vague policies on prohibited sexual content: Apple refer to it as “activities intended to stimulate erotic rather than aesthetic or emotional feelings” while Google call it “content intended to be sexually gratifying.”

I came up against this when I had to justify Grindr's sexual content policies to a very senior policy lead at one of the App Stores. They were concerned with the photos in Grindr’s grid that showed people in their underwear but with the head cropped out. This person believed that these images focused attention on the genital region because the face had been removed, and therefore thought that they were overly sexual and against App Store policy.

They thought the images would be fine if they included the face (and similar photos can be seen on much more mainstream apps like Instagram). I had to point out that on Grindr, many users cropped their faces out due to privacy concerns, because it’s not safe to be publicly out as LGBTQ+ in many places. It would be a double-standard to censor these images only for LGBTQ+ people, especially given that Grindr is an app for adults only, unlike Instagram. Luckily, I was able to change their mind.

Categorising nudity

While there are clear challenges, some recent product changes suggest that how we moderate nudity is evolving.

This week, Amazon Rekognition announced that a third-tier taxonomy in their machine learning labels for images. Historically, its “Explicit Nudity” label included female nipples as well as male and female genitalia and exposed buttocks, which isn't granular enough for many platforms. For example, Grindr does allow buttocks in some circumstances, and on a platform with a large percentage of nonbinary and gender nonconforming people, automated detection for “female nipples'' often gets the user’s gender wrong.

However, with the launch of third-tier labelling, a platform like Grindr could now potentially automatically reject all genitalia, but manually review buttocks and nipples, instead of having to manually review everything in the “Explicit Nudity” category. 

Also announced this week, Instagram will start asking users if they’re sure before sending a nude photo in DMs. This is something that Apple already does with their Communication Safety features. (Interestingly, Meta has no plans for similar controls on Whatsapp or Messenger at this time). This feature is a protection against financial sextortion, which is an increasing concern and sadly has been a factor in the suicides of at least 20 teen boys in the US, according to the FBI.

In a world where realistic AI generated nudes can now be created of anyone, it's clear this is a threat that is evolving and that platforms need to respond to. Women and girls are disproportionately victims of deepfake nude creation, with middle schoolers being arrested for creating AI nudes of their classmates. We also saw the first ever criminal conviction for cyberflashing this year so these product features are increasingly important.

Personally, I think that blanket laws against cyberflashing can be problematic, and that the context of the space or platform needs to be taken into account. Regardless, giving users personal control over how and when they see nude images can only be a good thing. It’s also critical to educate people on the dangers of sharing nudes online, and tell them what to do if they are the victim of sextortion or image-based abuse.

What's next for the war on sex

Putting platform feature controls into the hands of users is certainly the way forward, and I would love to see even more granularity available on platforms, especially for public feeds, allowing consenting adults more freedom of expression. Moderating sexual content with a heavy hand, like many platforms do, disproportionately affects marginalised communities, including people who are LGBTQ+, fat, Black, and sex workers. 

That said, in the United States, a rabid minority is blatantly attacking gender, sexuality, and the first amendment, claiming that pornography should be outlawed, that any mention of LGBTQ+ love is obscene, and that children should be protected from trans people. As a member of the LGBTQ+ community myself, and the proud stepmom of a trans person, I find this especially abhorrent. Nearly 30% of Gen Z adults in the US identify as LGBTQ+ and so these attacks on the queer community are not insignificant. It’s not a coincidence that the far-right is conflating the separate concepts of LGBTQ+ identity and pornography, and going against both at once. 

Unfortunately, our work as T&S professionals is being impacted by the culture war playing out across our societies, which make it even more important for platforms to take principled stances on freedom of expression for all their users.

Our aim should be that everyone has the right to express themselves joyfully and openly, and that includes the expression of sexuality. As practitioners, we can find a balance between freedom of expression and appropriate controls to protect children online. And as individuals, we can stand up against censorship, transphobia, and fear-mongering, and speak out against laws that seek to further perpetuate these bigoted ideas.

You ask, I answer

Send me your questions — or things you need help to think through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*

Get in touch

T&S Insider reader writes...

Dear Alice,
I'm particularly interested in the work you did for Open AI in Adversarial Testing - is there any chance you could direct me to how to learn more about gaining that kind of skill?

Ultimately, the best way to learn is to do it yourself. One interesting exercise is to play around with older LLM models and then compare to newer ones, and see how they react differently. Here are some resources and articles that you may find helpful:

If you want to learn more about Generative AI generally, especially the social, legal, and ethical risks, I also recommend the book Introduction to Generative AI (full disclosure: the authors are friends, and I was a test reader before it was published).


Also worth reading

Watch Out, AI is getting more persuasive (Ctrl-Alt-Speech Podcast)
Why? I was a guest host of this podcast with Ben, founder of this newsletter. We discussed what happened in T&S news this week.

Elections Playbook for Startups (Responsible Innovation Labs)
Why? A guide to why and how startups should prepare for upcoming election cycles responsibly.

Elections Coalition Playbook (Anchor Change)
Why? A guide to setting up election coalitions effectively.

Responsible AI Governance Maturity Model Hackathon Report (All Tech is Human)
Why? Hackathon participants worked on improving the Responsible AI Governance Maturity Model, a framework for evaluating the social responsibility of AI governance in organisations that develop or use AI systems, based on the NIST AI Risk Management Framework.