
📌 ‘Ugly content’ and uglier policies

The week in content moderation - edition #56

Hello everyone and congratulations on making it to the end of the week. There continues to be an avalanche of COVID-19-related moderation news, so I’ve kept the section that I started last week.

Remember, my calendar is open to any EiM subscribers who are at a loss during isolation and want to say hi/have a virtual lunch/discuss my love of kekehs.

Thanks for reading — BW

PS: If any of your colleagues, friends or family need a content moderation newsletter to help them through self-isolation, please do forward this on to them.


😱 Who are you calling ugly?

Let’s imagine for a second that Feroza Aziz looked altogether different from how she does.

Say, like so many 17-year-olds, that she had acne or (like this former teenager) was slightly overweight. Pretend she had a lazy eye or a congenital birth defect.

If she had such a trait or condition, we might not know the extent of TikTok’s unethical content moderation policies. Let me explain.

Back in November 2019, Aziz exposed TikTok’s treatment of Muslims by posting a video of herself curling her eyelashes while telling viewers about China’s treatment of over 1 million Uighur Muslims. Her video got 1.5m views before being taken down for an hour and then reinstated. TikTok apologised and blamed the takedown on a ‘human moderation error’, but coverage of the incident was widespread and scathing.

(Instagram screengrab)

Fast forward to this week, when The Intercept revealed that TikTok’s moderation guidelines encouraged keeping uploads by (wait for it) unattractive, poor, or otherwise undesirable users out of the ‘For You’ section because such videos were believed to “decrease the short-term new user retention rate.”

As the article outlines:

“Abnormal body shape,” “ugly facial looks,” dwarfism, and “obvious beer belly,” “too many wrinkles,” “eye disorders,” and many other “low quality” traits are all enough to keep uploads out of the algorithmic fire hose.

TikTok responded by saying that the policy is no longer in place, but it’s hard to believe something like this ever existed at all. The fact that the ByteDance-owned company claimed it was an ‘early blunt attempt to prevent bullying’ frustrates me even more. It’s an ugly policy that should never have existed.

The thing that really scares me is this: if Feroza Aziz did not look the way she does, if she were not an attractive young woman in the TikTok-approved sense, then her video exposing TikTok’s moderation policy wouldn’t have got as many views as it did in the first place. In simpler terms: TikTok’s unethical, appearance-led moderation policy ended up exposing itself.

Free speech is a very important principle, but what we’re talking about here is free appearance: the idea that you should not be censored for the way that you look.

I shudder to think where this ends up.

+ Bonus: TikTok announced that it will no longer use China-based moderators to review content outside the country (Engadget via WSJ)

🏥 Public health platforms? (week 2)

Here are this week’s coronavirus stories with a content moderation angle.

☎️ Europe on the line

The newly formed Council of Europe Expert Committee on Freedom of Expression and Digital Technologies (EiM #51) has, like everything, fallen foul of the coronavirus. The committee kicked off its work on a guidance note on content moderation this week via BlueJeans. Good luck to them.

🃏 The humble tummler

Annalee Newitz, a contributing writer at NYT Opinion, has written what can only be described as an ode to the content moderator. In it, she explains how the best moderators are ‘the paramedics, the law enforcers, the teachers and the curators’ of our web experience, likening them to a tummler, a kind of entertainer found at Catskills resorts in 1950s and 60s America. Nice analogy.

Not forgetting...

Livemint has a good piece about the moderation companies popping up in India. Some staff have to view 6,000 videos a day and have less than five seconds to make a decision on each piece of content.

Inside the secretive world of India’s social media content moderators

From Covid-19 misinformation to images of violence, social media moderators have a lot to handle. How do they do it?

More and more mods are coming forward to join the legal case against Facebook (EiM #40).


A rare mention of Patreon, which is facing a backlash from creators for banning a prominent artist.

Patreon may be cracking down on anime porn, artist warns

Waero, a popular anime porn artist on Patreon, warned that his hentai artwork has been removed under a new, heavy-handed policy.


Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.