
📌 Chinese rules, applied globally

The week in content moderation - edition #36

Hello everyone. It's Friday (great) and I'm arriving late in your inboxes (not so great).

Each week, I try to flag unmissable stuff up top (often podcasts, occasionally videos), and this week I want to share this reading list compiled by the Social Media Collective (a network of moderation rockstar academics). It’s extensive, almost overwhelming, but hugely useful. Bookmark it now.

Thanks for reading — BW


The ugly side of TikTok

Establishing a sensible, coherent approach to content moderation on the web was always going to be tough. But it’s an altogether different proposition when you factor in TikTok.

This hit home after reading recent scoops from the Guardian’s Alex Hern. If you didn't catch them, they reveal that the Chinese-owned social network routinely downranks and deletes videos featuring political content, such as mentions of Tiananmen Square, and pro-LGBT content, even in countries where being gay is not illegal (eg Turkey). It's essentially applying Chinese standards globally.

TikTok insisted in a statement that it retired these guidelines in May (perhaps on the advice of its newly-in-post Director of Global Public Policy), but it is anyone’s guess what rules are being applied to content on the platform in any given country.

The company, headquartered in Beijing, also does not have a strong record of transparency or of responding well to criticism (see the case of YouTuber PaymoneyWubby). That’s unlikely to change anytime soon.

And so, just as one social media giant starts to open itself up to scrutiny, another one emerges with politicised policies and opaque processes but with no news media to hold it to account or force it to change.

When it comes to TikTok, it's hard to be hopeful.

It’s all semantics

In a Twitter thread, Andreessen Horowitz’s Benedict Evans suggests good moderation relies on good definitions. And definitions are hard.

Not forgetting...

I came close to writing about this story this week but sometimes there’s only so much Facebook one can stand. Even so, Big Blue's former head of content standards railing against the recent politicians/newsworthiness ruling is worth a read.

The Guy Who Wrote Facebook's Content Rules Says Its Politician Hate Speech Exemption Is 'Cowardice' | WIRED

Dave Willner helped put together Facebook's content standards over a decade ago. He's not happy with the company's exceptions for politicians.

An in-depth read via Vice on how Facebook doubled down on intercepting terrorist content on its platform after the Christchurch attack.

Facebook Went to War Against White Supremacist Terror After Christchurch. Will It Work? - VICE

Facebook’s 350-person counterterrorism team is retraining its tools for far-right meme culture.

Further to last week’s EiM, a female Twitch user lost her stall at TwitchCon because some dudes maliciously flagged it for sexually explicit activity.

On Twitch, women who stream say their biggest obstacle is harassment

At TwitchCon, a dozen women interviewed said they had all experienced harassment on the streaming platform Twitch.

Folks, you really know what's what: Twitch's CEO believes content moderation is 'the issue of our time’. (Has anyone got his email addy? I think he’d like EiM.)

Twitch's Emmett Shear on Streaming Talent Wars, Moderation Plans | Hollywood Reporter

Twitch CEO Emmett Shear talks about his plans to grow the platform's user base, noting to THR, "Surprisingly, not everybody knows about Twitch yet."

This is kinda fun. Cyber lawyer Daphne Keller had her husband flag a picture that she posted on Facebook of a naked Rodin statue. You can guess what happened.

That time my husband reported me to the Facebook police: a case study / Boing Boing

Stanford's Daphne Keller brings us a delightful tale of Facebook's inability to moderate content at scale, which is as much of a tale…

Reddit’s COO Jen Wong talks to Yahoo about the company’s 'layered approach’ to moderation

How Reddit avoids content moderation woes of Facebook, Twitter and YouTube

Hate speech, conspiracy theories, and bad content have proliferated on Reddit since it was founded 14 years ago. It has a massive reach of 330 million monthly active users yet manages to stay out of the intense spotlight shining on its publicly traded competitors.


Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.