
📌 Quarantine is the new bozo

The week in content moderation - edition #28

This is my first update in a few weeks as I’ve been in a field in Somerset, UK, having almost as much fun as this guy. With lots to catch up on, I've included a longer than usual set of links.

I'm also really pleased to welcome new subscribers from The Sun, Condé Nast, King's College London and a kind colleague at the European Journalism Centre.

Here for a podcast recommendation? Listen to Sarah T Roberts, author of new book Behind the Screen and a professor at UCLA, talk about the challenges of moderation at scale on the latest edition of the Vergecast.

Thanks for reading — BW

Bye-bye bozo, hello quarantine

In my job working on comment moderation at The Times, bozoing was one of the few ways to deal with users who intentionally tried to disrupt the conversation without necessarily violating the community guidelines. I never liked using it. But I still used it.

Bozoing a comment meant only the user that posted it could see it. It also ensured that it couldn’t get any replies or likes and thus starved the user of the feedback they desperately sought. Used judiciously, my team and I decided it was a good way to preserve the conversation for others.

I thought about my days bozoing comments this week as Twitter announced that tweets from serving government officials that are deemed to violate its rules will be quarantined; that is, they will be hidden behind an opt-in screen that, on click or tap, reveals the tweet. Quarantined tweets will receive less reach across the platform and won't appear in searches. (At this stage, quarantining only affects verified public officials with more than 100,000 followers.)

It's not a new idea. The approach is very similar to Reddit's quarantine feature (which this week was tested on the The_Donald subreddit) and not a million miles away from the spam filters that have kept our inboxes (mostly) clear for decades. But its simplicity is perhaps its best feature. The Washington Post was very positive:

Shifting toward transparency, with a set of criteria to determine when a tweet is a matter of public interest and how to weigh the implications of removal vs. labelling, is a sensible solution.

As was Wired editor-in-chief Nicholas Thompson:

Any potential downsides (one of which is Twitter's bizarre decision not to tell the user that their tweet has been quarantined) are outweighed by the fact that, finally, moderators have another tool in their armoury against bad actors. As Kat Lo, researcher at UC Irvine and Content Moderation Lead at Meedan, notes:

Until Donald Trump has one of his tweets quarantined, though, this is all theoretical.

Not forgetting

A group of vigilante teenagers are moderating TikTok by posting screenshots of creepy messages to Twitter and Instagram

TikTok Has A Predator Problem. A Network Of Young Women Is Fighting Back.

Young users on the wildly popular video app have created an ad hoc system of screenshot leaks and callout videos meant to out abusers and predators.

A very unrealistic (and hopefully soon to be overturned) ruling from Australia regarding social media moderation

After defamation ruling, it's time Facebook provided better moderation tools

Australia's latest defamation ruling has made Facebook publishing a minefield, but there are strategies to ensure better social media outcomes for everyone.

Facebook #1: An audit of Facebook backed by over 90 civil rights organisations has found that the platform hasn’t gone far enough in banning white nationalism.

Facebook's First 'Civil Rights Audit' Is the First Step in Climbing Everest - VICE

Facebook's Civil Rights Audit, published on Sunday, recommends the platform also ban implicit forms of white nationalism.

Facebook #2: An investigation by ProPublica into hate speech in Border Patrol agents’ private groups suggests they’re not subject to the same moderation oversight as public posts.

Civil Rights Groups Have Been Warning Facebook About Hate Speech In Secret Groups For Years — ProPublica

Facebook says its rules prohibit hate in secret groups, but it won’t discuss how it moderated the offensive Border Patrol posts — if it did anything at all.

Related: Matthew Ingram at Columbia Journalism Review explains why it’s different from conversations happening in email threads.

Facebook and the private group problem - Columbia Journalism Review

It’s taken a while but perhaps there’s hope for the internet now that God has intervened

Church of England announces new 10 commandments for social media

The Church of England will issue a set of social media commandments to combat "cynicism and abuse" online.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.