3 min read

📌 What makes Pinterest good at content mod?

The week in content moderation - edition #33

Hello folks. I’ve had a busy couple of weeks (including busting my shoulder) so haven’t been able to put together a newsletter the last fortnight.

If you check out nothing else in this week’s edition, watch this 1-minute video of three of Facebook’s senior executives talking about how they email Mark and Sheryl to get their thoughts on policy decisions (such as the Nancy Pelosi video) to ‘make sure they’re ok with it’. I’m flabbergasted.

Thanks for reading — BW

Why Pinterest gets moderation right

Back in 2016, it wasn’t uncommon for articles to appear in the press about the platform’s role in promoting negative body image among teens and spreading health misinformation.

Since then, it’s upped its game. As well as hiding false results and blocking URLs of fake sites, Pinterest recently took the step of showing health information from the World Health Organisation (WHO) and other organisations when users search for ‘measles’ and other related terms. A sensible and well-received policy idea.

On top of that, behind the scenes, Pinterest has been building better tools to help its moderators create community and counter misinformation. I didn’t know much about PinQueue until a developer friend (thanks Nick!) stumbled upon this technical write-up of their latest release. And it’s impressive.

There are some nice technical elements — the fact that it’s been developed so that different teams (eg ads, policy, spam) can all use it is a nice touch — but the overall sense from the post is that moderation matters. A great deal of thought has gone into PinQueue (and the policies it supports) since the team began developing it (ironically) back in 2016.

What’s the effect of a piece of software like PinQueue? Well, you’d need to ask the folks that work there and use it every day. But we do know (thanks to a piece Techdirt carried last year by Adelin Cai from their policy team) that Pinterest has 11½ people (?) moderating content for more than 200m users. Now, even if that’s grown a little in the last year, it’s not a patch on the 10,000+ team that a certain other platform has. And no outsourcing either. Basically, moderation nirvana.

Luckily, other pieces of software are coming onto the market to fill the need for intuitive moderation tools. Both they and Pinterest’s commitment to building PinQueue are a reminder that we’re only as good as the tools we’re given.

Feeling like a tool

A great thread by Kat Lo, content moderation lead at Meedan, on some nifty new Twitch quick commands.

Not forgetting…

The Coral Project, the commenting platform bought in January by Vox Media, has a new name and a bunch of new clients. Congrats to Andrew and the team.

Vox Media Renames, Revamps Coral Comment Moderation Platform

“We believe community engagement is first and foremost a question of strategy before technology,” stated Andrew Losowsky, head of Coral.

In Ireland, the case brought by five moderators against Facebook for personal injuries caused by disturbing content has reached its next stage: it will now go to the High Court in Dublin (date TBC).

Moderators to take Facebook to court for ‘psychological trauma’

One described witnessing Isis executions, child exploitation and animal torture

Reddit is warning users if the subreddit they are joining has a high post removal rate. Interesting experiment.

Reddit tests warning users about some communities

Hatebase is a database of offensive terms scraped from the web and designed to help organisations combat hate speech. Their latest client? TikTok.

Hatebase catalogues the world’s hate speech in real time so you don’t have to – TechCrunch

Hatebase is a company that has made understanding hate speech its primary mission, and it provides that understanding as a service — an increasingly valuable one.

Facebook somewhat cynically used World Suicide Prevention Day to announce that self-harm images will become harder to find in search and will no longer be recommended to users.

Tightening Our Policies and Expanding Resources to Prevent Suicide and Self-Harm | Facebook Newsroom

A series of small tests on two Facebook pages in Mali by the NGO RNW Media found that moderation increased the number of comments and that fewer comments were phatic (one-word) replies.

Using moderation to build communities - RNW Media

The editor of the Arizona Cardinals SB Nation site ‘Revenge of the Birds’ went in heavy on fans using articles to vent. 'The boards are an opinion, not a right'. Amen.

A reminder from the editor - Revenge of the Birds

After one game it is pretty safe to say that the fans that want to see Kyler Murray fail over something he had no control over continue to be...

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.