
📌 A driving test but for the web

The week in content moderation - edition #19

Well, hello there. What a mad couple of weeks it’s been. First to Brussels to attempt to (kind of) sort out Brexit (unsuccessful) and then to Birmingham to bring together some cool community-led news organisations (more successful).

A ton of stories related to moderation and online speech have been published since I was last in touch, so I’ve included an especially comprehensive ‘Not forgetting...’ section this week.

Thanks for reading — BW

A driving test for the web?

In the 1890s, when motor cars first arrived in the UK, there was no driving test and no such thing as a licence. Anyone over the age of 17 could get behind the wheel.

By the 1930s, more than 2 million motor vehicles were on the roads. Accidents were common: there were over 7,000 deaths and 230,000 injuries in 1934 alone.

That led to the introduction of the Highway Code, which urged road users to be careful and considerate towards each other. Fatalities, unsurprisingly, decreased.

Driving tests followed soon after, in 1935. It wasn’t a popular political decision (motorhead MPs insisted people were just ‘getting used to conditions’) but it reduced accidents further, as did other measures, such as 30mph limits in urban areas and ‘L plates’ for new drivers.

Later on, in 1996, a separate theory test replaced the questions examiners asked from the Highway Code during the practical test. Six years later, another section, hazard perception, was added in response to the changing nature of the roads.

What does this have to do with content moderation? Well, it feels like we’re in the pre-driving licence stage of the web. Right now, anyone can take the proverbial wheel and publish or access whatever they want, regardless of their capability or proficiency (although some restrictions are starting to be put in place).

As with cars in the 1930s, that is causing countless accidents: mass trolling, child grooming, the rise of nationalism, cyberbullying and screentime addiction, to name a few. Web users are being knocked down by a small proportion of people using the web in ways that are unsafe and hard to stop.

Obviously, legislation and regulation are on their way in various guises on both sides of the Atlantic which, if the Road Traffic Act is anything to go by, should help. And yet no one is talking about testing online proficiency, or about the fact that governments could raise the collective standard of web users through certification and training. That feels like a missed opportunity.

Back in 2000, two academics explored the idea of a mandatory computer proficiency exam, to find out whether people could perform tasks like file management, email and browsing, and how their actual skills compared with their own assessment of how good they were at these things.

It didn’t catch on but something similar wouldn’t go amiss nowadays.

Moderation as a differentiator

You may know about the big battle going on in the gaming industry between two competitors, Valve (makers of game store Steam) and Epic Games (creators of the Epic Games Store). In essence, the world’s dominant PC game marketplace (Valve) is being challenged by a new kid on the block.

Nothing too noteworthy from our point of view. Except that the CEO of Epic Games, Tim Sweeney, is making moderation of the games that appear on his store a differentiator. Talking to The Verge, he said ‘we’ll turn down crappy games’ and ‘won’t accept pornographic or shock content of any kind’. A very different approach from Steam, which recently prevaricated for days over banning a game about rape.

Could this approach work for other businesses? Would news media attract more subscribers by cultivating a healthy environment for debate rather than letting commenters run amok? I expect they may try if Sweeney and Epic pull it off.

Not forgetting...

What is there left to say about the Christchurch massacre last week? It feels wrong to make it at all about moderation, but it’s been one of the strands of discussion since it happened. This NYT editorial best sums up the hopelessness I felt about the attack and how it played out online

The Attack That Broke the Net’s Safety Net - The New York Times

A killer determined to make terrorism go viral beat a system designed to keep the worst of the web out of sight.

Casey Newton spoke to Facebook’s former security chief Alex Stamos about content moderation for the SXSW edition of The Verge's podcast

Facebook’s former chief security officer Alex Stamos on protecting content moderators - The Verge

Former Facebook chief security officer Alex Stamos and Casey Newton discuss protecting content moderators and the difficult issues that plague Facebook and democracy

Russia will cut people off from the web at some point in April in order to test a new self-contained internet, following the passing of laws that ban what they call ‘fake news’ but which is actually nothing of the sort

Russia moves to control online speech with new law banning "fake news" and "disrespect"

Two new Russian laws ban the intentional spread of "fake news" and language disrespecting the government, but critics say it violates free speech.

The folks at Jigsaw have released a Chrome extension on top of their Perspective API to allow people to turn down the toxicity of comments they see. (I’ve installed it and will report back)

Google's "Tune" Chrome Extension Lets You Control Toxic Comments • Google • WeRSM - We are Social Media

Google's Jigsaw developed Tune, a new experimental Chrome Extension to help internet users control the toxic comments they see on popular platforms.

So much of the reporting about content moderation is about the challenges rather than the solutions that I took heart from this piece by PhD candidate Bertie Vidgen of the University of Oxford about what can be done

Four ways social media platforms could stop the spread of hateful content in aftermath of terror attacks

What can social media platforms do after terrorist attacks?

A couple of weeks old now, but Ryan Broderick’s piece on why comment moderators are vitally important is this newsletter summed up in one article

The Comment Moderator Is The Most Important Job In The World Right Now

Online platforms continue to absorb more and more of our society, while the companies in charge neglect the human beings tasked with cleaning up the mess that’s left behind.

The Cleaners, the documentary I’ve mentioned in previous editions of EiM, has been released on iPlayer in the UK and has had good reviews. Definitely watch it

The Internet’s Dirtiest Secrets review – the human toll of detoxifying social media

A masterful edition of Storyville exposed the awful plight of the moderators tasked with purging tech platforms of violent and sexually abusive images

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.