
📌 How to spin YouTube's conspiracy problem

The week in content moderation - edition #60

Welcome to another Everything in Moderation — I hope you’re all channelling the spirit of Captain Tom Moore in these tough times.

A special welcome to new subscribers who found me via the excellent Slice Frames newsletter, including smart folks from The Bristol Cable, Malaysiakini and The School of Slow Media.

Drop me a note to say hi if you haven’t already (calls are also welcome). And, if you know other folks that might like EiM, you can send them here.

This week, I take a look at YouTube vs conspiracy theories.

Thanks for reading and stay safe — BW

🤔 Borderline content, straight from the CEO

Susan Wojcicki doesn’t shy away from being interviewed, nor does she mind talking about content moderation.

The CEO of YouTube sat down in front of a large crowd at Code 2019 just after the Carlos Maza/Steven Crowder dispute and has been interviewed by The Guardian, CBS News and even Alfie Deyes in the few months since then.

In her latest one-on-one, this time with NBC News’ Dylan Byers, she addresses the video-sharing site’s ongoing struggle to reduce borderline content (you can listen here as a podcast).

One thing struck me as I read the transcript: Wojcicki uses a paper by academics at Berkeley to claim that YouTube has made good progress tackling conspiracy theories on the platform.

She explains:

"..there was a study that just came out in the last couple of days by the Berkeley researchers. And, you know, they looked at some of the progress that we've made about borderline content. And, like, you know, they said, like, we're doing a much better job with recommendations. They of course also have areas that they work we could do better. And we will continue to do better."

The study in question is 'A longitudinal analysis of YouTube’s promotion of conspiracy videos', authored by Marc Faddoul, Guillaume Chaslot, and Hany Farid, who together analysed more than 8 million YouTube recommendations from 1,000 of the most popular news and informational channels over 15 months.

The paper's conclusions weren’t exactly positive, though. As the New York Times explains:

“the Berkeley researchers found that just after YouTube announced that success, its recommendations of conspiracy theories jumped back up and then fluctuated over the next several months. The data also showed that other falsehoods continued to flourish in YouTube’s recommendations, like claims that aliens created the pyramids, that the government is hiding secret technologies and that climate change is a lie.”

‘A much better job with recommendations’? Hardly. But then again, there is increasing evidence of misinformation being a top-down phenomenon, so perhaps we shouldn’t be surprised by Susan's slip.

📣 Talking of transparency...

If you've been reading EiM for a while, you might remember me mentioning the Santa Clara Principles, a set of guidelines designed to help social media platforms be more transparent about content they take down.

Well, the Electronic Frontier Foundation (EFF) is seeking to expand those standards and has announced a call for feedback. Submit your thoughts here by 30 June 2020.

🏨 Public health platforms? (week 6)

I can recommend each of these very good analyses on COVID-19:

⏳ Not forgetting...

Stack Overflow has released a new version of its unfriendliness text-checking bot — UR-V2 — to great effect. Read this article and the original post by two of its developers.

Stack Overflow banishes belligerent blather with bespoke bot – but will it work? • The Register

Conflict monitoring organisations like Airwars and the Syrian Archive are significantly affected by page takedowns, according to this Time article.

These Tech Companies Managed to Eradicate ISIS Content. But They're Also Erasing Crucial Evidence of War Crimes

Facebook and YouTube designed algorithms to detect extremist content. But they're hurting citizen journalists and human rights groups, too

I don’t use Twitch but I’m quietly addicted to news about the banning and reintroduction of its big-name users. This week: Anisa Jomha, streamer and girlfriend of iDubbbz.

iDubbbz’s girlfriend banned on Twitch following controversy | Dexerto.com

iDubbbz's girlfriend Anisa Jomha has been banned on Twitch following the YouTuber's recent drama, which sparked controversy online.

I haven’t got round to watching it all but NYC Media Lab this week hosted an interesting-looking online panel about Section 230.

Section 230 Revisited: Web Freedom vs Accountability

In a time of declining trust and confidence in large technology firms, there is growing bipartisan agreement that it’s time for legislative change.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.