I had a few hours between swimming and eating fish and chips while in Cornwall to put together this link-heavy round-up of this week’s content moderation stories.
Also, I’m really chuffed to have a longtime Twitter pal, Roberto, as a new subscriber to EiM, plus folks from Travelfish and Medium.
As ever, thanks for reading — BW
PS Apologies for the mistakes that crept into last week’s EiM. Lesson: don’t rely on patchy Great Western Railway wifi to get a newsletter out on time.
Revenue goals ≠ moderation goals
The news that YouTube stars get an easy ride when accused of breaking the site’s community guidelines (reported by the Washington Post this week) won’t have surprised anyone. Nor will the fact that ‘higher-ups’ in the organisation regularly quashed recommendations to strip revenue-generating rights from the likes of Logan Paul and Steven Crowder.
This kind of internal escalation rarely works because organisational politics and money get in the way of the best outcome. We need more radical arbitration solutions (and I don't just mean a supreme court, either). What are yours?
The Audiovisual Media Services Directive, under consultation in the UK until next week, will force platforms to vet visual content in the same way as broadcasters if it comes into force in September 2020. Digiday’s Lucinda Southern explains more.
The U.K. has appointed Ofcom to regulate video-sharing platforms, ensuring they act against harmful content.
A draft executive order threatens to give the Federal Communications Commission responsibility for overseeing social networks and their moderation policies. This Wired piece puts it into a broader context.
Mass shootings and executive orders have dragged the web's most consequential law back into the spotlight.
Tweets by African Americans are significantly more likely to be flagged as ‘offensive’ by an automated system, according to a new paper.
A new study shows that leading AI models are 1.5 times more likely to flag tweets written by African Americans as "offensive" compared to other tweets.
YouTube might have to provide more clarity around its moderation and ad-placement algorithms if a case brought by three LGBT creators goes to court.
YouTube denies it discriminates on the basis of sexual orientation or gender identity.
iFunny, the meme-sharing website, has all the ingredients to become the place from which the next mass shooter emerges: a hands-off founder/leadership team + volunteer moderators + hard-to-moderate, image-based content.
An extremist subculture festers unchecked on a meme-sharing site popular with young white men.
A gruesome tale of how a young Indian man’s TikTok video went viral and led him to strangle his friend, and what it all shows about the video platform’s moderation challenges.
In small-town India, TikTok videos are a new hate speech minefield. Thousands of videos are being removed as users exploit the wild growth of the social network.
Hussein Kesvani at MEL Magazine reflects on his time using 4chan as a teenager and how he got caught up in its toxic culture.
After El Paso, a generation of early 4chan users and lurkers are once again attempting to reconcile what they’ve wrought
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.