📌 A cheatsheet for content regulation, the disappearing dislike count and a TikTok moderator speaks out
Hello and welcome to Everything in Moderation, your content moderation compilation, delivered every Friday. It's curated and written by me, Ben Whitelaw.
This week contains the usual mix of takedown drama, well-meaning platform announcements and people attempting to effect change under difficult circumstances. In some ways, it's an all-too-familiar set of challenges, and yet there are bright spots — notably the work of Carnegie UK and the testimony of Gadear Ayed — that give me some hope.
To the small busload of new subscribers and to everyone who regularly reads, I hope it's a useful round-up — BW
📜 Policies - emerging speech regulation and legislation
With their long history of working together (until recently, at least) and the short distance between London and Brussels, you might have expected the UK and the European Commission to have compared notes on how they plan to regulate US tech platforms. But, as this piece by Politico reveals, there has been nothing of the sort. Caroline Dinenage, the British minister who oversaw the Online Safety Bill until her sacking in September, had no dealings with the EU for over 19 months (!), leading to suspicion and scepticism on both sides.
There's also good detail on the UK delegation — including MP Damian Collins (EiM #129) — that visited EU officials on Monday to find out more about the Digital Services Act. It would have been a surefire sign of progress had senior Brussels officials not already been scheduled to watch Frances Haugen's testimony to the European Parliament. Expect the frosty relationship to continue.
On the topic of UK regulation, Carnegie UK has warned that the Online Safety Bill is "too complex to be good regulation" and has offered a set of amendments that would "simplify and strengthen" it as it goes through pre-legislative scrutiny. Among other things, its paper recommends a renewed focus on the role of platforms in the "outcome of harm", rather than simply on content takedown, as well as specific risk assessments for freedom of expression and privacy. For all the negativity around the Bill, this feels like progress.
If, like me, you're struggling to keep up with the raft of online speech regulation bills in play across the world, Reset Tech has produced a useful summary, hidden in its Online Safety Bill written evidence. Worth bookmarking. (Thanks to Daphne for spotting it.)
💡 Products - the features and functionality shaping speech
The dislike count on all YouTube videos will be hidden as part of the company's effort to prevent creators from being harassed, it was announced this week. Research by the platform found that the visible count facilitated "dislike attacks" and disproportionately affected smaller creators. If you have a YouTube channel and want to know how many dislikes your DIY tutorials or FIFA 'incredible goals!' compilations are racking up, you can still see the figure via YouTube Studio.
💬 Platforms - efforts to enforce company guidelines
Indiegogo will review all campaigns before they go live on the platform in a fresh attempt to stop people scamming and defrauding backers out of their money. The move was announced in a company blog post by Will Haines, VP of Product and Customer Trust, which also revealed that the company has created an Internal Review Board to oversee its more egregious Trust and Safety decisions. It will also partner with GoFundMe to create the Crowdfunding Trust Alliance, which will share knowledge and best practice (although not yet with Kickstarter, the biggest crowdfunding platform).
Twitter has locked a Canadian professor's account for posting that Justin Trudeau should be "tarred and feathered" — a medieval punishment in which the culprit was covered in pine tar and bird feathers — over the government's slow progress in vaccinating children. Amir Attaran's account was locked last week on the grounds of abuse and harassment, and he must delete the offending tweet before it is restored. For the sheer madness of this scenario, this is my read of the week.
Platforms like Parler and Gab can be understood as "an attempt to return back to the libertarian politics of early social media" when "more stringent moderation policies" didn't exist. So argues Simon Copland in a blog post for the Global Network on Extremism and Technology, which also has some (in internet terms) almost prehistoric quotes from Reddit co-founder Alexis Ohanian.
👥 People - folks changing the future of moderation
Most of the moderators who have spoken out against platform policies were employed by Facebook, the latest being Isabella Plunkett (EiM #112) back in May. This isn't particularly surprising, partly because of the harm the company has caused but also because Facebook has long employed the largest (though still not big enough) content review team on the planet. The more people who have worked there, the more likely we are to see ex-staff go public when they leave.
As other companies scale their moderation operations, they too risk former employees, like Gadear Ayed, speaking out about what they experienced. Ayed, who is Iraqi, worked as a moderator at TikTok for around six months from December 2020 and spoke this week to ABC about her experience, which is sadly all too familiar. She explains how she was:
- Part of the 50-strong 'Israel team' based in London (thousands of miles from Tel Aviv) and overseeing content from both Israel and the Palestinian territories.
- Asked to categorise content from Palestinian resistance groups and Israeli opposition groups as "terrorism" and remove it.
- Told not to remove videos of women being dragged along the street during the Sheikh Jarrah protests because "there's not enough brutality".
We learn more about the moderation processes of the large platforms from the accounts of ex-employees than from anything else. Although these accounts might not get the media coverage of Frances Haugen's testimonies, they are vital and need reporting more than ever.
Bonus read: I wrote in last week's newsletter (EiM #135) about the ever-shifting situation in Israel and Palestine when it comes to platform policy. Worth catching up on if you missed it.
🐦 Tweets of note
- "everything is a content moderation issue" - Evelyn Douek on Spotify's decision to use Travis Scott as the visual on two of its biggest playlists.
- "lost on appeal because teens can't handle seeing the brutal reality of the civil rights movement" - Micah Gelman, the Washington Post's head of video, responds to an unlikely video being taken down from YouTube.
- "a lot of it's just.... boring technical shit, or incredibly graphic descriptions/pics of gore/murder from the folks on the content mod side" - Gizmodo reporter Shoshana Wodinsky on the realities of sifting through Frances Haugen's leaked documents.