📌 How to shift the blame for toxic content
Hello everyone, especially subscribers across the beautiful continent that is Europe. It’s my birthday tomorrow and, really, #BrexitDay is the worst possible gift.
There have been two big moderation stories over the past week — PTSD disclaimers and fresh Oversight Board news — so I’ve focused on those in case you missed them.
Thanks for reading — BW
The big platform blame game
The two big stories in content moderation this week were more similar than we might have realised. Let me try to explain.
Last Friday, just after last week’s EiM newsletter hit your inboxes, the news broke that Accenture, the multinational professional services company, had forced moderators in the US to sign forms acknowledging that the job could cause post-traumatic stress disorder (PTSD).
Verge reporter Casey Newton, who broke the story, called it:
the most explicit acknowledgement yet from a content moderation company that the job now being done by tens of thousands of people around the world can come with severe mental health consequences.
The Financial Times later confirmed that the same was happening in Accenture's European offices too.
I’ve covered the debate about the mental health impact of moderation over the last 18 months (EiM #6, EiM #40). And so the quote from attorney Hugh Baran, who reviewed the Accenture documents, stood out. He said:
“[These companies are] trying to shift the blame onto workers and make them think that there’s something wrong in their own behaviour that is making them get injured.”
Hold that in your mind for a second.
On Tuesday, Facebook published an announcement about the bylaws and timelines that will govern the day-to-day operations of its 40-strong Oversight Board (I won’t go into detail here but you can read more via Vox).
The announcement, by Brent Harris, Facebook’s Director of Governance and Global Affairs, made me go back to Baran’s quote about blame and who actually owns responsibility for a decision. If you swap ‘workers’ for ‘board members’ and ‘injured’ for ‘banned’, his quote is as neat a description of what the Oversight Board might end up becoming as it is of the legal arse-covering at Accenture.
While I’m positive about Facebook providing a new means of challenging the way its community guidelines are enforced, we should also see the Oversight Board for what it is: a shinier, more respectable version of forcing people to take responsibility for problems that they didn't create and shouldn’t have to deal with the consequences of.
Our new Oversight Board overlord
As part of Facebook’s announcement this week, it also unveiled Thomas Hughes, former executive director of Article 19, as the new man in charge. He's got a tough job ahead, we can all agree on that, but I’m mainly worried that the British human rights expert doesn’t seem to have a Facebook account. I couldn’t find him on a quick trawl - perhaps he's one of those anonymous trolls?
Not forgetting...
If you enjoyed last week’s EiM, this Time piece on the outsized effect of Section 230 on free speech and US democracy is for you.
The Growing Threat to Free Speech Online | Time
Politicians and celebrities are lining up to repeal Section 230. That would be hugely detrimental to free speech online.
This report by Coda Story on the Uighur Muslims being banished from TikTok includes an anecdote about how a user was allowed to create a handle in Latin script but not in Uighur. Grim.
Xinjiang's TikTok wipes away evidence of Uyghur persecution — Coda Follows Up
Activists say Uyghur language is disappearing from the app, while evidence of the government’s crackdown on Muslims has been censored.
My previous employer looks at the legacy of Molly Russell, the British 14-year-old who took her own life after viewing online images of self-harm.
The tech giants pushed Molly Russell towards her death. Now she’s changing the digital world
A year ago last week, newspapers first reported the death of 14-year-old Molly Russell, who killed herself in November 2017 after seeing online images relating to self-harm and suicide. In the furore…
It seems a bit cheeky for Twitch streamers to ask for CVs from prospective (volunteer) channel moderators. In any case, this one application raised a smile.
Streamer reveals what Twitch mod applications really look like
Twitch streamer Klaatu1 treated the world to one of the best applications around, from Dan, who wanted to be a Twitch moderator.
This was new to me: Amazon Prime allows user-generated videos on its platform and pays people according to how many views their videos rack up. Unsurprisingly, there are some weird unmoderated videos floating about.
We watched weird stuff on Amazon Prime so you don’t have to
Nearly two-thirds of the videos on Amazon’s streaming service are user-generated content. Lots of them are ... odd.
Amazon’s Mechanical Turk is often used to train moderation models but, somewhat ironically, is full of content that violates its Acceptable Use Policy.
Amazon's Mechanical Turk: Horror Stories From the Inside
The workers of Mechanical Turk, Amazon’s on-demand micro-task platform, say they have encountered mutilated bodies, graphic videos of botched surgeries, and what appeared to be child pornography.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.