This week saw the first of the big content moderation scoops of 2019: an exposé by Casey Newton about the difficult conditions under which Facebook moderators in Phoenix work. On Tuesday he revealed in his newsletter that 700,000 people had read the story, which is substantial for any investigation, let alone one done by a still-young tech site. It tells me that people are starting to care about what happens in the pipes after they hit ‘post’.
Thanks for reading — BW
Is it time for moderators to organise?
In 1831, trouble flared up quickly in a small town in south Wales.
William Crawshay, one of the two main employers in Merthyr Tydfil, had reduced the wages of his ironworkers. A parliamentary act designed to stop the likes of Crawshay changing how people were paid without negotiation had also just been defeated in the House of Lords. And people were angry. They wanted better wages and more jobs.
They marched through the streets of the town with banners soaked in cows’ blood, retook goods that had been seized by debt collectors and persuaded men in the local mines to stop working. They marched to Castle Inn to demand that local magistrates reduce the price of bread, only for Crawshay and another employer, standing on the balcony, to reject their demands. Four days later, the uprising was quelled and the army took over, but it was a protest that would lay the foundations for many others to do the same and form unions.
I thought about the Merthyr Rising, something I was briefly introduced to at school, as Casey's story about the Phoenix moderators broke this week. I thought about the low salary ($28,800, or £21,739) and the high stakes (you can be fired for making just a few errors in a week). I noted the working conditions (nine minutes of wellness time and queues to go to the toilet) and the trauma (PTSD-like symptoms with little support). And there was the fact that one man (Mark Zuckerberg, arguably a modern-day William Crawshay) had it within his gift to change the status quo but continued to do nothing about it.
Newton's story adds to the momentum building around the idea that working conditions need to improve for content moderators across the globe. Robyn Caplan, a researcher at Data and Society, has argued that ‘better labor standards’ are within the tech companies’ power to change and can be improved quickly. YouTube said last year that it would limit moderators’ work to four hours a day, although the terms of that contract weren’t made clear. Reuters also found this week that moderators at Genpact, a Facebook outsourcing partner based in Hyderabad, India, are paid $6 a day and took work home to meet targets.
Would moderators go as far as unionising to make it happen? Lizzie O’Shea, the Australian human rights lawyer and broadcaster, made the case in 2017 that unions are important as ‘technology capitalism heavyweights [look]... to impose themselves on other segments of society’. A lot has happened in the 18 months since then and, arguably, the case for content mods to self-organise or join the likes of the Tech Workers Coalition is stronger than ever.
If they decided to, they'd have scale in their favour. There are some 30,000 people working on trust and safety alone at Facebook (many outsourced) and over 10,000 doing the same at YouTube. (For context, there are 21 unions in the UK with fewer than 40,000 members.) But, with most tech companies based in the US, where trade unions tend to be weak, it's unclear where a central authority would lie.
For now, Facebook have said that they will conduct site visits of outsourced centres and review their contracts with those third parties. They will also host a summit of their vendors in April as a way of telling the world's media that the conditions of their workers aren't so bad. But we know differently.
The UK's own NetzDG?
The UK government is ramping up the rhetoric around tech regulation. In an interview with Business Insider, digital minister Margot James gave the most detail so far about how they might fine tech companies that host harmful content. (Thanks to Rob for sending this over)
The proposed regulator will draw on Germany’s NetzDG hate speech law, which I wrote about in January and which is considered to have been a moderate success, but is expected to also cover grooming and self-harm. The full policy paper will come out next month, so more will be revealed then.
YouTube’s difficult week and advertising exodus (discussed in last week’s EiM) has led to a policy change which means that comments will be off by default on videos of children (Thanks to Steve for flagging)
Mike Masnick from Techdirt has also written thoughtfully about last week’s YouTube controversy.
TikTok launched a new educational video series about their community guidelines and setting up messaging and commenting controls. Good to see education used in conjunction with moderation.
Matt Haughey, who ran Metafilter, says that moderating the 3,000 comments a day on his popular Reddit-style site was hard enough
Rotten Tomatoes turned off the ability to leave comments before a movie is released, to stop people review-bombing an upcoming film. It's a weirdly unpopular decision.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.