
📌 What platforms did next, China's new laws and a fellow comment reader

The week in content moderation - edition #95

Welcome to Everything in Moderation, your weekly newsletter about content moderation carefully sieved through by me, Ben Whitelaw.

Newcomers from Coadec, Queensland University of Technology, Witness, Taso Advisory, and others — thanks for subscribing.

This week was all about the fallout from the riots in Washington. I feel like I've read hundreds of pieces on content moderation, so I've tried to bring you the most important updates at the top of the newsletter (while tweeting some of the best opinion pieces). Your normal EiM follows underneath — BW


🇺🇸 How the platforms dealt with Donald (part 2)

It feels like a lifetime ago now but it's actually less than a week since Twitter called time on US President Donald Trump. And, even this soon afterwards, it's interesting to note the onward chain reaction of one decision by an influential but still private US-based tech company.

First, the big players. Google, Apple and Amazon made moves to prevent the distribution of Parler, where many of Trump's supporters had already congregated and were heading following his suspension. YouTube removed content from Trump's channel and suspended him from uploading for a minimum of seven days. Airbnb, which rarely has to limit users, started reviewing reservations in Washington and banned users who were involved in the attack on the Capitol, while Twitter followed up the ban by making changes to its civic integrity rules and launching a new strikes policy (three strikes for a 12-hour lock, four for a one-week ban, five for a permanent ban).

And it wasn't just them; lots of comparatively smaller sites were compelled to act too. Peloton prevented users from using the #StopTheSteal hashtag because it violated its 'hateful, offensive and obscene speech' policy. NextDoor moderators finally succeeded in pressuring the neighbourhood network to class QAnon as a terror group. And then there was MeWe — the privacy-focused site suddenly inundated with displaced Parler and Gab users — whose exasperated founder and CEO exclaimed: 'Have you ever tried to moderate 15 million people?'. Alex Kantrowitz at OneZero says the rush to do something, anything, is coming for Spotify and Substack too.

Jay Pinho, who writes the Networked newsletter, wrote that the events of last week mean we're all looking at a 'larger, more chaotic, but still, very much ad-hoc amalgamation of disparate content policies forged in the wake of increasingly horrifying behaviour.' That's certainly what it feels like to me.

But at least it will make writing, and hopefully reading, EiM very interesting over the coming months.

📜 Policies - company guidelines and speech regulation

Trump's ban from Twitter didn't just have implications for the platforms: European leaders cleverly seized on the opportunity to push the case for European legislation. German chancellor Angela Merkel — who in 2019 distanced herself from Trump's tweets following posts directed at Democratic congresswomen — called the banning of an elected president by a private tech company 'problematic'.

Some 20 years after China published its Regulation on Internet Information Service — which governs how citizens use the web — the nation-state is updating its rules, according to the South China Morning Post. The draft contains a broadened definition of harm, which now includes information that 'disrupts financial markets order'. It is open for public feedback until 7 February.

💡 Products - features and functionality

As regular readers of EiM know, I have a soft spot for reader comments on websites (and am against the arbitrary removal of comment spaces). In journalist Sophie Haigney, it seems I have a fellow fan. Haigney has written a piece for The Guardian about how she found solace in the contributions below online recipes during Covid-19 lockdown. She explains:

'I originally came to reading recipe comments in large part for their use value. But now I also find myself scanning them for something else – the glimmers of the personal, perhaps, the funny and sometimes snarky voices of the commenters and the running conversations.'

Me too, Sophie, me too.

💬 Platforms - dominant digital platforms

I missed this a few weeks back but it bears mentioning: Facebook, as if it didn't have enough to contend with right now, has been criticised by the largest Armenian non-profit for allowing users on the platform to deny the Armenian genocide, which took place in 1915 and killed 1.5m people. The platform banned Holocaust denial in October last year.

👥 People - those shaping the future of content moderation

Although the winds are blowing firmly in favour of strong government regulation of big platforms — particularly in Europe and the US — Australia, curiously, is holding off.

In comments made this week, its communications minister, Paul Fletcher, made clear that, as private companies, Facebook et al have the right to enforce their terms as they wish (although the ruling coalition is pushing for greater transparency, as most governments are).

For now, the public consultation on its draft online safety bill, released in December, goes on. Until a controversial takedown or ban forces Fletcher and his fellow ministers into a change of heart.

🐦 Tweets of note

  • 'If white supremacists have the time to build pipe bombs and gallows, let them build their own cloud service.' - Earthseed founder and former Pinterest employee Ifeoma Ozoma with a zinger.
  • 'This has been one of the most momentous days in content moderation' - author and Demos research director Carl Miller sums it up in one tweet.
  • "Hiring humans hurts profitability and contradicts their worldview" - US senator Brian Schatz with a timely reminder to hire real people as mods.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.