3 min read

📌 The revolving door between regulators and the regulated

The week in content moderation - edition #62

Welcome to this week's Everything in Moderation, one and all. New subscribers from Dangerous Speech Project, the University of Michigan, ilsole24ore.com and The Telegraph, a particularly warm hello. I dearly hope you’re not bots.

I'm struggling to make time for webinars of late but I’m going to get to this Shorenstein Center panel on 'Commercial Content Moderation during the Pandemic' next Friday 8th May (1pm EDT). See you there?

This week’s newsletter is more tapas than main meal. If you have ideas, opinions or links, please do hit reply.

Thanks for reading and stay safe — BW

PS Just as I was pressing schedule on last week’s newsletter about TikTok being racist (EiM #61), it decided to ban Tommy Robinson and Britain First for promoting ‘hateful ideology’. Bad news: I didn’t realise until after it hit your inboxes — sorry about that. Good news: it at least shows EiM is on the pulse.

👽 Close encounters

Last year, it enlisted Clegg. Now Facebook has nabbed Close.

The Times announced this week that Tony Close, the director of content standards at media watchdog Ofcom, had accepted a role at Facebook. He has been put on gardening leave for three months.


It’s noteworthy because Ofcom is the proposed regulator under legislation designed to 'make the UK the safest place in the world to be online’. Close, according to the report, is said to have 'been heavily involved with drawing up rules to rein in the tech giants and protect the public’. Facebook gains greatly by having him on its payroll.

The Times also revealed that his move comes a month after another UK government policy expert also moved to the Big Blue. Something of a hiring spree.

Job moves like this are easy to dismiss as unsurprising and not newsworthy. After all, we’ve known since 2016 that many UK and US public officials make the move into policy roles at the tech giants (and occasionally vice versa).

But, with a full response to the Online Harms White Paper due any week now, it might be a good time for news organisations to look again at which government experts have recently updated their LinkedIn job titles.

😖 Hard of hearing

Sticking with the UK political system, I’d like to say that I’m very much with Heather in expressing my exasperation at politicians thinking that anonymity is the root cause of all online harm (see EiM #59).

If you can bear watching it, here's the Right Honourable Member in question doing his level best to look like he doesn’t know what he’s talking about during yesterday's Digital, Culture, Media and Sport Sub-committee.

Jason Kint, CEO of Digital Content Next, does a good job of rounding up the main points of the committee in this Twitter thread.

🏥 Public health platforms? (week 8)

Think of this week's links like padrón peppers: salty and delicious.

  • Services like Mechanical Turk and Hive have seen a sharp increase in users and work, including apps seeking help with moderation (Wired)
  • Facebook’s decision to remove anti-quarantine protests that go against the US government's social distancing policy is a political act, argues an attorney (OneZero)
  • FaceyB is also reopening moderation centres in Texas and San Francisco (BBC)
  • I love this piece by Evelyn Douek in The Atlantic about the land grab currently taking place by tech giants and, in particular, the following quote:
As a matter of public health, these moves are entirely prudent. But as a matter of free speech, the platforms’ unconstrained power to change the rules virtually overnight is deeply disconcerting.

⌚️ Not forgetting...

The Telegraph launched new community guidelines and a new moderation team this week. (I hope to have more detail for EiM subscribers in a forthcoming newsletter).

Steam, the $4bn gaming platform, helps to ‘normalize extremist ideology and hatred’ through its passive approach to moderation, according to an Anti-Defamation League report.

The ADL Calls Out Steam for Giving Extremists a Pass | WIRED

The nonprofit has identified hundreds of profiles that espouse hate, with little attempt from the gaming platform to stop them.

I’m telling you, Twitch user bans are the new soap opera. This time, an unfortunate nipple slip.

Calls for Alinity to be banned on Twitch following livestream accident | Dexerto.com

Twitch star Alinity Divine has found herself in the public spotlight yet again, after a slip-up on stream.

Sometimes, amidst the news of takedowns and account bans, it’s important to look at the long view and this piece from OpenGlobalRights does that.

Risks and responsibilities of content moderation in online platforms | OpenGlobalRights

The issue of content moderation in online platforms has been missing in debates on business and human rights, but these platforms are critical in exercising our freedom of expression.

Finally, Discord is hiring for a bunch of roles in their Trust and Safety team. Good luck to any potential applicants.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.