3 min read

📌 Trump ban referred, ReadUp's reluctance and taking down TikTok

The week in content moderation - edition #96

Welcome to Everything in Moderation, your weekly newsletter about content moderation patiently watched over by me, Ben Whitelaw.

I'm delighted to welcome new subscribers from The Local, Membership Puzzle Project, Alhudood, Ringier and elsewhere. Good to have you on the team.

Onto this week's content moderation round-up — BW

📜 Policies - company guidelines and speech regulation

The Digital Services Act has seen an uptick in media coverage lately, and this very good Foreign Policy piece makes an interesting point that I had not considered: Emmanuel Macron will take over the EU presidency in 2022, the same year that he will seek a second term as France's President. If he can nail Big Tech, he has a lot to gain. Expect other politicians and public figures to be thinking along the same lines.

Russia has reportedly demanded that TikTok videos urging students to join protests in support of Alexei Navalny should be taken down. Russia’s communications watchdog, Roskomnadzor, has said the clips — which have been watched 50m times — could have 'dangerous consequences'. It's not the first time we've seen this: back in March, the regulator leaned heavily on platforms to remove coronavirus “fake news”.

British MPs yesterday grilled representatives from the dominant digital platforms about their role in the Washington riots. Monika Bickert (Facebook), Nick Pickles (Twitter), Derek Slater (Google) and Theo Bertram (TikTok) were all present but, according to the unusually sassy BBC, 'none had any radical new policies to offer'.

And, although Donald Trump has departed, his cries of political shadowbanning live on in Hungary: this week, Minister of Justice Judit Varga accused Facebook of inhibiting “Christian, conservative, right-wing opinions”. Same old, same old.

💡 Products - features and functionality

ReadUp — the social reading platform where users must have read the article before they contribute — has written about its content moderation policy, or rather its lack of one. In the blog, co-founder Bill Loundy notes how he has never had to remove a user contribution because the technology 'obliterates the need for moderation in the first place'. I use ReadUp most mornings — it is slower and more considered than other platforms — but its moderation policy doesn't feel as simple as Bill makes out. I hope I'm wrong.

Former Irish Times editor Conor Brady has raised €600,000 to build out CaliberAI, an Irish start-up building artificial intelligence tools to identify harmful content. Interestingly (at least for me, having worked in newsrooms for a decade), he is starting with a content moderation lost cause: news outlets.

💬 Platforms - dominant digital platforms

Facebook, as per Evelyn Douek's suggestion, suddenly realised it had an accountability mechanism and referred Donald Trump's indefinite suspension to its Oversight Board. No date has been set for the response and, frankly, it's a no-brainer: if the Board says 'make Donald post again' (unlikely), the threat of WW3 will loom large but it will be on the board members' collective conscience, not Zuckerberg's. If the decision is upheld, the Board worked and Facebook can steer the conversation away from what took it so long to set the Board up in the first place. Easy.

Tim Cook, Apple's CEO, has opened the door for Parler to return to the App Store. In an interview with Fox News that goes to show the importance of moderation for apps, Cook said: "If they get their moderation together, they would be back on there."

👥 People - those shaping the future of content moderation

Mark Weinstein, the CEO of privacy-conscious network MeWe (mentioned in last week's EiM), must have had some sleepless nights of late. 800,000 people — many of them Trump-supporting — signed up in the week ending January 12, leading him to cry 'Have you ever tried to moderate 15 million people?'. It was hard not to be sympathetic.

In this new AP interview, Weinstein tries to assuage fears that his platform is no better than Facebook, claiming — much like ReadUp — that the platform's 'structural design prohibits the amplification' and stating "We have absolutely no censorship for good people who follow our rules".

I'm only somewhat reassured.

🐦 Tweets of note

  • "This is our lived experience every single day online" - lawyer and Oversight Board Member Nighat Dad's thread is filled with hard truths.
  • "Trying to get a sense of the European landscape for harmful content moderation startups. If you know of any please let me know!" - VC Matt Wichrowski can see where the wind is blowing.
  • "I think demanding platforms not algorithmically promote disinfo is important" - junior research assistant Joey Schafer on why regulation alone won't have the intended effect that we seek.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.