
📌 'One Health' for platforms, Oversight Board news and Trump takes aim

The week in content moderation - edition #90

Welcome to Everything in Moderation, your weekly newsletter about content moderation written in blind faith by me, Ben Whitelaw.

A particular welcome to the flurry of new subscribers who joined over the past week. If you have time, drop me a line and say hi/let me know what you think of EiM.

PS: I’m taking a few days off this week so today's newsletter was put together on Wednesday afternoon — BW


📜 Policies - company guidelines and speech regulation

The Covid-19 pandemic has introduced more people (including me) to One Health, the concept that humankind is inextricably linked to the environment and the animals within it. It advocates for a holistic view of health and for addressing systemic problems, rather than piecemeal ones.

According to a recent article by Heidi Tworek, associate professor of public policy and international history at the University of British Columbia, there is a lot that content moderation can learn from the idea. Like diseases, hate speech and other types of undesirable information do not stop at borders and, as Tworek notes, ‘many of the problems now coming to roost in North America and Europe have happened elsewhere’, notably in Myanmar and the Philippines.

With the approval and take-up of vaccines likely to be a big story over the coming months, and a large headache for the dominant digital platforms, there’s more scope for this kind of thinking. I’m pleased to say that the UK government has agreed with the major platforms that anti-vaccine misinformation should be removed ‘more swiftly’, although that won’t be enough if the UK is the only country to take action.

💡 Products - features and functionality

In a parallel universe, I’d have loved to study computer science, so these slides from Amy X. Zhang, assistant professor at the University of Washington’s Allen School of Computer Science & Engineering, on considerations for building automated moderation models are great.

In them, Amy flags the need to know where training data comes from and the importance of understanding who is doing the labelling, as well as highlighting tons of great resources and links.
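To make that concrete, here’s a minimal Python sketch (mine, not from the slides) of what carrying provenance with training data might look like. The LabelledExample fields and the labels_by_annotator helper are hypothetical, but they show why recording the source and the annotator alongside each label makes questions like ‘who labelled this, and do the labellers disagree?’ answerable at all.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record for one labelled training example; the field
# names are illustrative, not taken from the slides.
@dataclass
class LabelledExample:
    text: str
    label: str          # e.g. "hate_speech" / "ok"
    source: str         # where the text was collected from
    annotator_id: str   # who applied the label

def labels_by_annotator(examples):
    """Tally how often each annotator used each label -- a crude first
    check for systematic disagreement between labellers."""
    return Counter((ex.annotator_id, ex.label) for ex in examples)

examples = [
    LabelledExample("you lot don't belong here", "hate_speech",
                    "forum_dump_2019", "ann_01"),
    LabelledExample("have a lovely day", "ok",
                    "forum_dump_2019", "ann_02"),
]
print(labels_by_annotator(examples))
# Counter({('ann_01', 'hate_speech'): 1, ('ann_02', 'ok'): 1})
```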

If nothing else, it's reassuring to see that this is how the future creators of moderation product (aka 'moduct') are being taught.

💬 Platforms - dominant digital platforms

Facebook's Oversight Board is kind of, almost, finally up and running. On Tuesday, it announced the selection of six cases (from more than 20,000 put forward) that will be looked into by five-member panels drawn from its board members. Each panel must reach a decision within 90 days, and Facebook will have to act on it, whether it likes the outcome or not.

There’s something for everyone: visible nipples, quotes from historical figures, screenshotted tweets; violations of policies on hate speech, adult nudity, and violence and incitement. Harvard lecturer Evelyn Douek called it ‘the greatest hits of Facebook content moderation controversies’.

Meanwhile, the Real Facebook Oversight Board, the part-theatre, part-activist group, announced three cases of its own that it will investigate, including Steve Bannon’s continued presence on Facebook. Let's see who does a better job.

👥 People - those shaping the future of content moderation

Call it his last hurrah. Call it the actions of a desperate man. Either way, Donald Trump’s threat to veto a defence spending bill unless Section 230 (the US law that shields online services from liability for what their users post) is repealed puts content moderation in a whole different realm.

By making the announcement, the outgoing President has tied the repeal to this year’s National Defense Authorization Act, which grants pay rises to thousands of US servicemen and women, and that’s not something you want to stand in the way of in the United States.

🐦 Tweets of note

  • “Pinterest is run by bullies and cowards” - Ifeoma Ozoma, who worked on the Pinterest policy initiatives I’ve covered previously in EiM (#33, #43), shares court papers showing the company was a toxic place to work.
  • “Why news is increasingly at the centre of international platform regulation efforts (and why it shouldn’t be)” - Melbourne research fellow James Meese’s paper is now free to access.
  • “All his posts about a high-profile death penalty case had vanished with no notification” - we’ve seen cases like this before, but this incident involving a freelance journalist with 150,000 Facebook followers has perturbed Rachel Thomas.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.