
📌 Optimising for eye contact

The week in content moderation - edition #22

Hello and welcome to this week's EiM.

If you’re in Canada at the end of May, 1) lucky you and 2) you may want to make a beeline for Toronto to take part in this cool two-day interdisciplinary symposium titled ‘what do we mean when we say content moderation?’ with a keynote from EFF's Jillian C. York, who writes thoughtfully about online privacy and people’s right to create. (If you are already going or end up attending, do let me know).

While we're on the topic, are there any content moderation conferences or events that you're going to this year? Share them with me and I'll include them in next week's email.

Thanks for reading — BW


Better in the flesh

Online hostility doesn’t happen in a vacuum. We know that physical feedback mechanisms, like eye contact, reduce antagonism. Body language is a powerful tool. Computer-mediated communication, to use academic parlance, is not only less fun but also means you search less for common understanding (2017 thesis). And so it makes sense to note the trend for stepping away from the screen and discussing difficult, complex topics face-to-face.

This week, Reach PLC, one of the UK’s major media houses, launched Britain Talks, which they dub ‘a plan to bring the nation back together over a nice cup of tea’. Their starting point, setting aside the grandiose idea that a couple of national newspapers with circulations of less than 500k can fix the UK, is that convening people for a hot drink and a lively, challenging discussion without anyone shouting at each other is something unique and worth striving for.

It’s worked in other places. In Germany, Zeit Online organised face-to-face meetups between 1,200 people from across the political spectrum. Following the success of that, a host of publishers across Europe, including the FT, La Repubblica and HuffPost, decided to set up Europe Talks in March this year, for conversations specifically about European identity and politics.

So, if we know in-person discussion is fruitful, how can we prove its link to having productive online conversations too? More research is certainly needed, but perhaps the answer lies in what Eve Pearlman calls ‘repatterning away from the reflexive name-calling so entrenched in our discourse that people on all sides don’t notice it anymore.’ (Her recent TED talk on the work of her company, Spaceship Media, is worth watching). Maybe face-to-face discussion (or a well-moderated online one, though they are harder to come by) can act as an intervention, a cycle breaker that resets our online habits that tend towards antagonism. Eve’s work to convene Trump and Clinton supporters in a Facebook Group certainly showed that it’s possible.

D-day for Europe Talks is 11th May, which is when all the participants get together to listen, to swap ideas and, you hope, become more empathetic towards one another. But we’re really just at the start of experimenting with how online and offline dovetail together.

The second wave reaction

Increasingly, the reaction to a platform policy or guideline change comes in two waves.

1. A straight report about what happened, from a tech reporter with a large online following, often with a tech/privacy/free speech angle

2. A piece on the implications for niche or underrepresented online communities, on a site with smaller reach, a few days after the news cycle

This was certainly the case with news about Instagram’s community guidelines changes to demote ‘inappropriate content’ (last week’s EiM). This week has seen a second tranche of stories about fans of dark art, sex workers and models who rely on the platform for promotion and referrals but who have now been left in limbo.

The learning: be sure to keep an eye out for the second wave of stories. It's always more important than the first.

Not forgetting...

One of the big stories this week: Twitter reinstated a bunch of tweets erroneously taken down, but how it happened is still unclear...

EFF’s Tweet About an Overzealous DMCA Takedown Is Now Subject to an Overzealous Takedown

Get ready for a tale as good as anything you’d see on television. Here’s the sequence of events:

Mike Masnick at Techdirt debunks a Guardian journalist’s Twitter thread on YouTube moderation

No, YouTube Cannot Reasonably Moderate All Content On Its Platform

A detailed report on how hate speech is dealt with around the world, from the Council on Foreign Relations

Hate Speech on Social Media: Global Comparisons

Violence attributed to online hate speech has increased worldwide. Societies confronting the trend must deal with questions of free speech and censorship on widely used tech platforms.

Dr Jennifer Beckett, who is researching moderators’ wellbeing, gives some tips to make moderators’ lives easier. More solutions-focused reporting like this wouldn’t go amiss

What it's like being an online moderator (and how to make their lives easier) - ABC Life

Online content moderators play a key role in shaping important conversations online and helping us all feel safe, but they deal with a lot of abuse. Here's how we can all make their lives that little bit easier.


Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.