3 min read

📌 The cost of being on the front line

The week in content moderation - edition #6

I’m back after a holiday in beautiful Naples and can’t recommend its combination of gritty backstreet bars and great pizza enough. Get in touch if you want recommendations.

In the meantime, smart folks at The Telegraph, Mail Online and my favourite cocktail-making site, Make Me a Cocktail, have signed up to receive EiM.

Good to have you all on board — BW

It’s a war of words

We know that looking through thousands of posts a day and intervening in spats online can be draining. But little is really known about the actual long-term health effects of moderating content.

However, a woman who worked for one of Facebook’s third-party contractors is making a direct link between her time reviewing flagged posts and subsequent mental health issues. In a case brought in California this week, Selena Scola claims that after just nine months working as a content manager, she suffered anxiety, social anxiety and fatigue before being diagnosed with PTSD.

It comes just over 100 years after English physician Charles Myers wrote the first paper on shell shock, which wrongly concluded that the physical effect of shell blasts led to an inability to communicate and, in some cases, amnesia. Although research has come a long way, there’s still a gap in knowledge about how these symptoms persist and why. Scola’s case adds further questions.

It also reminded me of some research done in 2015 by Sam Dubberley at Eyewitness Media Hub. His study found that individuals working with graphic or distressing eyewitness content showed symptoms of PTSD, leading in some cases to a withdrawal from social activities and an over-reliance on alcohol.

Storyful, the social media verification agency set up in 2010, responded by setting up an Employee Assistance Programme to help staff deal with issues that might affect their wellbeing. Even if Scola’s case doesn’t lead anywhere, it’s something Facebook and its partners should consider more seriously too.

MMA site fighting back

It’s a couple of weeks old now but Cageside Seats, the SBNation site for male-dominated pro wrestling and MMA, has written an update about its community guidelines. While it covers a lot of the same ground as other policies, it's notable for its strongly worded warning against ‘comments dripping in a condescending tone, smug arrogance, and holier-than-thou language'. My favourite line? ‘It’s not ok to post a gif of Steve Austin flipping (another commenter) off’.

All community guidelines should be this entertaining.

A kinder Grindr

Certain kinds of discriminatory language have, I've been told, been a staple on Grindr since its launch in 2009. Users often deploy phrases like 'No fats, no femmes, no Asians’ or ‘black=block’ to warn certain groups away.

However, after a lawsuit was brought against the company for failing to tackle the problem, Grindr has updated its community guidelines and said that any such language will now be subject to review and removal.

Not forgetting

Marc Randazza Is Fighting To Keep Nazis And Trolls On Twitter In The New Speech Wars. Here’s Why.

The new speech wars are a push and pull between individuals, governments, and platforms.

Good, in-depth BuzzFeed profile of the lawyer behind Alex Jones’ defence, who makes his own defence of near-total free speech as an "unalloyed cultural virtue in the eyes of an American public who has seen unfettered free speech on the internet help flay the country apart". It's a long read but worthwhile.

Steam Discussions - Moderation Update

We want to give developers an update on what we’ve been working on related to community moderation and let you know about an upcoming addition to how we support your game communities on Steam.

Steam, the video games platform developed by Valve, has announced it will start reviewing posts flagged by users after games developers (some of whom have their own moderation teams) said they wanted help with shaping the discussion. (I’m not a regular Steam user but if you are and have thoughts on the change, reply and let me know.)

PayPal bans Infowars for promoting hate - The Verge

PayPal will no longer do business with Alex Jones or Infowars, saying the site "promoted hate and discriminatory intolerance against certain communities and religions."

They’re late to the party but PayPal has become the latest service to ban Infowars after its ‘extensive review’ found Alex Jones’ site ‘promoted hate and discriminatory intolerance’ (which is presumably worse than regular intolerance).

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.