The news cycle after the tragic shootings in Dayton and El Paso naturally moved on to 8chan and content moderation this week. I’ve included some of the best reads on the matter below.
I’m on holiday next week and will likely only send an EiM if it’s raining in Cornwall and I run out of board games to play. Feel free to send me suggestions (I’m a Ticket to Ride/7 Wonders kind of guy, since you asked).
Enjoy your weekend — BW
The moderators fighting for their mental health
In 2017, it was Henry Soto and Greg Blauert. Last year, Selena Scola led the way. Now, in 2019, a new group of former moderators are bringing a claim against a tech company for failing to protect their physical and mental health.
Last week, an Irish law firm announced that it was working on behalf of a group of content moderators to sue Facebook for 'distress caused by their methods of work and the effect that has had on their mental and physical health’. (If you’re wondering why Ireland: Facebook’s HQ for Europe, the Middle East and Africa is in Dublin, as are outsourcing firms such as CPL.) With previous cases coming out of the US, this new development is significant.
The physical and mental impact of moderation has rightly become part of the conversation over the last year. This piece from Jennifer Beckett at the University of Melbourne goes into more detail on why, but you really only have to read the account of Chris Gray, a Facebook moderator at CPL, to understand the toll the job takes.
For me, it boils down to this: people aren’t disposable. You can’t give well-meaning folks a job that knowingly affects their health without providing ongoing support and creating a culture where utilising that support isn't frowned upon. End of.
Elizabeth Farries of the Irish Council for Civil Liberties went one further than that when she said, on the back of last week’s news, that content moderators should be afforded the same care and protection as social workers and doctors.
It’s an idea that won’t sit well with many (my partner is a doctor and I have many social worker friends) but one that, depending on the outcome of the case against Facebook in Dublin’s courts, might be ripe for review.
This good read in Salon looks at 8chan in the wake of last week’s US shootings and says such forums will almost always devolve into fascist or white nationalist garbage fires.
8chan was not unique: selection biases and online psychology mean all unmoderated forums will devolve
El Paso and Dayton shootings put focus on social media’s handling of white supremacist content - Vox
In light of mass shootings in El Paso, Texas and Dayton, Ohio over the weekend, Recode reviewed how Facebook, YouTube, Twitter, and 8chan handle content and users that promote violence and spread hateful ideologies.
This Politico piece is also noteworthy if you want to be reminded how difficult this all is.
President Donald Trump's call for increased scrutiny on violent extremism online runs up against fringe websites that may be hard to pressure.
NetzDG is having unintended consequences: American Twitter users are changing their location to Germany so they don’t have to see fascist posts
Twitter users are changing their location setting to Germany to clear their feed of white supremacist tweets
Due to Germany's stringent hate-speech laws, Twitter is much more diligent about removing racist and anti-Semitic tweets from timelines there.
Instagram mistakenly hid content from St Lucia’s carnival after two hashtags were muted in error.
The popular platform says a "mistake" was made by hiding posts made by festival-goers.
A managing director from Dorset isn’t happy with Glassdoor’s community guidelines — he was called a racist by an anonymous poster
Glassdoor has defended its processes that enable posters to review their employer following claims in this weekend’s Sunday Telegraph that the site is “beset with problems”.
The Washington Post has written several pieces about Section 230 (see last week’s EiM). This is the latest op-ed.
Critics claim that the law requires sites like Facebook and Twitter to be politically neutral. That’s not what the law says — if it did, no one would like the results.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.