Content moderators = social workers
The news cycle after the tragic shootings in Dayton and El Paso naturally moved on to 8chan and content moderation this week. I've included some of the best reads on the matter below.
I'm on holiday next week and will likely only send an EiM if it's raining in Cornwall and I run out of board games to play. Feel free to send me suggestions (I'm a Ticket to Ride/7 Wonders kind of guy, since you asked).
Enjoy your weekend – BW
The moderators fighting for their mental health
In 2017, it was Henry Soto and Greg Blauert. Last year, Selena Scola led the way. Now, in 2019, a new group of former moderators are bringing a claim against a tech company for failing to protect their physical and mental health.
Last week, an Irish law firm announced that it was working on behalf of a group of content moderators to sue Facebook for 'distress caused by their methods of work and the effect that has had on their mental and physical health'. (If you're wondering why Ireland: Facebook's HQ for Europe, the Middle East and Africa is in Dublin, as are outsourcing firms such as CPL.) With previous cases coming out of the US, this new development is significant.
The physical and mental impact of moderation has rightly become part of the conversation over the last year. This piece from Jennifer Beckett at the University of Melbourne goes deeper into why, but you really only have to read the account of Chris Gray, a Facebook moderator at CPL, to understand the toll the job takes.
For me, it boils down to this: people aren't disposable. You can't give well-meaning folks a job that knowingly affects their health without providing ongoing support and creating a culture where utilising that support isn't frowned upon. End of.
Elizabeth Farries of the Irish Council for Civil Liberties went one further than that when she said, on the back of last week's news, that content moderators should be afforded the same care and protection as social workers and doctors.
It's an idea that won't sit well with many (my partner is a doctor and I have many social worker friends) but one that, depending on the outcome of the case against Facebook in Dublin's courts, might be ripe for review.
Not forgetting...
This good read in Salon looks at 8chan in the wake of last weekās US shootings and says such forums will almost always devolve into fascist or white nationalist garbage fires.
Why unmoderated online forums always degenerate into fascism | Salon.com
8chan was not unique: selection biases and online psychology mean all unmoderated forums will devolve
Recode reviewed how these platforms handle content that promotes violence and spreads hateful ideologies.
El Paso and Dayton shootings put focus on social mediaās handling of white supremacist content - Vox
In light of mass shootings in El Paso, Texas and Dayton, Ohio over the weekend, Recode reviewed how Facebook, YouTube, Twitter, and 8chan handle content and users that promote violence and spread hateful ideologies.
This Politico piece is also noteworthy if you want to be reminded how difficult this all is.
How do you solve a problem like 8chan? - POLITICO
President Donald Trump's call for increased scrutiny on violent extremism online runs up against fringe websites that may be hard to pressure.
NetzDG is having unintended consequences: American Twitter users are changing their location to Germany so they don't have to see fascist posts.
Twitter users are changing their location setting to Germany to clear their feed of white supremacist tweets
Due to Germany's stringent hate-speech laws, Twitter is much more diligent about removing racist and anti-Semitic tweets from timelines there.
Instagram mistakenly hid content from the St Lucia carnival after two hashtags were muted in error.
Instagram Apologizes for Blocking Caribbean Carnival Content - VICE
The popular platform says a "mistake" was made by hiding posts made by festival-goers.
A managing director from Dorset isn't happy with Glassdoor's community guidelines – he was called a racist by an anonymous poster.
Glassdoor defends anonymous posters following claims of offensive postings | Recruiter
Glassdoor has defended its processes that enable posters to review their employer following claims in this weekend's Sunday Telegraph that the site is 'beset with problems'.
The Washington Post has written several pieces about Section 230 (see last week's EiM). This is the latest op-ed.
The stubborn, misguided myth that Internet platforms must be āneutralā - The Washington Post
Critics claim that the law requires sites like Facebook and Twitter to be politically neutral. That's not what the law says – if it did, no one would like the results.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.