Hello everyone. I’m writing this from Chopin airport in Warsaw, where I’ve been this week to take part in Outriders' two-day journalism event.
I did a workshop with some brilliant community-driven storytellers and learnt a boat-load about central and eastern European media organisations (including how some approach moderation).
This is the first EiM for two weeks — a bunch of deadlines last week put paid to my usual reading schedule — so this one includes more links than usual.
Thanks for reading — BW
Hard at work in Hamburg
It’s coming up to a year since the creation of the Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression (why do these groups always have such ridiculous names?), but it seems that the commission of policy experts from the US and EU is starting to get into gear.
A new report was published last week on ways to address hate speech, following two reports in the summer, and a number of the members were in Hamburg this week for an OSCE (Organization for Security and Co-operation in Europe) event (see tweet below).
Harlem Désir, the OSCE's Representative on Freedom of the Media, has a useful thread of the main quotes, the most eye-catching of which came from Barbora Bukovska of Article 19 on the need to apply human rights frameworks to the digital space.
PS I made a Twitter list of all the members of the working group, which I find useful to check back in on now and again. Feel free to use it too.
Two Hat, the Canadian tech company specialising in content moderation solutions, announced a $7.5m round of funding this week. The money will be used to grow the company's client base and further develop its AI models in 20 languages.
Tam Holdings leads round with participation from Taubman Capital and Makers Fund
There’s money to be had in moderation. That’s the gist of Business Insider Intelligence’s content moderation report. At least, that’s what I think it says: I’ve only read the summary because it costs an eye-watering $495. Big business in its own right.
This report analyzes a pressing issue currently facing social platforms — content moderation — and lays out how we expect the debate to evolve.
Giphy, the image search site, continues to host child sex abuse imagery, despite attempting to ban illicit content earlier this year.
Researchers found paedophiles were using Giphy to spread illegal material online
Pinterest has taken inspiration from dialectical behaviour therapy to help users looking for content relating to self-harm.
The company has gotten better at removing distressing content. Now it wants to help users feel better.
In the latest of several articles on this theme, Elizabeth King outlines how Twitter’s inconsistent enforcement of its policies allowed far-right trolls to get her account suspended.
When I woke up on Saturday, I found that my Twitter account was permanently suspended. As far as I could tell, individuals on the far-right had launched a campaign to mass-report my account and got
I try to keep EiM politics-free, but there’s an interesting nugget in this General Election read on how the creator of a popular Facebook group in Canterbury was invited to California for training on how to moderate the group effectively.
Some call it toxic, but a lively local Facebook group has brought young and old together to trade memes and debate
The Quint have a good Q&A with Mastodon founder Eugen Rochko about why it has fewer disruptive posters than other sites and why it's able to moderate its main instance with a paltry five people.
Mastodon is an open-source social networking service that allows users to either host their own “community” or join an existing one. The Quint spoke to Eugen Rochko, who founded the main server on which many Indians have signed up in the past week.
I don’t want to brag* but I got 6/6 on Facebook’s mini-quiz on its snazzy new page about enforcement. Try it and let me know what you got.
(* It's incredibly easy)
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.