📌 Culturally conservative? Don’t bother applying
I’m sending this from Munich, where I’m attending Medientage and using my AS Level German to bad effect. Guten Tag to new subscribers from Essex University, Newsflare, Contribly and others.
I was in Cardiff with work for two days last week and then attended a very good product development academy (read my frustrated Twitter thread here), so this is my first newsletter for a couple of weeks. There’s a lot to get through, so I’ll get going.
Thanks for reading — BW
Adjust your idea of a moderator
The depiction of moderators (professional or not) is almost as one-dimensional as that of computer hackers (you know, silhouetted hoodies, hunched over a computer, binary code floating in the background).
Moderators are either faceless or have their hands over their faces in despair. They can be scruffy and often, for no reason, they’re female. And they’re often Southeast Asian (most likely from the Philippines, where the film The Cleaners is set and where many third-party moderation firms are based).
Peter Friedman has a different characterisation. The CEO of LiveWorld, apparently one of the early companies to provide moderation services for the likes of eBay and Apple, recently gave some thoughts on what makes a good moderator (scroll to the bullet points at the bottom).
He says good moderators have ‘emotional maturity’ and are ‘someone who already embraces social media’ so ‘they realise it is good and bad’. He also says they should be made to ‘feel strong and empowered’, and should be allowed and encouraged to make decisions. They are rarely culturally conservative. It’s almost a completely different picture to the traditional trope.
From my experience working with an in-house community and social media team moderating comments on a news site, I have to agree. The people who really took to the work and made an impact were thoughtful, questioning individuals who were aware of the implications of their actions, both for individual users and for the community as a whole. And the readers loved them as a result.
Either because of their sensibilities or because they read widely, they would suggest interesting stories and curious viewpoints in our morning conference.
Of course, they’d get frustrated and occasionally decide they’d had enough, but they weren’t faceless and they only rarely had their heads in their hands.
So yeah, don't fall into the faceless hacker trap.
The Super Bowl, but for content moderation
That’s the best way I can describe Content Moderation and the Future of Online Speech (COMO III). It’s organised by Kate Klonick, an assistant professor at St John’s University in New York who specialises in the governance of speech online. I'm pretty jealous of everyone who's there.
I’m planning to recap the main points in next week’s edition but in case you’re at a loose end, you can watch the stream here and follow along on Twitter using #como3.
Not forgetting
New America's Digital Deceit II report
The internet needs a new social contract rooted in transparency, privacy and competition in order to combat digital disinformation.
Not directly related to moderation, but New America, a research institute and tech lab, has enlisted two smart folks from the Omidyar Network and the Shorenstein Center to pen a report suggesting we need regulation to create a ‘new social contract’ between users and the big platforms.
YouTube backed itself into a corner with Logan Paul and PewDiePie - The Verge
By bringing back Logan Paul’s premium show, YouTube has fans asking whether or not the video company is consistent about how it enforces its rules
Another week, another inconsistent platform decision. This time it’s YouTube and PewDiePie, whose show was cancelled after he featured an antisemitic sign in a video. Now he’s complaining that Logan Paul, another YouTuber who did something equally stupid, got his show back and has got away with it.
New Technology to Fight Child Exploitation | Facebook Newsroom
One of our most important responsibilities is keeping children safe on Facebook.
Facebook has unveiled a lot of good work it has done over the last 12 months to spot underage nudity and curb child exploitation. It’s interesting, but a shame that only 141 people have watched its video explaining how it did it.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.