Hello and welcome to Everything in Moderation, your weekly newsletter on content moderation, sent from a surprisingly sunny UK.
A special buongiorno to folks who came here via Valerio’s excellent weekly newsletter on media, digital strategy and product development. Check out the beautiful logo and subscribe to practise your rarely-used Italian.
I've taken the opportunity presented by the influx of smart product thinking folks (👋🏽) to explain why I'm hopeful about the tools and systems being built to improve the quality of online speech and what that means for EiM.
Stay safe and thanks for reading — BW
💡 Power to the product people
When I sent the first edition of Everything in Moderation almost two years ago, I set out to look at ‘the policies, people and platforms’ that make content moderation on the web happen.
I chose this focus and framing because 1) they were the aspects of moderation I thought other would-be subscribers might find interesting and 2) my experience is predominantly about these three things in the context of UK newsrooms.
75 editions of the newsletter later, I continue to cover these three aspects of content moderation: I write about the Santa Clara Principles and transparency reports, I note the plight of moderators and feature Reddit superusers, and very often (perhaps too often) I look at Twitter’s latest takedown or how Facebook is using 20 experienced law and technology experts to legitimise its content moderation decisions.
Increasingly though, I believe that I’m missing something in my coverage of content moderation: product. And I want to quickly walk through two reasons why I’ve come to that conclusion now.
1/ A move toward friction
A humane tech movement is helping the dominant digital platforms see that more posts, likes and shares might mean more ads (and more revenue) but aren’t always better for society as a whole. Friction is being introduced to products to curb unhelpful behaviour: Instagram now uses AI to ask users to reconsider posting a comment that looks like bullying and, in June, Twitter began asking users if they would like to read an article before retweeting it.
As David Ryan Polgar, founding advisor on TikTok’s content advisory council, noted recently in a blog post:
“The future of social media depends on friction: finding innovative ways to nudge people away from their impulses and move them toward a more intentional form of communication.”
Of course, platforms still do dumb stuff, but these types of nudges — exploring ideas of scarcity, choice and the impact of decisions — have huge potential when done right.
2/ A movement of product folks who care about online speech
Maybe they simply lacked visibility, or perhaps I wasn’t looking hard enough. Either way, it feels like there is a wave of engineers (but also designers and product managers) interested in building tools to reduce abuse and support healthy online conversation.
In the past few weeks alone, I've come across Mac Reddin, who built Modagrate and works at Commsor, and Max Wang, the former Facebook engineer who posted a video saying that big Blue hadn’t been 'paying enough attention to the raw human needs of the people who use our platform.' They join the likes of Tracy Chou, formerly of Quora and Pinterest and currently building Block Party, and Devon Zuegel at GitHub, who have both been thinking about the unintended consequences of online abuse for many years. There are countless others I know I've missed or not yet come across (I'd love your recommendations, especially in non-Anglosphere countries).
Why does this renewed focus on mod product ('Moduct' anyone?) matter? Put simply, the content we get is a direct result of the tools we use and the patterns of behaviour that are encouraged.
We've seen it time and time again. Ask someone to write in 140-character bites and they will struggle to be nuanced, making debate difficult. Rank content by opaque engagement-based algorithms and users will become more extreme. Make it super easy to forward content from one group to the next and, unsurprisingly, personal data can quickly end up in the wrong hands.
It’s one of the reasons why I believe commenting on news sites is a nightmare — for so long, the products on the market were poor, underdeveloped and perpetuated antagonistic below-the-line discussion (I'm looking at you, Livefyre and Disqus). News product teams didn't demand better and wouldn't build better systems themselves.
So I'm glad about the rise of friction tech and for the engineers who care about online speech. And, in the future, probably after a summer break later this month, I’ll add product to the mix of topics that Everything in Moderation covers. (If you have specific thoughts on what this could look like, drop me a line.)
🌱 Community or scale? VCs' difficult dilemma
On the subject of building products with community in mind, Sarah Drinkwater — formerly at Google Startup UK and now at Omidyar Network — has written a great piece about a trend among VCs (venture capitalists) on Twitter right now: startups centred around community.
Sarah rightly notes the:
“tension between what scale demands and what most, if not all, communities need. Simply put, the strength of community is usually (if not always) weakened as the number of people in it grows”.
To help, she also calls i) for the introduction of VPs of Community and Chief Community Officers and ii) for better internal metrics that show how people are coming together, learning from each other and benefiting from one another’s company and expertise, not just seeing content or clicking on ads.
I agree with everything she says and recommend reading her piece in full.
🐚 Not forgetting...
NetEase, China’s major music streaming app that allows users to leave comments under songs, has promised to hire more content moderators to ‘help create a positive and friendly atmosphere’ after comments mocking the popularity of depressing tunes took off.
Music streaming app NetEase to step up content moderation after trolls target depressed people during pandemic | South China Morning Post
NetEase said it will take measures to regulate comments posted on its music streaming platform after receiving criticism that the app had become too ‘depressing’
A Sudanese militia group that clearly breaks Facebook’s Community Guidelines remains on the platform, despite a year-long campaign to have it removed.
“Despite the harrowing violations, the RSF maintains a presence on social media, most notably Facebook, which has been the main platform for this militia to spread its messages …”
A new law ushered through parliament in Turkey forces social networks with over 2 million users in the country to appoint a designated representative. Critics are calling it ‘worse than the NetzDG'.
Turkey's New Internet Law Is the Worst Version of Germany's NetzDG Yet | Electronic Frontier Foundation
For years, free speech and press freedoms have been under attack in Turkey.
BuzzFeed has a good story about policy execs at Facebook, including its VP of Global Public Policy, intervening in fact-checks on behalf of right-wing publishers. Grim but perhaps not surprising, given Republicans' constant bleating about bias over the last two years.
Facebook Employees Ask Zuckerberg What Would Happen If Trump Used Their Platform To Dispute Election Results
Facebook employees collected evidence showing the company is giving right-wing pages preferential treatment when it comes to misinformation. And they’re worried about how the company will handle the p…
Also Facebook-related, 20 US attorneys general have written to Mark Zuckerberg with seven recommendations to prevent hate and abuse on the platform, including improved tools to block and report users.
The call from 20 state officials adds to the rising pressure facing Mark Zuckerberg and his company.
The Initiative on Intermediaries and Information, a Yale/Wikimedia research project, has launched a series of articles with ‘academics, civil society activists and journalists whose work lies on the sharp edge of content decisions’. Looking forward to reading these.
Moderate globally, impact locally: A series on content moderation in the Global South · Global Voices
“Even as the platforms have grown and spread around the world, the center of gravity of these debates continues to revolve around D.C. and San Francisco.”
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.