4 min read

📌 Content advisory councils, Facebook's legal nemesis and bans on banter

The week in content moderation - edition #102

Welcome to Everything in Moderation, your weekly newsletter about content moderation by me, Ben Whitelaw.

Today's edition is going to 400 subscribers for the first time, including new folks from Snap, Bind and N-Square. I won't celebrate too much as the threat of unsubscribing always looms large but it's a notable mini-milestone nonetheless. Thank you all for signing up, reading and supporting every week.

This week has a lot of platform updates, none of which I felt I could leave out. Let's jump in — BW


📜 Policies - regulation, legislation and new guidelines

If advisory councils are a practical way of filling competency gaps in a company, TikTok has many deficiencies when it comes to moderation. This week, the video-sharing app launched another council, this time focused on Europe, as part of a new effort to "provide subject matter expertise as they advise on our content moderation policies and practices".

It comes after the announcement in March 2020 of a US-focused TikTok council and also the launch of Twitch's Advisory Council in May last year. Twitter, you may remember, set one up all the way back in 2016.

With regulation happening all over the globe, the need for a human rights-based approach to content governance remains important. Sustainable business consultancy BSR has produced a report outlining four areas where this will play out and the practical approaches to content governance that can be taken by companies. A good primer on the topic.

💡 Products - features, functionality and value proposition

"Our moderators face an impossible task and the site’s staff needs to do more to support them": So says the managing editor of Blueshirt Banter, the SB Nation fansite for the New York Rangers ice hockey team, in a blog post this week. It, like many sites right now, is adopting a harder line with users and will be dishing out more bans. Good luck to them.

As community becomes a buzzword in tech VC (again), letting people decide what they see and who they interact with becomes ever-more important for products. So I was interested to read this dog walker's Medium post about the feature that led her to use Wag, an on-demand dog walking app, over Rover, its longstanding competitor: the ability to block time-wasting dog owners.

💬 Platforms - about the dominant digital networks

I mentioned last week (EiM #101) that it was transparency report season and I wasn't joking: two more were published this week.

  • Twitch released its first transparency report, detailing an increase in user reports and channel bans that it says is in line with the growth it has seen during Covid-19. In an interview with Wired, the new head of trust and safety Angela Hession explained the investments the company had made over the last year but did not say how many mods it employed.
  • Yelp had 18m reviews in 2020, over 700k of which were removed for violating its policies, and closed 100,000+ accounts for fraudulent activity, according to its new transparency report. That might sound like a lot but it compares favourably with Google Maps and Tripadvisor, it explains.

The Oversight Board, the so-called independent body funded by Facebook, is frustrated by its current scope, according to one of its current members. Alan Rusbridger, the former editor of The Guardian, told a UK House of Lords committee that it was likely to ask to see Facebook's algorithm but "we have to get our feet under the table first".

Meanwhile, the Board announced two new cases: one about support for Alexei Navalny and the other about a Turkish version of the 'two buttons' meme.

Twitter is countering vaccine misinformation by expanding the use of its five-strike system to help "educate people on why certain content breaks our rules". In a blog post, it explained how it had challenged 11.5 million accounts globally since bringing in the strike system in early 2020.

Reddit has no plans to remove more porn from the site, according to its CEO Steve Huffman. In an interview with Axios, he said that while some porn is exploitative, "there’s another aspect that’s empowering". Involuntary pornography made up 6.7% of the content removed as part of violations in 2020 (EiM #101).

👥 People - those shaping the future of content moderation

I've followed Foxglove, the law firm founded by Cori Crider, Martha Dark and Rosa Curling, since its founding back in 2019. It has become best known for its work with former Facebook moderators (EiM #89) but, as Cori explains in this very good interview published on Sunday, it is "intensely focused on building tech-worker power" more generally.

In particular, I'm pleased to see her insist that moderators, as a class of workers, need to follow Google and Amazon employees in unionising. I wrote about how I felt it was inevitable back in March 2019 (EiM #18) and am just glad Cori and the Foxglove team are around now to take up the cause.

🐦 Tweets of note

  • 'My aim is to continue being a critical friend' - Seyi Akiwowo, Glitch's CEO, on becoming part of the new TikTok European Trust and Safety Advisory Council.
  • 'I love sea shanties and dance memes, but TikTok is also home to something much worse' - Cameron Hickey, Program Director for Algorithmic Transparency at the National Conference on Citizenship, with a thread on how TikTok has escaped scrutiny.
  • 'It's super important for me to read the daily list of usernames as a form of content moderation' - DialUp founder Danielle Baskin doing something that Zuckerberg couldn't and probably would never do.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.