Tips for moderating subreddits, inside ByteDance and being anti-algorithm
Welcome to Everything in Moderation, your weekly newsletter about content moderation by me, Ben Whitelaw.
Welcome to new subscribers from Kinzen, Princeton University, AllTrails, the Hebrew University of Jerusalem and a host of others. If you're comfortable introducing yourself, hit reply and say hello. And if someone forwarded you this edition, you can sign up yourself here.
Without further ado, here's your review of this week's moderation news - BW
📌 Policies - company guidelines and speech regulation
Why does moderation need transparency? In some ways, it's an argument that has been won (see: platform transparency reports, government committee appearances) and yet there's a lot more to consider. So this piece by Mark MacCarthy, a nonresident senior fellow at the Brookings Institution in Washington, is a welcome addition to work already published on the subject.
Content moderation is often seen through a US lens, so it's good to see IFEX convene eight experts from around the world in this interactive about public spaces, regulation and the need for transparent processes grounded in human rights.
Sheryl Sandberg, Facebook's chief operating officer, gave a one-line response ('I'm fine with this') after Turkey requested that the platform censor the posts of the People's Protection Units, a Kurdish militia group. Her email, published this week by ProPublica, demonstrates the dangerous ease with which governments can use court orders to insist on takedowns and Facebook's unwillingness to challenge those orders.
💡 Products - features and functionality
Algorithms that 'have a bias towards the sensational and outrageous' are to blame for 'amplifying harmful-but-sensational views and sentiments far out of proportion to their actual prevalence', according to this punchy op-ed by Reset Tech's policy advisor Poppy Woods.
She goes on to suggest that, rather than focusing on content alone, the UK's Online Harms Bill should focus on "(re)designing these curation systems so that harmful material stops being algorithmically and automatically spread at scale". Which is exactly why I introduced a section in the newsletter for all matters product (credit to longtime subscriber Stephanie for that idea).
💬 Platforms - dominant digital platforms
A ByteDance employee who created moderation tools for its Trust and Safety team has explained the censorship that took place at the company. In a fascinating read published by tech site Protocol, the worker, writing under a pseudonym, described feeling 'like I was a tiny cog in a vast, evil machine'. ByteDance owns TikTok as well as Douyin, its Chinese video app.
It's transparency report season for the platforms (like the Oscars but less glamorous). After Reddit shared its 2020 update last week (EiM #100), TikTok released its six-month report covering the second half of last year. Axios covers the top lines (1% of overall videos removed, videos removed for hate speech doubled, child safety an ongoing problem) but there's plenty more when you dig into the report: almost 3m videos, for example, were reinstated after appeal, suggesting TikTok's AI systems are seriously overzealous.
Facebook this week banned Myanmar's military from the platform, over three weeks after a military coup in the country and two and a half years after a report showed the military had used Facebook to systematically target Rohingya Muslims. The Tatmadaw, as the military is known, was also banned from Instagram.
Marc Beaulac, the founder and lead moderator of Am I the Asshole?, the famous subreddit with 2.6m members, is profiled in this LA Times piece along with 10 other Reddit mods. It's a great piece with lots of interesting nuggets (who knew usernames containing "ASDF" or "JKL" hinted at a spam account? Not me) and even includes EiM friend and one-time interviewee Rob Allam.
👥 People - those shaping the future of content moderation
Carolina Are knows about content moderation (she's doing a PhD on it and has published papers on the topic), but that hasn't stopped her feeling the effects of TikTok's abusive users and poor reporting system.
Her account was suspended after a video of her pole dancing (she runs classes) attracted over 3m views, leading hundreds of users to troll and report her. It was only reinstated this week when a journalist covering her story for Input Mag got in touch with TikTok.
If someone with as much knowledge about moderation as Carolina (who you should all follow) can still suffer at the hands of a platform's opaque system, what hope does a regular person have?
🐦 Tweets of note
- 'Here are some of the questions we'll be discussing' - McGill University PhD student Helen Hayes gives me some hope about the future of tech policy.
- 'Now they're escaping that corner, becoming self-evident to people from traditional policy backgrounds.' - Cory Doctorow, author and activist, reflects on the work of Mike Masnick and Daphne Keller in a long, wide-ranging thread-cum-blogpost.
- 'Supporting transgender Americans is NOT hate speech' - Congresswoman Marie Newman on being on the sharp (read: dumb) end of Facebook's AI detection tools.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.