
📌 How to move on from Reddit’s moderator revolt

The week in content moderation - edition #69

Hello and a warm welcome to new Everything in Moderation subscribers from New York University and Blick. Longstanding recipients of EiM, thanks for your ongoing support.

Following the success of last month’s Q&A with community editor Hajra Rahim, I reached out to another expert to get his take on one of the big issues of the moment: Reddit’s recent mod revolt. It's a longer-than-usual update so grab whatever drink you normally take at this time on a Friday before you jump in.

Stay safe and thanks for reading — BW

PS: Wanna help me make this newsletter better and more inclusive? I’m looking to conduct paid interviews with folks (particularly women and people of colour) and have extended the days/times I can do them. Does one work for you? Book a slot or hit reply.


💔 "It's not up to us mods to fix Reddit"

If Donald Trump hadn’t signed an executive order that threw into question the fundamental way that the web works (see EiM #66), the biggest content policy story of the last few weeks wouldn’t have been the President. It would have been Reddit.

The day before Trump mounted his legislative attack on Section 230, David Pierce at Protocol published a well-researched piece about an almighty moderation ruckus going on at the platform. David noted how volunteer mods were:

"receiving death threats because of a misleading list (of subreddits and moderators which was circulated on the platform) and for simply trying to do their part to make Reddit better.”

In the weeks since, the company has come under further scrutiny for its moderation practices and what its own mods call ‘the normalisation of bigotry, hate and violence’.

Most notably, over 800 subreddits signed an open letter to CEO Steve Huffman asking him to 'stand up to white supremacy on this site’ by banning hate-based communities and users and hiring more community managers. The tools at their disposal (EiM #54), they say, are not enough to stem the tide of bile.

Huffman, to his credit, made clear that he intends 'to take responsibility for the history of our policies over the years’ but not before former CEO Ellen Pao could criticise his approach. Likewise, Casey Newton — The Verge reporter who is a thorn in the side of so many companies with ineffective moderation practices — suggested that volunteer-led moderation like that seen at Reddit was 'empowering racists'.

I wanted to understand a bit more about the specific challenges Reddit faces so I asked Rob Allam, aka Gallowboob — who is quoted in the Protocol piece — to explain what it is like for him as a mod and as a longtime user.

Rob and some fan art of Rob as a foot

Rob is in a unique position to comment in that he has 32,000 followers, a ton of karma and has been a mod of dozens of subreddits since joining in 2014. He has been outspoken about the challenges of a volunteer moderation system, including in a recent interview on Know Your Meme. We met once or twice when I lived in London and share a sense of the challenge facing dominant digital platforms when it comes to content moderation.

Below is a lightly edited back-and-forth that Rob and I had via email this week.

Q As a user, you have been targeted by others on Reddit many times — what's the worst situation you've been on the receiving end of?

I've had the displeasure of being the biggest account in Reddit history while also becoming a very active moderator of a lot of booming communities; all of this while being one of the few users on the platform not hiding behind anonymity. I've never had anything to hide and have always been very vocal and transparent about what I do and don't do on Reddit in past interviews, podcasts, commentaries and more. This doesn't matter to hateful conspiracy-fuelled anti-Reddit hate subs; they see me as an easy target and they don't hold back. Instead of giving them a "best of" award for the worst thing they have done, I'd rather just address the issue at hand — misinformation and harassment are running rampant on one of the biggest and most unique social platforms today. I've lately decided to step back and to launch a mental health effort on the platform, funded by myself because I want to and because I can, but most importantly because it is much needed.

Q You moderate dozens of subreddits. Can you give EiM subscribers a sense of the type of work you do as a mod daily?

I recently shared two grabs (above) that show my own current moderation spectrum as well as the top "power mod" accounts who moderate the most users on Reddit. I've left a lot of subs lately for all the reasons mentioned above and in this (Protocol) article, which reflects why I did it. You can see I'm currently only moderating 5-10 active subs compared to the hundreds I previously moderated. The site is not secure enough to do volunteer free labour at the expense of one's own peace of mind and physical security. It just doesn't add up anymore. As a moderator on Reddit, what you do goes unseen and under-appreciated by 110% of the audience. You have to clean up spam, NSFW content and borderline criminal instances across a varied spectrum of horrors so that no-one else is exposed to them, from our own audience to Reddit's general one, to advertisers coming on Reddit to buy native ads or take part in the conversations.

Moderating a community should be about engaging your audience and creating events and instances around your audience's interests. It should be a manager's role rather than a janitor's. But, as it stands, we have to clean up mess for the most part and we rarely ever get to engage our audience directly or positively. Mod-hate on Reddit is at a record high, the most ardent it has been in years, due to misinformation and astroturfing running rampant on the platform. Moderation on Reddit needs to be reassessed, to say the least.

Q You were recently referenced in an article about Reddit's PowerMods - do you think power is concentrated in too few people's hands?

Moderation on Reddit can easily be abused, as it already has been in the past; for example, the top mods of r/natureisfuckinglit trying to manipulate the Reddit algorithm to their benefit and get paid for certain promotions on their now-massive subreddit. They were replaced by new mods and their accounts were suspended but we don't know exactly what went down since Reddit never addresses these actions beyond a static "thank you for reporting" message. One fix could include limiting the number of subs/users a single mod can moderate since the general audience is fearful of a monopoly of communities by a few users who could potentially be biased in their moderation against a socio-political opposition. Honestly speaking, it's not up to us mods to fix Reddit; we're supposed to be volunteers who are here in good faith because we love the platform and enjoy putting personal time into these communities because we care. As it stands today, not a single co-mod can say it isn't affecting their peace of mind and, while they might all have varied opinions as to what needs to be done to "fix" Reddit, that tells you that the current system could crumble at any time. Solutions need to be provided by Reddit's administration and we will align with those accordingly or step down if we disagree with what they are doing (or failing to do).

Q Is it realistic in the long-term that moderating communities — on Reddit, Discord, Slack, wherever — is done by volunteers in their spare time?

In essence, volunteer moderation is free labour and I have a hard time arguing that the current volunteer system on Reddit isn't beyond broken. There are massive pros to volunteer moderation; for instance, it cannot (in theory) be as biased as a paid third party or internal team could be when it comes down to enforcing your ToS (terms of service) and sub-specific rules. Take Facebook for example: a lot of their bias is documented and all of it comes from paid moderation that can be asked to enforce or ignore certain instances depending on whether or not they align with the company's supposed vision. Reddit is a curation platform and it's good to have volunteer moderation on one of the most engaged platforms worldwide, but the current way it's set up leaves both the mods and the community exposed to a lot of cracks. All it takes is the administration revisiting it and updating it according to today's site-wide struggles.

Q If you had your say, which of the steps called for in last week's letter would you prioritise? And are there any that you think have been missed?

I think the subreddits did the right thing and included almost everything worth addressing. As you'd expect, a lot of hateful anti-Reddit communities strongly oppose this letter and the solidarity behind all these communities and their mod teams. What it has done is shine a light on their borderline criminal activity in the hope of positive change. The misinformation and harassment directed at the mods of r/againsthatesubreddits and almost all the subs on that list was abysmal and not surprising. It's still ongoing today.

Q Finally, what would you do if you met one of your 'trolls' in real life? What would you say to them?

I'd suggest they remember the human before reaching out to me or anyone else online in bad faith and with ill wishes. Do your research. Put facts over the outrage. Don't use a lack of hard evidence as an incentive to further perpetuate misinformation and slander. An angry comment or post on Reddit that lacks facts is neither proof nor factual. This applies to Reddit and to people's overall media consumption; don't be gullible and part of the problem. The issue is global too; we need better education on how to consume content and fact-check sources before using your voice to perpetuate false, misleading and damaging information. It does us a disservice as a human collective today. I've addressed everything multiple times and will continue to do so. I have nothing to hide and a lot to share about the pros and cons of my personal experience on and off Reddit.

A big thanks to Rob for taking the time to answer my questions. Who else would you like me to do a Q&A with? Let me know.

🎙 A (voice) note of caution

How much abuse can you squeeze into 140 seconds? That’s what I thought when Twitter this week announced that it was launching audio tweets on iOS.

Don't worry, I’m not planning to launch a public verbal tirade against anyone I know. But I do know that audio (just like video) is incredibly difficult to moderate effectively and consistently, as Sameer points out.

I give it less than a month before someone includes hate speech, death threats or far-right dog whistles in an otherwise innocuous audio clip. Good going Twitter.

👀 Not forgetting...

This is about the worst that it can get in terms of automated moderation. An Australian Facebook user was banned for posting a picture that supposedly included nudity (it didn’t), only for the resulting Guardian story about the banning to also be removed because it included the same picture. Jeez.

Not just nipples: how Facebook's AI struggles to detect misinformation | Facebook | The Guardian

A new poll from the Knight Foundation and Gallup has found that Americans (unsurprisingly) don’t have faith in social media companies’ moderation practices but that they do (surprisingly, I'd say) like the idea of Oversight Boards.

8 in 10 Americans Don’t Trust Platforms to Moderate Content – Adweek

A Knight Foundation-Gallup poll shows partisan splits on who has faith in social media companies to self-police their platforms.

He would say this as the CEO of an anti-toxicity company but Zohar Levkovitz argues that artificial intelligence that takes context into account is the best way to limit moderators' and users' exposure to harmful content.

Content moderators alone can't clean up our toxic internet

Tech platforms must invest in more automated solutions that can analyze the full context of online conversations—not just detect keywords.

If you enjoyed last week’s EiM (#68) on the end of outsourced moderation, you may enjoy the webinar that took place yesterday with report author Paul M Barrett and two other experts.

WEBINAR: Who Moderates the Social Media Giants?

An hour of timely discussion and audience Q&A on one of the most pressing issues facing the social media industry and its billions of users: How are Facebook...

One to bookmark: the Brookings Institution has produced a handy guide of online content moderation lessons from outside the US. For a long time, I wasn’t sure many Americans acknowledged that moderation was a global issue so I'm glad to see this.

Online content moderation lessons from outside the US

In the wake of America's debates on content moderation, David Morar and Bruna Santos analyze other countries' actions to suggest a way forward.


Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.