I want to be up front with you all: This is a bumper edition of Everything in Moderation.
For new subscribers from the Reuters Institute for the Study of Journalism, the Foundation for European Progressive Studies and Sky, I need to be clear that the newsletter isn’t normally this long. But so much content moderation news happened this week — including more fallout from the Facebook Oversight Board announcement that I covered in last week’s newsletter — that it somehow felt justified. I hope you don’t mind.
I’m also delighted to feature a Q&A with The Telegraph's senior community editor, Hajra Rahim, on a topic that’s close to my heart (she's also a longtime EiM subscriber so that's cool too). Let me know what you think about the change in format.
Stay safe and thanks for reading — BW
🎁 ‘Commenting is a gift, not a right’
There are several thankless jobs in journalism, and one of them is getting journalists to read and engage with the comments under their articles.
I spent almost eight years trying to do so at The Times and The Sunday Times as part of my role engaging readers in its journalism. I’ve written about my battle with users like Eddie Storey (EiM #7) and the ethical considerations of bozoing comments (EiM #28). But getting journalists to respond to (paying) users below the line was by far the most taxing part of the job.
Why? Because it involved convincing not one but two groups of people (undervalued, often disgruntled, sometimes anonymous commenters and undervalued, often disgruntled journalists) to do things differently. It was slow and arduous work and I owe a lot to the team of moderators whose stewardship gradually made the comment section less of a shitshow.
Other people have the same scars. Tim Burrowes, content director at Mumbrella, a site about Australia's media and marketing sector, wrote an interesting blow-by-blow account earlier this year about its response to being beset by drive-by users. It’s worth a read in its entirety but it amounts to this:
As we began to enforce our own guidelines more rigorously and visibly, our readers began to notice. The trolls moved away, and the quality of the comments began to improve.
I was very interested, then, when UK newspaper The Telegraph recently announced that it was changing the way that commenting worked on its site.
Now, only paying subscribers are able to write comments (previously, anyone who registered with an email address could too), while a new set of community guidelines sets out the ‘timely, constructive and respectful’ comments that the Telegraph community team expects from its readers. A new team of moderators will also play a more hands-on role in shaping daily conversations.
I reached out to Beth Ashton, head of audience and subscriptions at The Telegraph, and Hajra Rahim, senior community editor, to ask them more about why they made the changes and how they think it will affect their work. Below is Hajra’s edited response to a few questions I sent over.
Q: What challenges did you face with the Telegraph community to make you want to change who could comment?
What you might usually see in comments sections - trolling, abuse of other users, general off-topic conversations - but also just a generally negative tone in a lot of comments sections across the site. There were also daily calls from regular commenters asking people to stay on topic and not to enter into arguments or abuse others.
This, in turn, made it the type of environment our journalists were less willing to enter. This is a big hindrance to creating a proper sense of community because we really want our writers to be a part of it.
Q: You mentioned a 'friendly new moderation team' — who are they and what will their role be?
We have a team of three moderators (I won't name them!) and their role will first and foremost be to moderate comments flagged by our users. But importantly they will be taking an active role in the comments section. They will go into the comments section to thank readers for leaving good comments, get involved in the discussion themselves and generally try and steer debate so that it remains on-topic and respectful.
They will also help the Community team to identify topic areas that get our readers talking and suggest areas where we might be able to develop further conversation with that audience. That could be through an onsite Q&A, or getting an expert or journalist into the comments section.
Q: What tools are used to maintain a healthy debate on the site? (software as well as filtering or any other tips + tricks)
At the moment it is mostly us doing it manually as well as using our moderating software that shows us flagged comments, users etc. By manually, I mean we add calls to action to our stories where we ask readers a question based on the piece and tell them to tell us their answer or what they think in the comments section of that piece.
This, for the most part, does tend to steer the debate in the right direction.
Q: What is your biggest frustration about Telegraph commenters?
Some of them view commenting as a right, not a gift. They can be a little troll-like on articles about fashion or beauty, where some people are genuinely there to leave a nice comment, and it's frustrating because those same commenters leave great comments on other articles.
What I will say is that, for the most part, these people are few and far between. Our most loyal commenters are very quick to call these types of commenters out or flag them because they are genuinely there to have discussions about what they are reading.
Q: One thing I always struggled with at The Times was abuse directed at journalists. What's the worst example you've come across and how did you resolve it?
Some commenters truly can't handle a woman having her say on any issue that reflects badly on men or anything relating to mental health. They often leave comments calling these women "snowflakes" and claiming that their opinions count for less, or can't be true, because they are women.
To resolve these issues we have a few approaches. For outright abuse at our journalists, it is a ban on the spot. We will also send out warning emails to commenters who are perhaps being trolls but not necessarily abusive. This makes them aware that we are watching their behaviour and that if we see it again we will administer a ban on their account. Most of the time this is well received.
Lastly, and most recently, we have been adding calls to action to stories that might normally attract negativity, keeping comments sections open on them for a fixed amount of time while keeping a close eye. This allows our commenters to start and maintain a civil debate, but at the first sign of nonsense we will turn them off. Again, this feeds into the idea that our comment sections are a gift, not a right, and if you want to comment on topics you might usually disagree with, you have to do so in a way that is in line with our guidelines.
It has been a mixed bag so far but we hope that with more consistency this will be an approach that works well for us moving forward.
Thanks to Hajra for taking the time to answer my questions. Who else would you like to hear from? Let me know.
💸 Health vs wealth
The big news of the week: Facebook has agreed to pay 11,250 current and former moderators a minimum of $1,000 each to compensate them for mental health issues sustained on the job. As many as half will be entitled to more, according to The Verge.
Safe to say, this is a significant settlement with global ramifications that I plan to get into more next week. For now, I'll just say that I’m with Annemarie on this one.
🚢 Man Overboard?
There can be such a thing as too much news about Facebook's Oversight Board but, at the same time, this group of people will likely change the content governance of social platforms irrevocably (in what way, we don't quite know yet).
So, here are a few choice reads about its announcement from the last seven days:
- Deutsche Welle report that Board member and Nobel Peace Prize winner Tawakkol Karman has been targeted by Arab media for her alleged links to the Muslim Brotherhood via a Yemeni party she used to have ties with.
- Respected media columnist Margaret Sullivan at the Washington Post pours cold water on the concept of the Board: "Anyone who has ever served on a committee, especially a large one or one populated with big egos, knows that it’s not an ideal way to get things done."
- Mathew Ingram at the Columbia Journalism Review has produced some typically forensic coverage of the Board this week. Online discussions with Dave Kaye (UN special rapporteur for freedom of expression), Kate Klonick (St Johns Law) and Ellen Goodman (Rutgers Institute for Information Policy and Law) are worth re-reading but Mathew's wrap piece rounds up the best bits if you’re short of time.
- Nighat Dad, one of the Oversight Board Members, writes in DAWN that the representation of the Global South could be 'a precedent and a model for other social media companies to follow.'
One hour to remove paedophile and terrorism-related content, 24 hours to take down other ‘manifestly illicit’ posts: that's what France introduced on Wednesday as part of new legislation to limit online hate.
Social networks and other online content providers will have to remove paedophile and terrorism-related content from their platforms within the hour or face a fine of up to 4% of their global revenue under a French law voted in on Wednesday.
Twitch has included four streamers in its new eight-person panel to improve safety on the streaming site as well as shape product decisions. I’m all for involving users — let's see how it pans out.
Amazon Inc's video game live-streaming platform Twitch is forming an advisory council of experienced users, online safety experts and anti-bullying advocates to help improve safety on the site, Twitch said in a blog post on Thursday.
TikTok continues to do nothing when racist content is posted, even by prominent users *sigh*
Teens won't stop posting racist videos and challenges on TikTok. Experts explain why the problem continues.
While people have stereotyped Gen-Z as more 'woke,' experts say that the generation hasn't escaped racism and may be more prone to make racist posts.
Instagram announced pinned comments, mention filtering and multi-comment deletion to help curb online abuse. Proof that everyone is a moderator now.
Today, we’re sharing the fifth edition of our Community Standards Enforcement report which tracks our progress to keep Facebook and Instagram safe. In..
UK-based Alan Turing Institute has released a call for papers for its Workshop on Online Harms and Abuse.
If you got this far, well done. Please enjoy this gem from the NYT comments section as the smallest of rewards. And have a great weekend!
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.