📌 Suing racist users, WSJ's Facebook Files and a move for Beauchere
Welcome to Everything in Moderation, your weekly roundup about content moderation, curated and produced by me, Ben Whitelaw. Hello to new subscribers from TaskUs, The Centre for Internet and Society and The Royal Society.
Today's newsletter has a fresh new look and also a new home — everythinginmoderation.co — where you can read every edition of EiM going back to 2018. Thanks to the team at Ghost for helping me get started and to all of you for subscribing, supporting and sharing over the last few years.
There is lots more to come from EiM over the coming months. For now, here's this week's jam-packed edition — BW
📜 Policies - emerging speech regulation and legislation
The UK government's reshuffle means Nadine Dorries — the new Department for Digital, Culture, Media and Sport secretary of state and the fifth (or maybe sixth) person tasked with birthing the UK's Online Safety Bill — has a lot in her in-tray:
- A new poll of 2,000 British adults, carried out on behalf of Compassion in Politics, FairVote UK, Glitch and Clean Up The Internet, found that freedom from abuse should be a greater priority than the ability to say whatever you want online.
- "The draft Online Safety Bill gives too many powers to the Secretary of State" according to the Carnegie Trust, in the first of a series of blogs on the bill.
It's been a busy week for the Oversight Board, the independent-but-Facebook-funded body that oversees Mark Zuckerberg and his content moderation decisions:
- A case judgement published on Tuesday accused Facebook of censoring Palestinian content as a result of pressure from the Israeli government. Emma Llansó from the Center for Democracy and Technology posted a good thread on Israel's Internet Referral Units and the blurring of the line between government takedown requests and terms of service enforcement.
- It announced three new cases, including a user appeal to restore a picture of ayahuasca on Instagram that was removed under its Community Standard on Regulated Goods. Gap year travellers will surely be taking note.
- It also raised concerns about the management of high-profile accounts following revelations published as part of this week's much talked about Facebook Files (see Platforms).
Finally, The Markup has written a strong piece looking at Facebook's "pervasive" attempt to convince the world it is pro-internet regulation, drawing parallels with other areas of tech, such as facial recognition, and questioning Big Blue's motives. My read of the week.
💡 Products - the features and functionality shaping speech
I really don't want this section to become a graveyard for online spaces no longer with us (see last week's EiM #127) but today we must sound the death knell for another once-thriving community.
British video game website Eurogamer closed its forum this week, citing changes to "the way people communicate" online and the rise of Discord, Twitter and Twitch. The closure came just a few months after a new code of conduct and a new moderator were introduced.
But, as this heartfelt piece by Luke Plunkett for Kotaku explains, forums are:
more deliberate, more considered, and while they’re far from perfect—I’m sure you can post a billion examples of people being neither deliberate nor considered on forums—the point is that they’re more permanent.
Gone but not forgotten.
💬 Platforms - efforts to enforce company guidelines
You can't have gone about your online dealings this week without hearing about the Wall Street Journal's investigative series on Facebook's failures to address harms caused by its platform. The Facebook Files sounds like a Mulder and Scully reboot but the reporting is much darker than anything the special agents had to deal with:
- 56 million people viewed "revenge porn" posted by Brazilian footballer Neymar because his account was protected by XCheck, Facebook's programme for VIP users.
- A Polish political party posted more negative and extreme content as a result of Facebook's 2019 algorithm change, creating the perfect environment for hate speech.
- Mexican drug cartels flaunted guns and were known to recruit hitmen on Facebook but were not banned from the site.
The series has been accompanied by a flood of reaction on Twitter, none more interesting than that of Samidh Chakrabarti, Facebook's former Civic Integrity lead, who criticised the company's "launch early, fix later" ethos and posited some ideas about what a more transparent VIP user process could look like. Proof, if nothing else, that many good people are trying their damnedest to make platforms the best places they can be.
Twitch has upped the ante in the battle against "hate raids" on the streaming platform by suing two users it alleges violated its terms of service. I won't name them because that's probably what they want, but the lawsuit notes that they are based in the Netherlands and Austria, with one reportedly responsible for 3,000 bots associated with the hate raids.
👥 People - folks changing the future of moderation
Snap, the parent company of Snapchat, doesn't appear too often in EiM because it deals with contentious moderation issues pretty well. That's in part due to a host of very experienced Trust and Safety staff, some of whom subscribe to EiM.
That team got even stronger this week as Snap announced the hire of Jacqueline Beauchere as Global Head of Platform Safety from Microsoft, where she had worked for 21 years. Vastly experienced, Beauchere was latterly Chief Online Safety Officer and represented Microsoft on, among other things, the Global Internet Forum to Counter Terrorism (GIFCT).
Beauchere will report to Jennifer Stout, Snap's VP of global policy, who told Reuters this week that the company was working on revamping its in-app reporting tools to give "more detailed updates" and finding ways to prevent children under 13 from using the app. Seems like a very good hire.
🐦 Tweets of note
- "All solutions come at a price for social media sites" - The BBC's Marianna Spring highlights the downsides of social media verification in this piece for BBC Scotland.
- "Community moderation is a whole job" - Sydette Harry on the trauma of dealing with racist abuse as a volunteer mod.
- "If you struggle to moderate trolls, take care lest you become a troll" - something to ponder over the weekend from Alex Feerst.