📌 Policing content online, Turkish fines and 4chan’s moderator-in-chief
Welcome to Everything in Moderation, your weekly newsletter about content moderation on the web, democratically (s)elected by me, Ben Whitelaw. A special hello to a dozen new subscribers — do hit reply and tell me about yourself.
Today’s edition is intentionally light on US politics because, clearly, a lot still hangs in the balance. I'll do a round-up of all the major election moderation talking points next week.
In the meantime, you can revisit this January edition of EiM about how Joe Biden doesn’t understand moderation (#48) if you want a sense of what the next four years may hold.
Onto this week's links — BW
📜 Policies - company guidelines and speech regulation
It dropped just after I hit schedule on last week’s newsletter, but this discussion paper from think tank Demos — also titled 'Everything in Moderation' — looks at the role of content moderation in a healthy online environment.
The paper draws heavily on policing and social order (an area Demos has covered extensively in the past) and doesn’t mince its words: it calls platform governance 'authoritarian at worst and at best a subjugation of public values before commercial interest', designed 'to preserve order through closely-controlled, unaccountable and top-down policing of content and behaviour'.
To its credit, it goes on to outline a raft of solutions to help get us out of the mess that we're in: transparent and trackable cases, making admins more accountable, incentivising moderators accordingly and more. Read it in full if you can.
Some news from Turkey, where the government has fined six social media platforms — including Facebook and YouTube — 10 million lira ($1.18m) each for failing to comply with a new social media law that requires platforms to appoint an in-country representative and remove “offensive” content within 48 hours. If they continue to flout the law, platforms could lose the right to make money via advertising and have their bandwidth slashed by up to 90%.
💡 Products - features and functionality
This came out a few weeks ago but popped into my timeline just yesterday: PlayStation 5 users will be able to report verbal harassment via 40-second clips when the console launches in November. The news was spotted in a PS4 system update (PS4 users will be able to communicate with owners of the newer console) and hastily explained by the company’s head of Global Consumer Experience in a blog post. Audio, as I mentioned in EiM #84, is a new frontier for moderation and a difficult one to get right (see Clubhouse, Twitter). Will be interesting to see how this pans out.
💬 Platforms - dominant digital platforms
You won’t be surprised to know that I’m not an avid listener of The Joe Rogan Experience, so I missed Alex Jones’ recent appearance and his inevitable lies about masks and Covid-19 vaccines. His appearance has reignited the debate about Spotify's duty of care and whether the company is crossing over from 'infrastructure towards being regarded as a publisher'. This Vulture piece has more on the whole episode.
Another foreseeable development: Zoom announced that it is adding staff to its trust and safety team to better deal with content issues. It follows an incident in which it blocked a university webinar featuring a member of the Popular Front for the Liberation of Palestine, which the U.S. government has designated a terrorist organisation.
👥 People - those shaping the future of content moderation
Vice has an interesting profile of an anonymous 4chan moderator — a military veteran with an interest in guns and Warhammer — who became the manager of the site’s ‘janitors’ (the term given to mods) in 2016. It’s notable because his appointment, the piece claims, led to a 40% spike in racist and violent language and helped entrench 4chan’s reputation as a cesspit of hate and bigotry. If nothing else, it’s a reminder of the power of moderators to shape a platform in their own image, when given the chance.
🐦 Tweets of note
- "They are highly confusing” - Vice reporter Jason Koebler got hold of Facebook's internal content moderation guidelines on voter suppression and it's a spaghetti mess.
- "Who is doing smart work on that?” - Stanford Platform regulation director Daphne Keller asks who is addressing the privacy and anti-competition aspect of content moderation.
- "Would love to be a fly on the wall” - CNBC tech correspondent Sam Shead wonders what it would be like to be in room when Twitter decide to place a warning on one of Donald Trump's tweets.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.