When competition laws meet content moderation
Hello and welcome to the 75th edition of Everything in Moderation, a weekly newsletter about moderation on the web and the policies, people and platforms that make it happen.
A note of thanks to those of you who have shared EiM with friends or via social media this week. I've had a flurry of new subscribers (iirc, it's the most in a week since I started two years ago), so hello to folks from Huffington Post, the Premier League, Mangorolla, Banca Etica and elsewhere. I'm glad to have you on the EiM team – say hi if you get a chance or put in the time for a chat*.
In all honesty, I didn't want to write about this week's US big tech antitrust hearing. But so much of the policy and guidelines that we all abide by online starts with the dominant digital platforms, so I kinda couldn't not. If it's not your bag, don't worry – there's a bunch of other links in today's edition.
Stay safe and thanks for reading – BW
🇺🇸 What I took from Wednesday's US antitrust hearing
'Four tech CEOs log on to a video call' could be the start of a bad pandemic gag. But it's what happened this week as the US House Judiciary Committee invited Alphabet, Amazon, Apple and Facebook to testify at the Online Platforms and Market Power antitrust hearing.
You may have read the top lines already (this Al Jazeera summary is good if you haven't) but, as someone interested in speech policy and moderation, I found three things interesting and thought I'd share:
1/ The conflation between competition and free speech
I will admit to being a bit confused when lawmakers started using the hearing to quiz CEOs on shadow banning and political bias. Surely focusing on big tech's use of data, its approach to privacy and its propensity to copy features from smaller sites would be more appropriate? However, as Jameel Jaffer – the director of the Knight First Amendment Institute – helpfully explains in this thread, it's not a coincidence that antitrust and free speech issues overlap.
I'm by no means a legal expert, so please correct me if I'm wide of the mark, but there also seems to be an aspect of the antitrust subcommittee's remit that looks after what it calls 'consumer welfare' – that is, issues beyond price and choice. That's where, for better or for worse, diversity of viewpoint becomes a consideration for lawmakers.
2/ Conservative bias will not go away
Talking of diversity of viewpoint, it didn't take long for lawmakers to ask about 'conservative bias' (just 11 minutes, by this Washington Post reporter's watch), even though there is no current evidence of such bias. It is a long-held Republican tactic that, as Mashable points out, is unlikely to go away whether or not Trump loses in November. Expect it to keep cropping up and for media, academia and NGOs to have to bat it back down, whack-a-mole style.
3/ Mistakes (and their coverage) undermine the process
Although US lawmakers asked some good questions, there were also the predictable and inevitable gaffes. The most notable was Congressman Jim Sensenbrenner asking Mark Zuckerberg about a takedown that happened on an altogether different social network to the one he owns and operates.
These exchanges may be small and brief, but they demonstrate that lawmakers aren't well placed to extract answers that put the dominant digital platforms under pressure to do better.
Worryingly, I don't think that will ever change.
Never, ever have a gap between community managers
It's pretty common, in UK newsrooms at least, for bosses to cut costs by leaving gaps between community managers. In my experience, someone would leave and months would pass before a job advert went out and a new face was brought in.
Each time it happened, I had a sense that the gap was bad but I didn't know the actual impact or have any meaningful data about its effect.
Rich Millington, the founder of the excellent FeverBee, has kindly explained what happened when his own community manager left and he failed to replace her. Read it for yourself (here's the threaded version) but, in short:
- a big decrease in time spent browsing
- a big decrease in page views
The lesson, here and for all aspects of life: make sure there's always a handover. (Apart from relationships, maybe. That would be bad.)
Not forgetting...
It's taken me a while to get around to reading this from GitHub's Sponsors product lead, Devon Zuegel, on why there needs to be a digital equivalent of a disapproving stare. But I'm very glad I did.
The silence is deafening | Devon's Site
Imagine you're at a dinner party, and you're getting into a heated argument. As you start yelling, the other people quickly hush their voices and start glaring at you. None of the onlookers have to take further action – it's clear from their facial...
TikTok's CEO used the antitrust hearing to write a blog post committing to greater transparency around data flows and moderation practices. I'm filing it under 'I'll believe it when I see it'.
TikTok is opening up its algorithm and challenging competitors to do the same - The Verge
TikTok says it's opening up its algorithm and content moderation work to outside scrutiny.
I feel like this story is an accurate portrayal of where platform moderation is right now: a UK rapper's antisemitic rants became international news after Twitter and Instagram took over 12 hours to act. He was eventually suspended.
Wiley: Social media websites need to change after Twitter failed to swiftly remove grime artist's antisemitic posts
Members of parliament and political groups have called for greater regulation of social media sites following grime artist Wiley posting a series of antisemitic remarks.
Three weeks after her last piece on Pinterest moderation (linked in EiM #73), Sarah Emerson at OneZero has spoken to more employees, who explain that child pornography is rife on the platform.
'A Permanent Nightmare': Pinterest Moderators Fight to Keep Horrifying Content Off the Platform | by Sarah Emerson | Jul, 2020 | OneZero
Moderators reported seeing child pornography content 'every couple hours.'
Twitter (#1) has introduced new link-blocking categories for hate speech and violence, aiming to stop harmful content before it is posted. According to the announcement, you can fill in a form to argue that your URL is not harmful, but there's nothing about the review process or how long it takes. Another black hole?
Our approach to blocking links
Learn about how Twitter determines what an unsafe link is and what to do if you encounter spam or malware links on Twitter.
Remember the hacking of Obama's Twitter account (#2)? And that of Apple, Uber, Kanye West and dozens of others? Well, according to a Reuters report, over 1,000 people had access to the internal tool that was used to take over the accounts, including third-party Cognizant workers.
Some Cognizant contractors reportedly had access to internal Twitter tools that enable account takeovers
Cognizant is a third-party company used for moderation.
*I have a tendency to come on a little strong, sorry – an email hello will suffice.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.