Judging by a handful of new subscribers, it seems that EiM has made its way into Scandinavia.
So hello to folks from YLE in Finland and Altinget in Denmark, not to mention a familiar face now working at Zinc. Good to have a diverse set of readers - do reply or use the 'enjoy this issue?' button at the bottom to get in touch with your feedback — BW
Trying for transparency
It wasn’t so long ago that Twitter were pushing their live news credentials. Now questions are being asked about their ability to be real-time in a different way: transparency.
This was just one of the themes that came out of Jack Dorsey and Sheryl Sandberg’s joint appearance before the Senate Intelligence Committee, a hearing which sounded cordial by most accounts. Bloomberg’s Sarah Frier, who was there, summed up the mood in a thread:
The focus on transparency in the US is timely since it’s also what UK media bosses joined forces to call for in a letter sent to The Sunday Telegraph this week. Bosses at the BBC, BT, Sky and others put aside their broadcasting rivalries to call for ‘accountability and transparency over the decisions these private companies are already taking’.
By the end of the hearing, Dorsey had agreed to produce a report making public data about harassment on the platform ‘this year’, but it’s clear that won’t be enough. Users, politicians and organisations alike want real-time transparency, not after-the-fact reports.
It’s what makes Narrative, a new social network currently in beta, so interesting. Set up by the founders of forum platform Hoop.la (I hadn’t heard of it either, but it has some notable commercial clients, including Bose and Pinterest), Narrative’s selling points include electing users to a Tribunal and making every moderation action visible and searchable.
When you add that 85% of revenue goes to its users, it sounds like the opposite of Sandberg and Dorsey's employers. Both could do worse than sign up for a beta invite...
Peak content moderation?
The previous spike, in 2014? That was when Wired uncovered the conditions facing moderators at Facebook's third-party contractors.
Since the election of Donald Trump in 2016, there has been burgeoning awareness of hate speech on social media platforms like Facebook and Twitter.
This decades-old law is the current legislation that decides whether social networks are liable for what’s posted on their platforms. And it’s lenient.
Maybe someday AI will be sophisticated, nuanced, and accurate enough to help us with platform content moderation, but that day isn't today.
Techdirt, who have written some great stuff on moderation recently, point out that any holy grail of AI-powered moderation must not succumb to the ‘Scunthorpe problem’ — wait, no em-dashes — must not succumb to the ‘Scunthorpe problem’, where naive filters block innocent words.
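For readers who haven't run into it: the Scunthorpe problem is what happens when a filter matches blocked strings anywhere inside a word. A minimal Python sketch (with a hypothetical one-entry blocklist; no real system works off a list this crude) shows why the English town kept getting caught:

```python
# A minimal sketch of the Scunthorpe problem: naive substring matching
# flags innocent words that merely contain a blocked string.
# BLOCKLIST is hypothetical and deliberately tiny for illustration.

import re

BLOCKLIST = ["cunt"]  # the substring hiding inside 'Scunthorpe'

def naive_filter(text: str) -> bool:
    """Block if any blocklisted string appears anywhere in the text."""
    lowered = text.lower()
    return any(bad in lowered for bad in BLOCKLIST)

def boundary_filter(text: str) -> bool:
    """Block only on whole-word matches. This fixes the town's false
    positive, but spaced or obfuscated spellings slip through instead."""
    lowered = text.lower()
    return any(re.search(rf"\b{re.escape(bad)}\b", lowered) for bad in BLOCKLIST)

print(naive_filter("Scunthorpe United won again"))     # True: a false positive
print(boundary_filter("Scunthorpe United won again"))  # False
```

The point, as Techdirt argues, is that each patch trades one failure mode for another — which is exactly why 'just use AI' is easier said than done.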
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.