Hello and welcome to Everything in Moderation, your guide to the policies, products, platforms and people shaping the future of online speech and the internet. It's written by me, Ben Whitelaw and supported by members like you.
This edition comes after a two-week break due to a holiday and then an unexpected hospital trip last Friday. Everyone and everything is fine but I'm sorry to those of you who were expecting your weekly newsletter to arrive and were surprised when it didn't.
Today's newsletter shows, once again, the very tangible impact of incoming online speech regulation, both in how it is shaping platform features and in how it will shape future legislation.
Before we get into the big stories, I want to welcome a host of new subscribers since the last EiM from Google, Checkstep, Uber, TrustLab, Electronic Arts, Feeld, Mailchimp, Bumble, Shopify, Etsy and others. Your feedback is key to the newsletter hitting the mark each week. Email me or hit the thumbs at the end of today's edition.
Here's everything in moderation from the last seven days — BW
New and emerging internet policy and online speech regulation
A new report has found that the terms and conditions of video-sharing platforms are "lengthy, impenetrable and, in some cases, inconsistent" and "risk leaving users and moderators in the dark." UK regulator Ofcom analysed the terms of OnlyFans, Twitch, Snapchat, TikTok, Brand New Tube and BitChute and found each of them was "difficult to read" and would take up to an hour to get through. The regulator also noted that few provide detailed information about content that violates the terms or the penalties for breaking the rules.
Helping users better understand platforms' terms of service was one of the key themes at last month's Trust and Safety hackathon, attended by dozens of trust and safety workers ahead of TrustCon (I was there too). If I were one of the video-sharing platforms mentioned in the report, I'd be taking a closer look.