
New UK child safety proposals, watermarking is 'inadequate remedy' and Alexios appointed

The week in content moderation - edition #247

Hello and welcome to Everything in Moderation's Week in Review, your in-depth guide to the policies, products, platforms and people shaping the future of online speech and the internet. It's written by me, Ben Whitelaw and supported by members like you.

This week sees major announcements across each of the four key areas covered week in, week out here at EiM. Whether you're here for the policy developments, curious about how platforms are tackling (or failing to prevent) online harm, or simply interested in the people shaping the direction of online speech and safety, I hope you find something useful or interesting.

If you're receiving the Weekly Review for the first time — especially folks from Thorn, The Technology Coalition, Hinge, Vanderbilt University, Apco Worldwide, and Citizen Digital Foundation — a special hello and welcome.

Here's everything in moderation from the last seven days — BW


Today's edition is in partnership with Tremau, a leading T&S provider delivering a unique combination of moderation software & advisory services

Struggling to stay compliant in a fast-changing regulatory environment without overburdening your T&S operations? With Nima, Tremau’s end-to-end content moderation platform, online platforms can streamline their processes, removing inefficiencies and overcoming compliance challenges, while maximising T&S ROI.

Tremau also assists platforms globally - from start-ups to VLOPs - in assessing the operational implications of changing regulatory obligations, undertaking risk assessments, and preparing detailed mitigation plans.

Protect your users effectively and efficiently & stay compliant.


Policies

New and emerging internet policy and online speech regulation

Ofcom this week released its draft Children’s Safety Code of Practice, which outlines the duties of services accessed by children under the Online Safety Act. As well as requirements for age checks and better visibility of content via recommender systems, there are also some interesting elements related to internal governance that we haven’t seen in other regulation; namely requirements for “an annual senior-body review of all risk management activities relating to children’s safety” and an employee Code of Conduct to set standards for staff working on online child protection. Will that kind of organisational process help? Who knows. But Ofcom clearly think so.

Worth noting: Despite the strict measures suggested by Ofcom, a group of parents whose children suffered or died as a result of harm on social media platforms are not happy. Speaking to The Guardian, they say the UK regulator has failed to properly engage with them and has made it hard “for those with lived experience to understand or respond to the proposals”.

EU regulatory matters now: although not large enough to be designated a Very Large Online Platform (VLOP) under the Digital Services Act, Telegram will be monitored by Belgian authorities and will be responsible for addressing EU complaints. I wasn't aware that platforms headquartered outside the bloc with an EU presence had to have legal representation in one of the member states. And because Telegram is based in Dubai, it has chosen Belgium for its legal representation. All I’ll say is: good luck to the Belgian Institute of Post and Telecommunications (BIPT), Belgium's soon-to-be-appointed Digital Services Coordinator.

Get access to the rest of this edition of EiM and 200+ others by becoming a paying member