
Agne Kaarlep on preparing for and complying with the Digital Services Act

Covering: interpreting the ambiguities of the DSA, establishing data infrastructure for transparency reporting and preparing for legal challenges
Agne Kaarlep, Head of Policy and Advisory at Tremau

'Viewpoints' is a space on EiM for people working in and adjacent to content moderation and online safety to share their knowledge and experience on a specific and timely topic.

Support articles like this by becoming a member of Everything in Moderation for less than $2 a week.


The Digital Services Act is, according to one law firm, "the most important redefinition of the rules for offering online content, services and products to consumers in the EU in the past 20 years."

You might think that's sales talk, designed to convince businesses to take up expensive legal services, but we're already seeing signs of its impact. Very Large Online Platforms and Search Engines, for example, have had to abide by new obligations since April this year and, while many have steered clear of any issues, others have fared less well (EiM #220).

With the biggest platforms up and running, attention now turns to a broader range of intermediaries. In just three months' time — on 17 February 2024 — the DSA will become applicable to a whole range of services — including internet service providers, providers of web-based messaging, hosting and email services, social media networks of all shapes and sizes, app stores, marketplaces and search engines — across the EU.

Agne Kaarlep is firmly aware of the deadline. In her role as head of policy and advisory at Tremau, she ensures intermediaries are suitably prepared for regulations such as the DSA and the UK's Online Safety Act through impact and risk assessments. Prior to that, she worked as a policy officer at the European Commission.

Ahead of a webinar designed to help demystify the DSA, Agne kindly agreed to answer some questions on the Act, why transparency is so key and how intermediaries of all sizes can prepare for the deadline and beyond.

This interview has been lightly edited for clarity.


Your work at Tremau involves helping to prepare platforms for the Digital Services Act. What themes are you seeing crop up in those conversations? And what have platforms struggled with to date?

The DSA is a first-of-its-kind regulation, focusing on moderation not at the “content level” but at the “process level”. This paradigm shift creates a number of challenges for online platforms, which will, in many cases, need to significantly rethink the way in which they have built their T&S operations. As we work with platforms — from small and medium-sized services to Very Large Online Platforms (VLOPs) — we have seen a number of recurring themes appearing time and again.

The first of these is linked to the scope of the obligations: the DSA takes a scaled approach, requiring more from the largest platforms than from others, with some exemptions for small and medium-sized platforms. This often leads to confusion about which obligations fall on specific services; for example, co-operation with designated trusted flagger organisations and out-of-court dispute bodies is not reserved for VLOPs and Very Large Online Search Engines (VLOSEs), but is an obligation for all online platforms.
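To make the scoping question concrete, here is a minimal sketch (in Python) of how a compliance team might encode the DSA's cumulative tiers. The tier names follow the Act's layered categories, but the obligation lists are simplified illustrations, and the small and micro-enterprise exemptions are not modelled; treat this as a mental model, not a legal mapping.

```python
# Illustrative sketch only: a simplified encoding of the DSA's tiered,
# cumulative obligations. Not a complete or authoritative legal mapping.

# Tiers are cumulative: each tier inherits the obligations of those above it.
DSA_TIERS = [
    "intermediary_service",   # e.g. ISPs, DNS providers
    "hosting_service",        # e.g. cloud and web hosting
    "online_platform",        # e.g. social networks, marketplaces
    "vlop_vlose",             # designated VLOPs and VLOSEs
]

# Simplified example obligations per tier (not exhaustive).
OBLIGATIONS = {
    "intermediary_service": ["points_of_contact", "transparency_reporting"],
    "hosting_service": ["notice_and_action", "statements_of_reasons"],
    "online_platform": ["trusted_flaggers", "out_of_court_dispute_bodies",
                        "internal_complaint_handling"],
    "vlop_vlose": ["systemic_risk_assessments", "independent_audits",
                   "ad_repository"],
}

def obligations_for(tier: str) -> list[str]:
    """Return the cumulative obligations for a given service tier."""
    idx = DSA_TIERS.index(tier)
    result: list[str] = []
    for t in DSA_TIERS[: idx + 1]:
        result.extend(OBLIGATIONS[t])
    return result

# An ordinary online platform picks up trusted-flagger co-operation too,
# not just VLOPs/VLOSEs:
print(obligations_for("online_platform"))
```

Encoding the tiers this way makes it harder to accidentally treat an online-platform obligation as VLOP-only, which is exactly the confusion described above.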

A second issue we often hear about is the difficulty companies face in setting up moderation processes that are compliant, scalable, adaptable to future regulatory changes, and designed for the extensive data collection required for transparency reporting. In many cases, moderation processes have been built “on the fly”, with little to no process centralisation, meaning that platforms find themselves with a number of internal and external workflow tools, each often used for a specific portion of the moderation process. In a regulatory environment which focuses on consistency and transparency of content moderation, this will be a challenge.

Another element platforms grapple with is how to know when something is "good enough", as there is plenty of ambiguity in the DSA about how the obligations can be fulfilled. Take statements of reasons as an example: while such statements must accompany all content moderation decisions, many questions remain around how much information needs to be disclosed to users and what exceptions can be considered. The varied ways of approaching this leave platforms guessing as to whether they have hit the mark, and users are likely to see different approaches across different platforms.
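To give a sense of what a statement of reasons involves in practice, here is a minimal sketch of the kind of record a platform might assemble before notifying a user. The field names are illustrative and only loosely follow the categories of information the DSA's Article 17 asks for; they are not an official schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative statement-of-reasons record. Field names loosely follow the
# categories of information Article 17 asks for; not an official schema.
@dataclass
class StatementOfReasons:
    decision_id: str
    issued_at: datetime
    restriction_type: str            # e.g. "removal", "demotion", "suspension"
    facts_and_circumstances: str     # what the decision was based on
    automated_detection: bool        # was the content detected automatically?
    automated_decision: bool         # was the decision itself automated?
    legal_ground: str | None         # cited law, if the content is illegal
    contractual_ground: str | None   # cited T&C clause, if a policy breach
    redress_options: list[str]       # e.g. internal appeal, out-of-court body

sor = StatementOfReasons(
    decision_id="sor-2024-000123",
    issued_at=datetime.now(timezone.utc),
    restriction_type="removal",
    facts_and_circumstances="Post reported by a user and reviewed as spam.",
    automated_detection=True,
    automated_decision=False,
    legal_ground=None,
    contractual_ground="Community Guidelines, section 4 (spam)",
    redress_options=["internal_complaint", "out_of_court_dispute_body"],
)
```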

Transparency is a big part of the DSA, and VLOPs and VLOSEs recently had to submit their first transparency reports. What did we learn from those?

That transparency reporting is hard, even for VLOPs. And it cannot be done without the right tools and systems in place. We will be publishing our initial analysis of these reports, but I can share a few key highlights here.

Transparency reporting is crucial for communication with users, civil society and researchers. It goes beyond merely communicating with regulators. The key word here is explainability: ensuring meaningful explanations of the quantitative data, automated tools, and accuracy indicators. We have seen from the VLOP reports that this was not always easy, highlighting that the effort needed for this should not be underestimated.

Internally, platforms also need to ensure that their data infrastructure is set up for success. We find that a specialised data pipeline that generates, stores, timestamps, and isolates all the data needed for the DSA requirements makes reporting cleaner and more scalable.
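As a rough illustration of what "generates, stores, timestamps, and isolates" can mean in code, here is a minimal sketch of an append-only event log dedicated to DSA-relevant moderation events. The event shape and the JSON-lines storage are assumptions made for the example, not a description of any particular platform's (or Tremau's) implementation.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Minimal sketch: an append-only, timestamped log of DSA-relevant moderation
# events, kept separate from operational tables so transparency-report
# queries don't depend on ad-hoc production data.
DSA_LOG = Path("dsa_events.jsonl")

def record_event(event_type: str, **fields) -> None:
    """Append one timestamped moderation event to the dedicated DSA log."""
    event = {
        "event_type": event_type,  # e.g. "user_report", "removal", "appeal_decided"
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        **fields,
    }
    with DSA_LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")

record_event("user_report", content_id="c-42", reason="hate_speech")
record_event("removal", content_id="c-42", automated=False)

# Reporting then becomes a scan over one well-defined source:
events = [json.loads(line) for line in DSA_LOG.read_text().splitlines()]
removals = sum(1 for e in events if e["event_type"] == "removal")
```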

Also, having a credible and comprehensive methodology for calculating efficiency and effectiveness metrics across your company is key. We recently hosted a webinar with Jeff Dunn [Vice President of Trust, Safety and Support at Hinge] and he pointed out that, due to legacy data infrastructure that often dates from a time when platforms were smaller and building “hacky” solutions, assessing data can become “comparing apples to oranges to giraffes”.
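Continuing the sketch above, one way to avoid the "apples to oranges" problem is to define every reported metric as a function over the same event log, so each figure traces back to one documented formula. The metrics below are illustrative examples, not a prescribed DSA methodology.

```python
# Illustrative metric definitions computed from a single event log, so every
# figure in a report traces back to one documented formula. The metrics
# themselves are examples, not a prescribed DSA methodology.
def automation_rate(events: list[dict]) -> float:
    """Share of removals made by automated means."""
    removals = [e for e in events if e["event_type"] == "removal"]
    if not removals:
        return 0.0
    return sum(e.get("automated", False) for e in removals) / len(removals)

def appeal_overturn_rate(events: list[dict]) -> float:
    """Share of decided appeals that reversed the original decision."""
    appeals = [e for e in events if e["event_type"] == "appeal_decided"]
    if not appeals:
        return 0.0
    return sum(e.get("overturned", False) for e in appeals) / len(appeals)
```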

Thankfully, there is some additional guidance in the works: the EU Commission will be publishing an implementing act that will set out the format of transparency reports and very likely add details on how to interpret the data. This should help standardise future reports, as the current approaches differ greatly from one another.

We’ve seen some companies — notably Amazon and Zalando — fight VLOP status. Can you explain why? Are the transparency obligations part of the reason?

The cases brought by Amazon and Zalando point to more fundamental objections to their designation than transparency reporting. While the two cases are slightly different, both argue that they should not be counted as VLOPs because a large part of their business is retail, which on its own is not covered by the DSA. It will take time for the courts to decide and, in the meantime, both platforms are complying with the law, with Amazon receiving an interim stay from the court for only one obligation: the ads library.

BECOME A MEMBER
Viewpoints are about sharing the wisdom of the smartest people working in online safety and content moderation so that you can stay ahead of the curve.

They will always be free to read thanks to the generous support of EiM members, who pay less than $2 a week to ensure insightful Q&As like this one and the weekly newsletter are accessible for everyone.

Join today as a monthly or yearly member and you'll also get regular analysis from me about how content moderation is changing the world — BW

So what resources and training do trust and safety teams need to ensure compliance with the DSA's provisions?

To ensure compliance with the DSA, it's imperative that trust and safety leadership within every company develop a robust awareness of the obligations that apply specifically to their operations. Gap assessments, like the ones we provide for our clients, are needed to get a clear understanding of any issues that currently exist in meeting these obligations.

As we look towards 2024, it's important to keep in mind that the introduction of new user reporting mechanisms and the necessity to send out statements of reasons, complete with links to appeal flows, will likely lead to an increase in the volumes of both reports and complaints, something that the VLOP transparency reports confirm. This needs to translate into resource and capacity planning for companies, such as additional headcount or investment in new technologies.
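As a back-of-the-envelope illustration of that planning exercise, the sketch below translates an assumed rise in report volumes into reviewer headcount. Every number here is hypothetical; the point is only that the DSA's new flows can be costed rather than guessed.

```python
# Back-of-the-envelope capacity planning. Every number is hypothetical;
# plug in your own volumes, uplift estimates, and handling times.
current_reports_per_day = 2_000
expected_uplift = 0.40             # e.g. +40% once DSA report/appeal flows launch
minutes_per_report = 4
reviewer_minutes_per_day = 6 * 60  # productive review time per reviewer

projected_reports = current_reports_per_day * (1 + expected_uplift)
minutes_needed = projected_reports * minutes_per_report
reviewers_needed = minutes_needed / reviewer_minutes_per_day

print(f"Projected reports/day: {projected_reports:.0f}")   # 2800
print(f"Reviewers needed: {reviewers_needed:.1f}")         # ~31.1
```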

Finally, many DSA obligations are aligned with best practices and therefore, this compliance effort can — and in my opinion should — be seen as an opportunity for process improvement. It should be leveraged as a comprehensive health check across all content processes to not only align with regulatory standards but also to gain efficiencies and enhance user experience.

As platforms ensure compliance with the DSA, how would you recommend services communicate to their users regarding policy changes as a result of the DSA?

Many VLOPs have been sending users pop-ups explaining changes made following the DSA in user-friendly language, which is a good way to go about it. Another option currently being used is a dedicated post that captures the changes; see, for example, the posts from Snap and Meta.

As is often the case in trust and safety, risk prevention is the best strategy. A significant portion of the DSA is actually about preventing lengthy legal disputes by providing ways in which disputes can be resolved directly between users and platforms, through complaint mechanisms.

The key is to ensure that these processes are not only compliant with the DSA but also deploy best practices in the implementation of complaint funnels. We have looked at the research, and it shows the importance of deploying mechanisms in which users perceive they are treated fairly. That can also reduce recidivism and therefore the burden on the notification and complaint process.

What one tip would you give someone tasked with figuring out DSA compliance in their company before the 17 February deadline?

As we approach the 17 February deadline for DSA compliance, my foremost advice for anyone tasked with navigating this in their company is unequivocal: if you haven't started yet, now is the time! We’ve learned that the most effective initial step is to conduct a comprehensive gap assessment, which helps prioritise the compliance effort.

Moreover, embarking on DSA compliance can be a complex journey. If you're at a point where you're unsure about where to begin, or if you've already started but find yourself grappling with challenges, my suggestion is to reach out for assistance. Whether it's seeking expert advice, consulting with peers in the industry, or leveraging available resources, remember that seeking help is a sign of proactive management, not a weakness. It can save you and your team many a headache, as well as ensure that you’re compliant and not running into legal risk.

What’s the best way to stay informed about the DSA and best practices relating to its implementation?

There is a lot of content out there about the DSA, some more and some less helpful. We put out quite a bit of content that provides operational insights and have also put together a transparency reporting tracker so companies can browse all the VLOP transparency reports to date. Julian Jaursch from SNV writes expert pieces on DSA enforcement and the authorities involved that are a great read, as does Martin Husovec, an LSE professor, who has some great analysis of the DSA and the legal ecosystem in which it sits. There is also more and more public content around this: the Commission Transparency Database collects statements of reasons from VLOPs daily, and these can work as templates to see how the VLOPs are complying with this obligation.

Stay informed, stay up to date, and stay curious! Many regulators and T&S professionals are super open to speaking about these topics, so seek out forums, conferences, and webinars (we even have a dedicated DSA series) where you can make those connections.

Finally, should EiM readers be worried about being called out by Thierry Breton next year for failing to comply with the DSA? It seems like he’s going after everyone…

In reality, you never know who will be the next headline… While it's by no means a guarantee against being singled out, I remain a firm believer that the best bet for staying out of the headlines for non-compliance is to do your best to comply.


Want to share learnings from your work or research with thousands of people working in online safety and content moderation?

Get in touch to put yourself forward for a Viewpoint or recommend someone that you'd like to hear directly from.