8 min read

Is Article 21 of the DSA making an impact?

A new transparency report from Appeals Centre Europe suggests that platforms aren't exactly playing ball when it comes to out-of-court dispute settlements — just as Alice predicted 12 months ago

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job.

I'm in San Francisco this week, where I'm running a workshop for T&S leaders and practicing my bowling skills (hint: they're non-existent).

In today's T&S Insider, I'm revisiting a post I wrote about a year ago: what impact has Article 21 of the Digital Services Act had so far? Some of my thoughts proved correct, others haven't manifested (yet?), and one surprising thing happened that I didn't see coming at all. Drop me a line if you have thoughts about today's newsletter.

And I'd love it if you signed up for a webinar I'm speaking on about ethical AI for T&S, where I'll be sharing ideas that I trailed in All Tech is Human's recent Responsible Tech Guide. Thanks for reading, here we go! — Alice


SPONSORED BY TECH COALITION, strengthening child safety foundations

Not ready for Tech Coalition membership? Pathways is a free program designed to help startups and small-to-medium platforms build stronger child safety programs.

Participants gain access to expert guidance, practical tools, and a curated library of resources on topics like drafting safety policies, meeting global regulatory requirements, detecting CSAM, building law enforcement response teams, and addressing harms like financial sextortion. Pathways also connects practitioners with peers and experts tackling OCSEA across the industry.

JOIN PATHWAYS FOR FREE

Out-of-court dispute settlements, one year on

Why this matters: Twelve months ago, I wrote about how Article 21 of the Digital Services Act might change content moderation through out-of-court dispute settlement bodies. Now we have real data from Appeals Centre Europe's first transparency report, and it largely confirms my scepticism about whether this system could work at scale.

Last year, I wrote about how out-of-court dispute settlement (ODS) bodies might change the way user appeals work.

I imagined a beautiful scenario where concerned citizens could appeal platform decisions en masse, where specialised ODS bodies might help identify gaps between platforms' stated policies and their actual enforcement, and where civil society organisations might use Article 21 to push for more consistent moderation.

I also worried about trolls weaponising the process, bad-faith appeals draining T&S budgets, and platforms being frustrated at having to finance reviewing the same decisions multiple times.

Now that one of the ODS bodies has published its first transparency report, I wanted to revisit my predictions. What turned out to be true? What was surprising? And what does this mean for the future of Article 21?

The numbers are tiny

We need to keep in mind that Appeals Centre Europe (ACE) is only one of seven out-of-court settlement bodies and we don't have case data for all of them yet. But, even so, the amount of data in this report is small: nearly 10,000 disputes submitted across all platforms over ten months, with only about 3,300 falling within ACE's scope.

As an example, ACE received 343 eligible disputes about YouTube content and made only 29 decisions total. That's 29 decisions on a platform with billions of users making countless posts every single day.

I don't want to minimise the individual impact that the Centre has had for the people who have appealed, but you cannot draw meaningful conclusions about platform moderation quality from sample sizes this small.

The selection bias is massive

It's also important to remember that these aren't random moderation decisions. The cases that make it to an ODS body are the result of:

  • People who knew ODS bodies existed (only 36% learned about the process from platforms — which is another issue entirely)
  • People who bothered to appeal internally first
  • People motivated enough to seek out and complete external appeals
  • Often, organised civil society campaigns

The vast majority of T&S decisions (spam, bots, obvious violations) are out of scope for ODS bodies or never get appealed this far. We're looking at a hyper-filtered population of edge cases and organised advocacy.

So what does the data actually show?

Here's what I find most interesting: In cases where ACE actually reviewed content (not counting default decisions), the outcomes were pretty evenly split.

TikTok: If we remove the defaults, ACE upheld decisions in two-thirds of cases (171 cases). In other words, even among the small group of highly motivated users who thought TikTok was definitely wrong enough to appeal twice, TikTok was actually right two-thirds of the time. For bullying and harassment specifically, the Appeals Centre upheld 100% of TikTok's decisions (though this was only 13 total cases).

Facebook: About 50/50 split between overturns and upholds when content was reviewed.

YouTube: 50/50 split, but based on only 16 reviewed cases in which the Appeals Centre used the user-provided link, since YouTube provided zero original content.

The numbers are small but, to me, this suggests platforms aren't systematically incompetent at moderation. For the tiny slice of cases here, they are making reasonable decisions most of the time, even on genuinely difficult edge cases.

What I predicted correctly 

Platforms would find this burdensome
I predicted platforms wouldn't be happy about financing multiple reviews of the same decisions, and I was right.

  • YouTube has shared zero content despite 343 eligible disputes.
  • 52% of all Appeals Centre decisions were "default decisions" where platforms didn't provide content.
  • Platforms are clearly choosing not to engage rather than participate in a system that costs them time and money.

Last August, I wrote:

"I know that platforms aren't going to be happy about having to finance reviewing these decisions multiple times."

Turns out they solved this by just... not doing it.

Get access to the rest of this edition of EiM and 200+ others by becoming a paying member