
Kat Duffy on the state of the trust and safety industry in 2023 and what comes next

Covering: what the gaming industry can teach us about online safety and expanding research and capacity in Global Majority countries
Kat Duffy, Resident Senior Fellow at the Digital Forensic Research Lab (DFRLab) and director of the Task Force for a Trustworthy Future Web

'Viewpoints' is a space on EiM for people working in and adjacent to content moderation and online safety to share their knowledge and experience on a specific and timely topic.

Support articles like this by becoming a member of Everything in Moderation for less than $2 a week.

Every week, I read and share reports and whitepapers about different aspects of online safety. Most of them are about specific phenomena; for example, the rise of online hate speech (EiM #70) or the flimsy nature of outsourced moderation (EiM #68). Very few are about trust and safety as an industry and a practice.

That has been the focus of the Task Force for a Trustworthy Future Web. Announced in December last year (EiM #184), the 40-strong group of industry experts has quickly gone about trying to define "the current components that make up both the immersive and digital information ecosystem(s) and the field working to make them healthier and safer".

Last week, following five months of interviews, expert roundtables, thematic discussions, document reviews, and briefings, the Task Force published its report, "Scaling Trust on the Web". There's a lot to chew over: it contains eight key findings and five recommendations, and advocates for "investments in systems-level solutions" that support both trust and safety workers and the internet users they protect. All sensible stuff.

Kat Duffy is a Resident Senior Fellow at the Digital Forensic Research Lab (DFRLab) and also the director of the Task Force for a Trustworthy Future Web. She kindly agreed to answer a few questions I had about the report and its main themes. I hope you enjoy her answers.

This interview has been lightly edited for clarity.

One thing that I found different about the Task Force compared to other similar initiatives is the inclusion of corporates (platforms but also vendors). Why was that important?

Platforms and vendors bring tremendous knowledge to the table that hasn’t always been easily accessible. Platforms - and by extension the vendors that platforms hire to support broader trust and safety goals - also have access to the largest data sets, some of the most cutting-edge tooling innovations, and vast amounts of raw data. People who sit inside industry understand the nuts and bolts of different digital builds and the implications of distinct, highly technical choices.

Moreover, trust and safety practitioners inside companies choose to run towards a problem when they see it and to seek solutions. They are the natural allies of those outside industry who are also fighting for safer, more trustworthy, and more useful online spaces, and we believed it was critical to explain their perspective and articulate the complexity of their work.

Vendors also bring a unique perspective because they can see broader trends that are playing out across companies. The vendor community is often invisible to those outside industry, despite vendors being essential to even the largest platforms’ trust and safety efforts. We felt it was important to clarify their role and to identify and illuminate the incentives at play for them as companies themselves.

Moreover, trust and safety teams and vendors constantly navigate broader drivers and incentive structures that undermine building safer, more trustworthy spaces online - like a C-Suite not being required to care, or private equity demonstrating limited interest in trust and safety investments. The more that we can surface that reality, the more we can move towards a more grounded foundational analysis of the challenges at hand, and an understanding of the solutions that could be most effective.

Finally, it is worth noting that we did not seek out representatives of the biggest platforms to join the Task Force. That was a design choice. There is a world of innovation, design, and challenge that exists within smaller to medium-sized companies, and we wanted the Task Force to capture that level of granularity because it is often overlooked.

You highlight gaming as an important area to understand if we are to grasp how digital spaces might evolve. Can you explain why gaming and not, say, marketplaces or dating?

We prioritized gaming because we think it’s an under-examined industry with particular relevance to future online spaces, and one that isn’t adequately understood across a broad spectrum of stakeholders, including policymakers, media, civil society, and academia. That’s not to say that folks aren’t looking at gaming; they have been. But the gaming industry tends to be treated as something separate from, rather than a part of, the broader information ecosystem.

Particularly given that most immersive technology is developed through gaming industry companies, we thought it was important to demonstrate that the gaming ecosystem is massive, expanding rapidly, and already part of key online spaces in important ways. We believe those interconnections will only increase in scope and importance.

We delve into this in much greater detail in our annex on gaming, which I would encourage everyone to go read, but here are two areas where we see convergence and a need for greater focus:

  1. Gaming’s isolation from policy communities focused on internet governance, social media, and “big tech” issues has resulted in a lack of appreciation of the gaming industry’s longstanding market share, geopolitical impact, technology innovation, and connection to the rest of the information ecosystem. The gaming community in some cases generated - and in other cases was the first to experience - many of the harms, risks, and challenges everyone is grappling with today in other online spaces. This includes both the social impact of harassment and toxic online behaviors and the technical experience of operating in multimedia, interactive, and real-time spaces. More should be done to understand the unique impact gaming’s intentional design has on trust and safety dynamics, and to explore lessons and models that may be transferable or avoidable.
  2. Gaming is the ecosystem through which the bulk of immersive or XR technology and content has been built, and the industry has been actively experimenting with applications of distributed technologies and AI. Technology built by gaming companies is also increasingly being used to do things like design cars, make movies, and architect buildings. Because of this, understanding the players and how they work takes on even greater importance. This includes building a much better understanding of ownership, incentives, and business models, particularly as countries like China and Saudi Arabia systematically buy up or take majority stakes in many of these companies.

The fact that we chose to focus on gaming over dating or marketplaces is not meant to suggest that those other platforms aren’t important, or deserving of significant analysis and consideration. I want to be clear on that point. Dating platforms and marketplaces have been leaders in tackling trust and safety issues and building online spaces where users can engage safely, in part because they have also been at the forefront of creating the potential for risk and harm over the years. I would love to see deep dives into those spaces, how they’ve evolved, and how they may continue to develop.

Viewpoints are about sharing the wisdom of the smartest people working in online safety and content moderation so that you can stay ahead of the curve.

They will always be free to read thanks to the generous support of EiM members, who pay less than $2 a week to ensure insightful Q&As like this one and the weekly newsletter are accessible for everyone.

Join today as a monthly or yearly member and you'll also get regular analysis from me about how content moderation is changing the world — BW

Deepening research and institutional capacity in the “Global Majority” is a thread that runs throughout. There’s clearly a lot to do. Where to start?

I think we have some great and concrete recommendations on this point in the Task Force report, and I hope people will check them out! I would push back on the concept that we need to “start” though. The depth of existing research and institutional capacity that Global Majority organizations and researchers already have is demonstrated throughout the Task Force report. So it’s less about starting, and more about paying attention, expanding, and sustaining.

That said, the first, most concrete step I would advocate is for philanthropic leaders and foreign assistance strategists to read the Task Force report, identify where they can respond immediately to implement different recommendations, and then jump on that opportunity. We put a lot of thought into developing recommendations that were bite-sized and actionable, in addition to offering recommendations that are bigger-picture, and I hope funders take advantage of that.

The second step is harder and more profound, but critically important. Global North funders must figure out how they can provide greater autonomy, agency, flexibility, and sustainability to partners in the Global Majority, and more broadly to partners representing marginalized communities. This is an exercise in listening, and in building trust on both sides, for sure. But it also requires a pretty gnarly and thankless deep dive into operational plans, procurement regulations, reporting requirements, proposal solicitations, and other components of funding infrastructure.

For example, if governments are going to pass regulations that push companies to engage with external experts in civil society or academia, then how are governments thinking about the sustainability of those sectors and the way that regulations may be changing incentives and pressures on civil society or academia as a result? I’m not suggesting these are easy questions, but we have years of lessons learned here, and brilliant leadership from historically marginalized communities who can offer ideas, insights, and paths forward. So let’s take that seriously, listen, and be creative about how we operationalize new approaches and build new solutions.

A quick look at the Task Force suggests most of its members are affiliated with Global North institutions. How were Global Majority countries represented in the process?

Our Task Force included civil society experts and academics who either run or founded leading organizations or initiatives in Pakistan, India, Kenya, Argentina, Myanmar, and Colombia. We also worked with partner organizations such as Witness and the Global Network Initiative, who bring extensive expertise (and in the case of GNI, membership constituencies) focused squarely on understanding and illuminating the disproportionate harm and risk that communities in the Global Majority face. We also ensured extensive inclusion of reports and scholarship authored by representatives from Global Majority countries in our literature reviews and in the materials we asked Task Force members to review.

The critical importance of Global Majority-based perspectives is not an abstract or token issue for me. I’ve spent more than half of my career based in Global Majority countries. I have lived and worked in South Africa, Colombia, and Cuba, and just recently returned to the United States after three years in Tunisia. It’s important to acknowledge that the broad brush of a phrase like “Global Majority” is still inherently reductive. We’re talking about an incredibly diverse tapestry of needs, concerns, and contexts when we say things like “marginalized populations” and “Global Majority.”

That’s why you’ll see many recommendations in the report pointing to the importance of funding work across a range of countries and communities outside the US, Canada, and Western Europe, and of funding work in a manner that will offer adequate space and support to leaders across the globe to set their own agendas. We have to move beyond Global Majority experts constantly being stuck in a cycle of responding to priorities that are determined by organizations and priorities in the Global North, or of having their relevance and expertise reduced to the equivalent of a case study. It would be far more powerful and effective to support Global Majority experts to identify the questions that those in the Global North should be asking and answering.

The current cycle reflects where power sits. It doesn’t reflect where important knowledge or expertise sits. I can’t emphasize this point enough - we all have a vested interest in learning from those who sit outside entrenched power structures. The most marginalized are often the most impacted by online harms, but they are also often the first impacted. Eventually, those challenges come to all of us. Building better online spaces will depend upon understanding that fact and incorporating greater inclusion and care into tooling, product, policy, and monetization decisions.

At one point, you note how the Digital Services Act has the potential to “also divert attention and resources from the most vulnerable communities and markets”. Can you elaborate on what is meant by that?

First, this point is not intended to pour cold water on the DSA. Our report highlights that the DSA’s passage is a key market driver that is shifting attention to these issues in meaningful ways. But the concern you mention was widely shared across the Task Force, ranging from industry to civil society reps, and deserves attention as decisions are made about how the DSA is implemented.

As companies face a new era of regulatory requirements and compliance frameworks, financial and legal pressures may incentivize those companies to make the regulatory floor their trust and safety ceiling, and to shift investments away from more proactive or innovative approaches to building T&S (such as prosocial product design methodologies or expanded multistakeholder engagement, research, and experimentation). If prevailing regulation requires protections for certain populations over others (e.g., European users and not Ugandan users), then it’s also likely that limited T&S investments will go towards monitoring the company’s product and impact in the geographic areas subject to regulatory standards.

Voluntary models and mechanisms have been a very powerful component of the evolution of trust and safety. While I don’t believe anyone would say that these voluntary models have been adequate, I know that across the Task Force, members highlighted how important it is for trust and safety teams to have the space and resourcing necessary to be proactive and innovative rather than simply compliance-driven and reactive. If external experts flag disproportionate harm to a vulnerable community, for example, it’s important that trust and safety teams inside of companies can examine those claims and ideally work with affected communities to mitigate that harm - even if those communities aren’t the focus of significant regulatory attention.

One thing that struck me was the extent to which the Task Force appears much more pragmatic about the fediverse and its potential than others. What is that based on?

We made a very early decision to recruit Task Force members who are known for being constructive and optimistic, but who have also been doing this work long enough to be immune to hype cycles, and I think those characteristics are evident in how we approached the question of the fediverse.

It’s not like decentralized spaces are new for those of us who have been around for a while. What is new is the amount of knowledge we’ve gained and can tap into regarding how to build better platforms and better spaces. This Task Force has deep, deep expertise both in how harms propagate online, and the possibilities and limitations of technological approaches to mitigating those harms, which is why we can say squarely in the report that early product and policy decisions are not value-neutral and will impact trust and safety. There’s a fantastic chart in the fediverse annex that shows readers at a glance how false - and dangerous - it is to assume that trust and safety features available on centralized platforms are immediately or easily replicable in decentralized spaces. That’s one reason we offer so many targeted recommendations for research and innovation focused on the fediverse.

Personally, I’m excited and optimistic about the world of possibility and innovation that federated spaces offer, and I’ve been delighted to see people posting and tweeting and tooting and skeeting about the Task Force report across all sorts of communities and instances. That said, if you’re building or scaling a new online space in 2023, there’s no excuse for magical thinking. Malignancy migrates, and many current trust and safety best practices can’t be deployed easily across decentralized spaces. That’s the reality. Acknowledging that challenge is the first step towards overcoming it, and consequently unlocking all the potential that federated spaces can offer.

I’ve called for better, more nuanced reporting of online speech issues by the media, so it was heartening to see journalistic capacity mentioned. How do you see that developing?

So, it’s important to acknowledge first that independent media and journalists around the world are under attack and struggling to survive: plummeting ad revenue, the evisceration of local media, the emergence of news deserts, a crippling lack of sustainable financial models that can support journalism, and an increased danger of harassment, detention, or death because of their work. The first priority needs to be protecting independent media, and the importance of a free press, in any way that we can.

I do hope, however, that philanthropies and foreign assistance programming can move beyond this idea that “technology” or “digital” lives in one box and “media” lives in another. As we say in the report, that which happens offline will happen online. If you care about media, and the fourth estate’s capacity to report accurately, to support societies, and to hold power to account, then you have to care about how you’re empowering existing journalists and media outlets to connect the dots around existing or emerging technological trends and to reexamine their beats.

You also have to get intentional about mainstreaming tech and technological thought into journalism. If you care about tech, you need to be mainstreaming support for accurate, thoughtful, and independent reporting about tech into your broader efforts. I cannot emphasize enough that this effort needs to be global. There is a great collection of brilliant tech reporters out there, but they are too heavily based in the Global North. I would encourage media and tech funders (and companies) to back global efforts aimed at deepening the bench of journalists and researchers across the Global Majority (and within marginalized communities in the Global North) who can report on the local impacts of platforms specifically, and emerging technology more broadly.

The report ends by calling for seed funding or targeted programming from philanthropic organisations and governments. How much is needed and how urgent is that need?

The need is urgent, but unfortunately, I don’t think it’s as simple as saying we need X million dollars to do the necessary work. My hope is that this report helps philanthropic organizations and governments see how different pieces of the work they already support may be connected to some of the same questions and drivers, and how some new interventions could help unlock existing investments and catalyze greater impact.

For example, we recommend supporting the development of stronger academic centres for this work around the globe, as well as establishing hubs for Global Majority organizations and researchers in centres of regulatory power. Both of those investments would increase the impact of a wide range of existing work, and potentially allow disparate leaders and experts within a region to work more closely with each other.

We’ve said multiple times that we’re battling existing problems at new speed and scale. I think overwhelmingly, the key is to support existing, great work and to build off of it. But we need funding institutions that can operate with greater speed and scale as well if we’re going to meet this moment.

Want to share learnings from your work or research with thousands of people working in online safety and content moderation?

Get in touch to put yourself forward for a Viewpoint or recommend someone that you'd like to hear directly from.