'Getting To Know' is a mini-series about what it's really like to work in trust and safety, in collaboration with the Integrity Institute.
When we think about the people protecting the internet from harm, we tend to conjure an image of someone working for a major social media platform, perhaps drawing up policy or more likely trawling through offending content or user reports.
That's a natural connection to make; platforms are where the users have always been, where the risk of harm is most pronounced and where the media coverage (however limited it tends to be) has focused its attention. But it is also one that is changing.
The origins of trust and safety go back almost two decades (this Data & Society podcast is fascinating on just this) and naturally, some of those early advocates have left the technology companies and platforms where they learnt the ropes. The skills they have developed, now in demand more than ever, are sought after in new realms. Venture capital, for example.
- The value of working both inside and outside the major platforms during your career
- Backing the people with "novel ideas about how to improve the internet"
- The crucial role of non-profits in "impact[ing] the most pressing issues around technology"
If you enjoy this Q&A, get in touch to let me know or consider becoming a member of Everything in Moderation for less than $2 a week to support the creation of other articles like this.
This interview has been lightly edited for clarity.
What's your name?
What is it you do?
I invest in early-stage technology startups at Link Ventures, including companies that are reimagining key components of the social web. Previously, I held product strategy and go-to-market roles at Meta (formerly known as Facebook), working on teams that addressed misinformation in advance of the 2020 US election and shared privacy-protected data with academics and policymakers.
How did you get into the industry?
My master's thesis at the Oxford Internet Institute was technically 'integrity research' although the industry wasn't formalized at that time (2010). I worked with the Cambridge Psychometrics Centre years before their work was, unfortunately, connected to Cambridge Analytica. I have always been interested in how to configure online interactions to optimize for specific behavioural outcomes, which when adopted by bad actors can lead to undesirable effects like the spread of misinformation, bullying and harassment, etc.
What are your main responsibilities?
As an investor at Link Ventures, my job is to source, assess, and invest in early-stage technology startups. I am particularly interested in companies that help improve the state of online interactions through features like privacy-protected data sharing, decentralized identity, trust and safety as a service and others.
What does a typical day look like?
Every day is different, but it's a mix of research, strategic planning for our fund, and meeting with entrepreneurs.
What do you enjoy most about your role?
That I'm able to work on integrity issues from outside the major platforms by investing in startups in the space and contributing to relevant research through my affiliation with the Berggruen Institute, where I’m currently a fellow.
What do you keep track of to understand whether your work is, well, working? How can you tell you're making an impact?
I keep track of whether the products I worked on at Meta shipped, consult on relevant trust and safety policy issues as part of the Integrity Institute, and support entrepreneurs making the internet safer. Efficacy is measured at many levels, and not always at the content level. In my new role, I'm seeding people who have novel ideas about how to improve the internet with capital, and will see how that pans out down the road.
What question are you grappling with most right now?
I'm creating my own map of what I’m calling a ‘social web stack’ — this includes things like governance, data privacy, data sharing, trust and safety, decentralized identity and others — and mapping those to new startups. I'm grappling with how to define the next iteration of the social web.
What do you wish the world understood about your work?
I don't like speaking in broad strokes, but from my last roles working in trust and safety at Meta, I wish people understood that the problems they experience or read about on the platform don't always have a simple or obvious solution.
What was the last thing you read that resonated strongly with you?
How do you spend time away from work?
Playing with my dog, volunteering with the Integrity Institute, hiking in upstate New York.
Question from fellow Integrity Institute member Jen Weedon: What's been your biggest learning working in this space, and what did you gain from it?
My biggest learning is that people who enter trust and safety come from different backgrounds and have different skill sets, more so, I'd say, than on other teams I've worked on. Understanding teammates' mental models and approaches to problem-solving has been key to making sure my contributions were positioned correctly so that they would be considered by other teams. While I tackled problems using a first-principles approach, likely due to my training as a social scientist, not everyone operates that way, and it's important to recognise that.
Before we go, what advice would you offer someone wanting to do your job?
There are many opportunities outside of large technology companies to influence critical trust and safety issues. Joining nonprofits (like the Integrity Institute) or think tanks, advising startups, and conducting independent research are just some of the ways that you can have an impact. I think folks at large companies (myself included) can sometimes assume it's the only place they can work on, and impact, the most pressing issues around technology and society, but that's just not true!
What question would you like the next person to answer?
What motivated you to work in integrity/trust and safety?