
You can now apply for the hardest job in content moderation

The week in content moderation - edition #35

Hello everyone and welcome to a handful of new subscribers from the WSJ. I said last week I’d touch on Facebook’s Content Oversight Board, so that’s what this week’s EiM is all about.

A special thanks to Jenny, who bought me not one but five coffees (!). I’m blown away by the gesture and loved hearing that EiM is useful in her day-to-day work. Drop me a line if that’s you too.

Thanks for reading — BW


Facebook's 'digital constitutionalism'

Last week, a job suddenly appeared on LinkedIn: Director of Board Administration, with the Governance team at Facebook.

For all the hype around the Content Oversight Board, the ad for this key role was understated: a couple of generic paragraphs followed by a short list of responsibilities and a few bullet points about the preferred qualifications of applicants (legal professionals without 12+ years of legislation and litigation experience need not bother). In many senses, it failed to capture the Sisyphean task ahead of whoever gets the job. Details are light and the pressure is significant.

Facebook’s Content Oversight Board is, as Evelyn Douek notes, ‘an unprecedented innovation in private platform governance’. Its remit is not insignificant (interpreting community standards, instructing Facebook to allow or remove content and issuing explanations of the board’s decisions) and there’s no comparable structure at any other company doing content moderation, let alone one with over 2.2 billion users.

The most striking thing to me about the announcement is how the Oversight Board has become more than perhaps Zuckerberg et al. expected. Douek notes that the six-month consultation shifted Facebook’s idea of how much policy input the board should have from a little to a lot (she’s also written a paper on it), while researcher Quirin Weinzierl said it will ‘ultimately not be able to avoid expanding its mandate’ into policies around automated content and criminal activity on the platform. That shift in emphasis, if it does happen, is something the Director will no doubt have to carefully manage.

The reaction to the draft has been lukewarm, mainly because there’s not much that we didn’t know, but it is notable to me that Article 19 and EFF both made it clear that fair policies and solid implementation should be Facebook's priority. TechCrunch is also very scathing, calling it a 'cherry-picked appeal layer that will only touch a fantastically tiny proportion of the content choice’. You could say it's Oversight and out of mind.

We’ll no doubt return to the Oversight Board when the inaugural board members are announced. For now, just spare a thought for the incoming Director of Board Administration.

Bonus read: Six questions about Facebook's planned oversight body (FIPP)

Reuniting the banned

It’s not been a great couple of weeks for Twitch, with several popular streamers banned, some for more legitimate reasons than others.

Neither type of story is good: either Twitch is seen to be failing to enforce its own guidelines, or it enforces them correctly but is shown to have sexist and transphobic users. And as Mixer (Microsoft’s new game-streaming rival) gathers steam, where will this all end up?

Not forgetting...

There is a lot in this recent Guardian report about Facebook moderators in Berlin, but one bit stuck out: workers don’t feel they can go to a union for help (see EiM 18).

Revealed: catastrophic effects of working as a Facebook moderator

Exclusive: Job has left some ‘addicted’ to extreme material and pushed others to far right

Politicians #1: Violent government leaders can tweet as long as they stick to the rules, says a Twitter exec at a Senate hearing

Twitter exec says it's OK for autocrats to have accounts as long as they follow its rules

At a Senate hearing, a Twitter executive told lawmakers that accounts are removed only for violations of policies on their own platform.

Politicians #2: Facebook has explained why elected officials can break community guidelines and still not have their content removed

Facebook says it won’t remove politicians’ posts for breaking its rules - The Verge

Facebook has clarified that it will treat politicians’ speech as "newsworthy content" that can be left up even if it breaks rules. The policy doesn’t apply to advertisements.

Good reporting from Katie Notopoulos at BuzzFeed News, whose conversations with moderators from Accenture and Cognizant suggest the outsourced model is ’broken’. The craziest thing? Moderators marked an Onion article about John McCain’s tumour as ‘cruel’. RIP satire.

Newly Leaked Facebook Documents Show How The Company Sets Up Its Moderators To Fail

Muddled communications from Facebook and a barrage of constant updates make low-paid outsourced moderators’ jobs impossible.

A consortium set up to tackle extremist and terrorist content online, whose members contribute hashes to a shared database, will expand to include Amazon, LinkedIn and WhatsApp.

Amazon, LinkedIn, and WhatsApp join big tech anti-terror coalition for fighting “extremist” content online

This essay neatly sums up the challenges with Section 230. Verdict? The immunity it grants is too broad.

It’s Time for Platforms to Go Beyond Section 230 - OneZero

As the 2020 U.S. Presidential election approaches, it’s time to rethink online speech and the limits of Section 230.


Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.