
📌 Macron's false Facebook dawn

The week in content moderation - edition #12

Hello to lots of new faces*, including folks from Technology Review, Nieman Lab, Vox Media, Condé Nast, Deutsche Welle, City Lab, Wall Street Journal, The Years Project, Popula, the BBC and my housemate Steve, who I promise signed up of his own volition.

Do say hi and tell me about what you do - it’s nice to know a little about who’s on the receiving end of this missive. I’ve included a short background to EiM at the bottom of this week's edition in case people are curious how it started.

The conference I was at in Chicago last week (which I’d really recommend) meant I didn’t get round to sending an edition. Upside: there’s lots to get to this week.

Thanks for reading — BW

*I realise there are inherent issues with describing people who sign up to a newsletter as ‘faces’ but I like the phrase too much not to use it.

Emmanuel's moving in

The way Emmanuel Macron talked about the fact that French government officials would spend six months poking around Facebook, you’d think it fixed the whole problem.

Yes, it might be the first time anyone has got Zuckerberg to agree to this level of access but ‘innovative experiment’, ‘unprecedented field experiment’ and ‘world first’ hardly undersold the initiative.

It came as no surprise then, just a few hours after Mr Macron's speech (full text here) was reported, to see a very different picture starting to emerge. Contact with officials would take the form of ‘regular’ meetings designed to help the regulators ‘familiarise (themselves) with the tools and processes set up by Facebook to fight against hate speech’ (that’s from a Facebook spokesman). So not an embed, as many reported. And obviously no numbers on how many regulators would travel from Dublin to California to the Philippines (my bet is fewer than half a dozen).

There are several other holes, not least that the visits/meetings should really be taking place now, not in two months' time. If you’ve ever had a school inspection (the system in the UK is known as Ofsted; in France they have IGAENR), you’ll know that the whole school improves during that one- or two-week period, meaning inspectors don’t get an accurate idea of what teaching is normally like (and 'outstanding' reviews go up).

To combat that, Ofsted started introducing drop-in inspections so teachers didn’t have months to prepare special lessons or implore children to be well-behaved. Which is exactly what Facebook is going to end up doing between now and January, internally and externally. If French regulators wanted to come up with impactful proposals, they’d have gone in straight away.

There is a risk too that these visits/meetings will incentivise Facebook in the wrong way, like a student cramming for an exam. It's something Ofsted has had to review after its visits were found to create a ‘teach-to-the-test’ mentality, which has proven damaging to children.

Imagine if Mr Macron's governmental visit turns out to be a false dawn and ends up having a detrimental effect on the content moderation discussion. Ol' Emmanuel would be less smug then.

Taking an artisanal approach

I’ll be honest, I'm yet to get through all of Data & Society’s Content or Context Moderation? report, out last week. But even the short passages I’ve read suggest that companies/brands that care about their users (Medium, Patreon, Vimeo) take a more artisanal approach to moderation, which I like.

If you've read it and have thoughts, let me know. And if you don’t want to read it all, Nieman Lab has done a synopsis here.

Not forgetting

If you’re going to rely on ‘industrial’ moderation, as the Data & Society report termed it, at least make sure some checks and balances are put in place. Tumblr didn’t, and child pornography slipped through its system, leading its app to be removed from the App Store.

The reason Tumblr vanished from the App Store: Child pornography that slipped through the filters

A month or so ago, Instagram announced it would beef up its AI moderation capabilities. That’s now live and in place as of Tuesday. Worth watching to see if anything changes during your mindless scrolling...

Instagram is Deleting Fake Followers and Likes


Turns out even the so-called good guys have questionable backgrounds. Freedom from Facebook, a supposed groundswell campaign, has been funded to the tune of $400,000 by a guy called David Magerman, who believes Zuck has too much control over world communication.

David Magerman: 5 Fast Facts You Need to Know

David Magerman is a millionaire former hedge fund executive who sued Robert Mercer and is funding a campaign to break up Facebook.

The background

Everything in Moderation started after the flood of Alex Jones news this summer and a realisation that there wasn’t one single place I could go to satisfy my interest in content moderation and the policies, people and platforms that made it happen.

In my last job, I headed up the audience function at The Times (including its moderation team) and was confronted with questions of policy, anonymity, hate speech and increasing meaningful involvement (while making it pay). So I decided to give this newsletter a go myself.

If there’s anyone you think would like to read it, I'd be eternally grateful if you forwarded it to them or shared this link via any (reputably moderated) social network.

Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.