Thanks for reading — BW
Have a break, have a Twitter timeout
An interesting nugget in Jack Dorsey’s interview with NBC’s Lester Holt about his decision to give Alex Jones a timeout (watch a clip): apparently timeouts have been found to ‘impact and change people’s behaviour’.
Obviously, the Twitter CEO doesn't go into detail about how many people have received a timeout, the typical length of a timeout (it's anything from one day to a week) and the impact when those users return. And until he does, I’m sceptical about the benefits of read-only mode.
Until recently I headed up a team of community journalists at a UK news organisation, whose job it was to decide when people should take a break from commenting on the site. Anecdotally, we found that timeouts (one month long, issued after a prior warning) only emboldened users once their commenting rights were reinstated.
Also, a paper last year on online trolling (granted, different to the Alex Jones scenario but certainly related) found that banning users who violated community guidelines did nothing to address the situational factors, such as a person’s mood, that make them susceptible to trolling in the first place. What did the paper conclude? Design systems and platforms to mitigate negative behaviours.
Over to you Jack…
Kate was one of two New York Times journalists given a behind-the-scenes look at how Twitter are trying to make the platform safer. It's worth a read.
The scene that is set, of 18 executives spending an hour talking about dehumanising speech without reaching a conclusion, reminded me of a story told by a Google AdWords lead (I can't remember where I read it so don't have the link) who was asked to present to Larry Page and other execs about their plans for the forthcoming year. A disagreement ensued and Page asked who was ultimately responsible. Three people put up their hands. Page got up, left, and made them sit in the room until they decided who was the decision maker.
It helps when one person is accountable.
Moderating in Myanmar
In April, Mark Zuckerberg told US senators that hate speech in Myanmar was a priority and that more people would be put in place to review content.
Four months later, Reuters, working with the Human Rights Centre, yesterday published an investigation that found over 1,000 posts, comments and images attacking Myanmar's Muslims.
Some of the comments left up on the platform are staggering. But it's the scale of the operation, for a country with over 18 million Facebook users, which is most shocking: since 2013, just 60 people based in Kuala Lumpur have reviewed content from Myanmar. In April, three Burmese speakers were added to the team in Dublin.
Those numbers don’t add up.
Twitter chief executive Jack Dorsey said he is rethinking core parts of the social media platform
Judging by comments in yet another Dorsey interview (how many is that recently?), expect less attention paid to policy and more to product.
Maybe you’re glad Jones is gone, but content moderation also hurts human rights.
David Greene, civil liberties director at the EFF, warns against a select number of platforms bearing responsibility for moderation without a common framework in place. The Santa Clara Principles, he says, would ensure policies are applied clearly and consistently
While leading engineering at Instagram, James Everingham spotted a problem: the inevitable dip in transparent decision-making
With transparency a hot topic at the moment, it was interesting to read how the Head of Engineering at Instagram spent six months improving transparency in his team. The principles could just as easily be applied to users too.
Everything in Moderation is a weekly newsletter about content moderation and the policies, products, platforms and people shaping its future.