There's no easy out for Mark Zuckerberg.
Something terrible happened this weekend.
A rogue gunman used Facebook Live to broadcast three videos, proclaiming and then committing a murder.
The videos were viewed by millions of people, watching in disgust (we hope).
And Facebook was powerless to do anything.
The videos of the tragic act sat on Facebook for over two hours before they were finally removed.
Sadly, we’re not terribly surprised: in January The Memo predicted that more violent acts were inevitable in the world of Facebook Live.
What we didn’t predict was Facebook’s pathetically weak response:
“We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better.”
The company hinted that one day AI might help its human moderators cope with the photos, videos and live streams broadcast by its 1.8 billion users.
Today Facebook is powerless to stop such acts of barbarism being broadcast on its network.
“You can never ‘solve’ a moderation problem, only reduce the worst offences,” Charlie Beckett, media professor at the London School of Economics, told The Memo.
“Obviously Live creates more problems in flagging and removing bad content, but the principle remains the same.”
Part of the problem, if you didn’t realise, is that Facebook still relies on an army of thousands of human moderators to manually check every piece of media that’s reported.
It’s a gruelling job, as a harrowing documentary on the work of these moderators has shown.
The other problem is that Facebook’s push into live video is making moderation even harder to manage – passing remarks about violence are challenging to detect, especially when a video is being broadcast live.
So can Facebook ever really solve its ‘Live’ problem?
Probably not, and even Facebook itself is reportedly rethinking its ‘Live’ strategy – given the acts of violence that have been broadcast in recent months.
As Beckett points out, even mainstream broadcasters, with their editorial standards and checks, still make mistakes.
Sometimes offensive content is the price we pay for revolutionary new communications platforms, but in this case I hope Facebook quickly realises its ‘Live’ vision is flawed and unmanageable.