April Fools: Facebook continues to have the last laugh
You could be forgiven for believing that technology (and in particular social media) has gotten away from society.
What was once built as a way to enhance communication and bring people closer together has now become the opposite.
One need look no further than the March 15 Christchurch mosque shootings, which saw Facebook struggle to contain the spread of graphic live-stream footage.
Despite having more tools at its disposal than ever before, Facebook allowed the original live stream to roll for 17 minutes before it was shut down.
17 minutes. Is anyone at the wheel?
“We have heard feedback that we must do more – and we agree,” Facebook’s COO Sheryl Sandberg said.
She added, “In the immediate aftermath, we took down the alleged terrorist’s Facebook and Instagram accounts, removed the video of the attack, and used artificial intelligence to proactively find and prevent related videos from being posted.
“In the wake of the terrorist attack, we are taking three steps: strengthening the rules for using Facebook Live, taking further steps to address hate on our platforms, and supporting the New Zealand community,” Sandberg concluded.
So now, almost one month later, what has Facebook done in the ‘aftermath’ of this tragedy?
New Zealand’s Privacy Commissioner John Edwards reached out to the social media giant earlier this week for an update on its live streaming platform.
In his query, Edwards asked, “Is there anything that Facebook has done in relation to the live-streaming service since March 15, which if it had done before [then] would have either prevented the live-stream occurring or ensured it could have been more promptly detected and either halted, or referred for human review?”
The response? “No.”
Wait, that can’t be right. Surely they’d have changed something?
“To their credit, Facebook did answer my direct question,” Edwards tweeted to his followers.
If Facebook is awarded brownie points for answering a simple question, then our expectations need a major overhaul.
Last Saturday, in an open letter, Facebook’s founder and CEO Mark Zuckerberg asked governments to take a ‘more active role’ in combating hate speech.
“From what I’ve learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.”
But this is the same Zuckerberg who allowed Cambridge Analytica to acquire data on 30 million Facebook user profiles, about which he said, “It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.”
Historically Facebook has not waited for governments to catch up. It’s a disruptor at heart. It can set the example.
17 minutes was 17 minutes too long; it’s time to walk the talk.