Facebook, Twitter: social media is built for sharing anything. Including terror.

The devastating news that poured in as I woke this morning, of a terror attack on innocent civilians, was difficult to believe. As I write this, 49 people have been killed and more than 20 others seriously injured in shootings at two mosques in Christchurch, New Zealand.

The gunman identified himself as 28-year-old Australian-born Brenton Tarrant.

Dressed in a military-style camouflage outfit and carrying an automatic rifle, he started shooting people in the Al Noor mosque shortly after 1.40pm local time (12.40am GMT), live-streaming the whole thing on Facebook Live while thousands watched.

Around the world, the footage has been replayed over and over, with major news outlets in the UK, including the Mirror and the Daily Mail, re-posting it for readers to see.

How was this allowed to happen, to stream live, and to stay online? Well, let’s go back to a quote from Mark Zuckerberg in 2007.

“We don’t check what people say before they say it, and frankly, I don’t think society should want us to. Freedom means you don’t have to ask for permission first, and by default, you can say what you want.”

Is that right? Is that really Facebook’s excuse for streaming and promoting terrorism by giving it a place on its platform? Well, it isn’t good enough.

As a global platform, Facebook has the power, and the moral obligation, to block offensive, dangerous and life-ending content.

“New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we removed both the shooter’s Facebook account and the video,” said Mia Garlick, a Facebook representative in New Zealand.

“We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware. We will continue working directly with New Zealand Police as their response and investigation continues.

“Our hearts go out to the victims, their families and the community affected by this horrendous act.” So… Facebook IS checking what people are saying? It seems it does, as and when it chooses, but not when we need it to protect us.

And it isn’t just Facebook. Days before the attack, the Twitter account @brentontarrant posted pictures of one of the guns apparently later used in the attacks.

It was covered in white lettering, featuring the names of others who had committed race- or religion-based killings. It included the phrase: “Here’s Your Migration Compact.”

The photos appeared on the suspect’s since-deleted Twitter account. Yes, since deleted… but not picked up on Wednesday, Thursday or Friday.

YouTube has also been quick to remove content today, saying: “Our hearts are broken over today’s terrible tragedy in New Zealand. Please know we are working vigilantly to remove any violent footage.”

But how could the footage be posted in the first place, go live, and stay live? Content like this should not have a place on social media, just as it shouldn’t have a place in society.

If we really want platforms that are open to positive debate, conversation and content… we have to be able to control what that content looks like.

Algorithms aren’t currently clever enough to pick this content up, while actual takedown requests are handled by real people, and not enough of them.

If Facebook and other social platforms are to take these events seriously, they need to put their huge sums of money where their mouths are… and deliver real change.

The events of today add to a growing trend of social-media-promoted terror. Action must be taken, and we must rally to see change.