Growing anger at social media companies after the Christchurch terror attack is galvanising into action, with moves towards a global crackdown expected to be revealed by the Government in the coming weeks.
Steps towards global regulation of social media companies to rein in harmful content look likely, with the Government set to take a lead role in an international initiative, the Herald has learned.
The will of governments to work together to tackle the potentially harmful impacts of social media would have only grown stronger in the wake of the terror attacks in Sri Lanka, where Facebook and Instagram were temporarily shut down to stop the spread of false news reports.
Following the Christchurch terror attack, Prime Minister Jacinda Ardern has been working towards a globally co-ordinated response that would make the likes of Facebook, YouTube and Twitter more responsible for the content they host.
The Government has been talking to global partners and the Herald understands an announcement is due soon.
A spokeswoman for the Prime Minister would not comment on the matter last night.
Currently, multinational social media companies have to comply with New Zealand law, but they also have an out-clause - the safe harbour provisions - that means they may not be legally liable for what users publish on their sites. These provisions were not invoked in relation to the livestream video of the massacre in Christchurch.
Other countries, including Australia, are taking a more hardline approach that puts more onus on these companies to block harmful content, but the Government has decided a global response would be more effective, given the companies' global reach.
Facebook has faced a barrage of criticism for what many see as its failure to take down the livestream immediately and to minimise its spread; the company removed 1.5 million videos of the attack within 24 hours.
Facebook took down the video 12 minutes after the livestream had ended, acting on a notice from police rather than on reports from Facebook users or detection by its own algorithms, which reportedly failed because the footage did not have enough gore.
Ardern has said this wasn't good enough, saying shortly after the Christchurch terror attack: "We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published."
Privacy Commissioner John Edwards has also been scathing, calling Facebook "morally bankrupt" and saying it should take immediate action to make its services safe.
Fifty people were killed in the twin mosque attacks on March 15 and 39 others were wounded.
Netsafe chief executive Martin Cocker said that existing laws and protections were not enough to stop the online proliferation of the gunman's video.
He doubted that changing any New Zealand laws would be effective, and echoed Ardern in saying that a global solution was ideal.
"I don't think there's any regulation that New Zealand could write that would have made any practical difference on March 15," Cocker said.
"If we start to get consistent responses in terms of regulation from countries around the world where Facebook and Google are popular, then New Zealand can join that international response and we're more likely to see those companies meet the legal standard being set.
"The problem is that it's hugely complicated. What are the chances that we get all the countries in the world to agree on anything when it comes to regulation?"
Facebook boss Mark Zuckerberg, in an open letter, has previously called on governments to regulate online content hosts.
Cocker added that nothing was stopping Facebook from self-imposing stricter standards, but there was a "fair way to go" before its algorithms became advanced enough to recognise content like the gunman's video.
"The internet is full of harm and it's full of technology that can be used to cause harm. It's not going to be easy to solve these things.
"You could close down Facebook, but you wouldn't get rid of livestreaming problems. You could insist on having a duty of care [as proposed in the UK], but that wouldn't get around jurisdictional problems. It is a wicked problem and it will take some time to solve."
The UK is currently considering a white paper on online harms that proposes a "statutory duty of care" for online content hosts.
Rules would be set up and enforced by an independent regulator, which would require illegal content to be blocked within "an expedient timeframe". Failure to comply could lead to substantial fines or even the service being shut down.
Released earlier this month, the white paper's ministerial foreword specifically mentions Christchurch: "The tragic recent events in New Zealand show just how quickly horrific terrorist and extremist content can spread online."
In Australia, a law was recently passed that requires hosting services to "remove abhorrent violent material expeditiously" or face penalties of up to three years' jail or fines in the millions of dollars.
Germany also has a law that gives social media companies 24 hours to remove "manifestly unlawful" posts such as hate speech, or face a fine of up to €50 million.
And the European Union is considering regulations that would give social media platforms an hour to remove or disable online terrorist content.
While Ardern has ruled out a model such as Australia's, changes to New Zealand law could still happen following the current review of hate speech.
In New Zealand multiple laws - including the Harmful Digital Communications Act, the Human Rights Act, and the Crimes Act - dictate what can and cannot be published on social media platforms.
Even though Facebook was criticised for how long it took to respond to the gunman's video, the safe harbour provisions in the Harmful Digital Communications Act mean that it could have taken days and still would have been within the law.
The provisions provide freedom from civil or criminal liability - for matters such as copyright infringement, defamation, or inciting racial disharmony - as long as the online content host takes reasonable steps once a complaint is received.
Those steps include a 48-hour window after receiving a complaint to seek a response from the content's author, and then another 48 hours to consider that response before deciding whether the material should stay up.
International regulation could lead to that response-time window narrowing.
Cocker pointed out that Facebook acted to block the video rather than rely on the safe harbour provisions, and said he was not aware of any instance in which the provisions had been used.
But he said they were important because they recognised that social media platforms were not publishers in the traditional sense, such as a newspaper that controls its published content.
"The provisions enable online content hosts to be properly engaged in safety processes, but without unfairly penalising them for doing so.
"It gives them a window to ascertain that harmful content is on their site and take action, and during that window, they are not legally liable."
NZ looks to global action to crack down on social media firms
• Facebook was heavily criticised for failing to block the gunman's video quickly enough on March 15
• In NZ, social media companies have safe harbour provisions (though these weren't used on March 15) that offer freedom from civil or criminal liability if they follow a certain process when dealing with complaints. This means days can pass before content needs to be taken down.
• Countries like Australia and Germany have taken a harder line on social media platforms, and the European Union is considering rules that would require terrorist content to be removed within an hour
• NZ wants social media platforms to be more responsible for the content they host, but rather than changing domestic laws it is pursuing a global approach.