A new dedicated team will target violent extremist content online and prosecute those uploading it, filling a gap that was laid bare following the March 15 terrorist attack.
Prime Minister Jacinda Ardern and Internal Affairs Minister Tracey Martin announced $17 million over four years for the new team during a press conference in Parliament today.
Ardern has previously talked about how the March 15 attack exposed the flaws in New Zealand's domestic legislation and capability.
While the voluntary guidelines in the Christchurch Call would have a bigger and more immediate global impact, she has said that New Zealand laws and capability also needed to be changed.
Previously, the 13 investigators in the Department of Internal Affairs' censorship compliance unit have focused on child abuse and exploitation images, and could only look at online violent extremism at the expense of that work.
The unit works with police, customs and international law enforcement partners to block objectionable material, which is illegal to possess or distribute.
It has lacked a dedicated violent extremism team, but will now have about 17 new full-time employees to fill that void.
The team will be tasked with finding such content and stopping it from spreading, as well as educating users to prevent such content from being uploaded in the first place.
The new team will also work with ISPs and platforms to take down and block objectionable material.
"The changes mean we can target this material in a similar way to how we target child sexual exploitation material, by working quickly with online content hosts to remove it as quickly as possible," Martin said in a statement.
The changes will likely lead to more prosecutions of those who actively engage in uploading and spreading content like the March 15 video or manifesto.
The lack of a quick response to March 15 was a flaw that Ardern has already highlighted, saying there was "quite a lag" before the video footage and manifesto were made illegal.
It took three days for the video footage and the alleged gunman's manifesto to be deemed objectionable, though the Chief Censor's response was hastened to one day for last week's footage of the attack on a German synagogue.
There are currently no plans to review the definition of objectionable, but the Government is also reviewing hate speech laws, and workshops will soon take place to discuss gaps in New Zealand laws and regulations.
To date it has mostly been incumbent on global social media platforms to take down violent content voluntarily in line with their own guidelines, rather than being compelled to by New Zealand law.
Nor is there any New Zealand law with a dedicated take-down time for objectionable online material, or one that imposes on social media giants a statutory duty of care, which would make them liable for the content they host.
There is a 48-hour "safe-harbour provision" in the Harmful Digital Communications Act that can apply to online content, but it is not specifically about violent content and is a process for dealing with complaints primarily about online bullying.
Law changes to fill the gaps in New Zealand law are expected before the end of this parliamentary term.
Ardern said it was important to maintain civil liberties, including freedom of speech, but it was also important to protect New Zealanders from objectionable content.
"We need to meet that challenge as a country and as a global community."
Associate Professor David Parry, head of computer science at AUT, welcomed the announcement.
"A boost to the immediate response team is to be welcomed, along with reducing the time to make decisions. Clarity on the law is also valuable, in particular to ensure that naïve adolescents are not treated more harshly than active extremists simply because the former are easier to detect.
"Regulation is going to be difficult, partly because it is effectively impossible to stop people accessing overseas sites and partly because the self-publishing model is key to the existence of many of these companies."