Facebook responds
A Facebook New Zealand spokesman said, "We continue to automatically detect and prevent new uploads of this content on our platforms, using a database of more than 900 visually unique versions of this video. When we identify isolated instances of newly edited versions of the video being uploaded, we take it down and add it to our database to prevent future uploads of the same version being shared."
"One of the challenges we faced in the days after the Christchurch attack was a proliferation of many different variants of the video of the attack. People - not always intentionally - shared edited versions of the video, which made it hard for our systems to detect.
"Although we deployed a number of techniques to eventually find these variants, including video and audio matching technology, we realised that this is an area where we need to invest in further research."
"That's why we announced last week that we're partnering with The University of Maryland, Cornell University and The University of California, Berkeley on a US$7.5 million research piece to identify new techniques to detect manipulated media and distinguish between unwitting posters and adversaries who intentionally manipulate videos and photographs.
"This work will be critical for our broader efforts against manipulated media, including deep fakes - videos intentionally manipulated to depict events that never occurred. We hope it will also help us to more effectively fight organized bad actors who try to outwit our systems as we saw happen after the Christchurch attack."
Resisting change
The alleged gunman streamed his 17-minute attack on Facebook Live on March 15. It took the social network an hour to take the clip down: its automated safeguards failed, and it was ultimately alerted to the video's presence by NZ law enforcement.
Facebook says it has beefed up its filters since the attacks, and blocked more than 1.5 million attempts to upload the clip.
However, every few days since March 15, Feinberg has been able to locate copies of the clip on Facebook, Facebook-owned Instagram and, at times, Google-owned YouTube.
Facebook has so far resisted putting a slight delay on Facebook Live, or placing any universal restrictions on the service (such as YouTube's new requirement that a mobile user have at least 1000 subscribers before they are allowed to livestream).
But it has introduced a new policy that will see users who break "certain rules", including its dangerous individuals or groups policy, potentially barred from using the service.
The recent Christchurch Call summit in Paris, which sought ways to eliminate violent extremist content on social media, was called a good start by most commentators.
However, the refusal of the US to support the initiative - with the White House citing free-speech concerns - undermined its modest proposals and kept the pressure off Facebook.
PM: Chch Call research will address problem
A spokesman for Prime Minister Jacinda Ardern said "the clip is being cut and edited in ways that see it slip through the cracks of the social network's systems. That is what the Christchurch Call commitments are trying to solve. The shared research will address these issues."
The Paris summit saw tech giants Amazon, Facebook, Google, Microsoft and Twitter agree to collaborate with the 17 participating governments on research to prevent and remove violent extremist content.
Facebook is chipping in US$7.5 million for its aforementioned research partnership with the University of Maryland, Cornell University and UC Berkeley.
The spokesman agreed that was not enough money, but he noted that all the tech companies who attended would be contributing funds to the effort.
Facebook is so far the only company to quantify its contribution, he said.