The video, created by UK artists Bill Posters and Daniel Howe in partnership with advertising company Canny, shows the Facebook boss sitting at a desk, seemingly talking to a US news network.
"Imagine this for a second: One man, with total control of billions of people's stolen data, all their secrets, their lives, their futures," Zuckerberg's likeness says, in the video. "I owe it all to Spectre. Spectre showed me that whoever controls the data, controls the future."
It's a neat concept, but poorly synced audio and hammy dialogue provide obvious clues that it's a fake - meaning it was probably a pretty easy call for Facebook to leave the clip online.
A NZ Law Foundation-funded study released on May 21 highlights the risk of "deep fakes" on social media - which co-author Tom Barraclough said are increasingly sophisticated.
But Barraclough and his co-author Curtis Barnes also warned about knee-jerk reactions to the phenomenon.
Specific legislation targeting deep fakes, such as the Malicious Deep Fake Prohibition Act introduced to the US Senate last year, risks violating human rights, Barraclough told the Herald.
He argued NZ already has multiple laws and guidelines that cater to the risk - primarily the Crimes Act, which covers deception used for gain; the Harmful Digital Communications Act, which covers deception used for malice; and the Privacy Act, because "the wrong personal information is still personal information".
Barraclough said most of the time, "deep fake" technology is used playfully, or as satire.
Exhibit A for his human expression argument is local satirist Tom Sainsbury, who uses a face-swapping app for his political clips on Facebook - often targeting Simon Bridges or Paula Bennett.
Deep fake technology can be convincing - check out this clip of "Barack Obama" (warning: language) - or adept: watch this effort by US company DeepTrace, which puts Steve Buscemi's face onto actress Jennifer Lawrence's body to show off its chops.
The websites whichfaceisreal.com and thispersondoesnotexist.com also showcase deep fake technology.
But its ever-evolving nature makes it difficult for the likes of Facebook to trace and police.
"We have in-built biological trust in the data derived by our eyes and ears," Barraclough says.
But he and Barnes also credit the general public with some nous.
"People understand the limits to which what they see and hear through video and audio recording is only a partial representation of reality," he says.
The biggest danger, the researcher says, "is probably over-scepticism".
If something like the Jami-Lee Ross audio recordings of Simon Bridges were released in future, people might not believe it was genuine, he says.
He also fears that any legislation targeting deep fakes would hinder the ability of oppressed groups to find a multimedia voice on social media.