We already know Big Brother is watching.
Tauranga City owns more than 400 CCTV cameras, a small percentage of which focus on parking. Parking officers can view offences following a complaint and use footage to issue tickets.
Earlier this year, parents at Tauranga Primary School complained after CCTV footage was used to issue citations for parking infringements.
A school newsletter notified parents that Tauranga City Council had installed CCTV cameras on 5th and 6th Avenues, and warned of the traffic infringement fines parents could face.
A parent who got a $60 fine for parking behind the kerb told NZME, "We're being treated like a bunch of 15-year-olds ... We have to pick our kids up from school so we have to park somewhere."
Cities throughout Aotearoa use thousands of these cameras. Police say CCTV footage helped solve the murder of British backpacker Grace Millane after her disappearance in 2018.
Earlier this month, an official information request prompted New Zealand Police to reveal it was trialling controversial facial recognition technology, alongside other investigative tools such as drones that can send live footage to patrols, a superfast system to spot suspects in CCTV feeds and a cellphone scourer with facial recognition capability, according to RNZ.
NZME reported Commissioner Andrew Coster ordered the stocktake in May only after RNZ exposed that police had trialled an algorithm that searches social media for face matches without telling the Government, the Privacy Commissioner or the public.
Digital tracking expert Dr Andrew Chen, a research fellow at Koi Tū: The Centre for Informed Futures, said the new report lacked critical detail.
"The stocktake shows that there are a whole bunch of different projects that police have been working on, these new technologies that they've been utilising, that we haven't really heard much about in the past," Chen said.
"But I don't do anything to be worried about," you say. "Why should it matter?" On the plus side, facial recognition could catch the thugs who stole tools from your work van; or the rageaholic who punched your son outside a bar.
But lack of regulation - the fact laws lag behind technology - is troubling. It means we don't know how our data will be used and whether we have any recourse if we feel we've been wronged.
Will we get a citation for every parking infraction? Will artificial intelligence recognise what we buy and where (even without a loyalty card), and send our data to corporates so they can find new targets for their products? Would facial recognition be used during any future lockdown to ensure we're not gathering with people outside our bubbles?
Under current legislation, we don't know. We can't rely on the goodwill of the Government, local councils and police to do the right thing. Past experience tells us mistakes will happen, as will abuse.
In September, a mum of two girls at Otumoetai College was appalled to discover the school had installed wall-mounted cameras inside student toilets. She claimed the cameras could capture footage from inside the cubicles, though school officials say the cameras cover only the public areas of the toilets and are there to ensure students' safety.
When it comes to policing, Nessa Lynch, an associate law professor at Victoria University of Wellington, said the use of facial recognition (FR) is controversial worldwide. Writing in The Spinoff in August, she said, "Unlike other biometric indicators used in policing, such as DNA and fingerprints, automated collection and matching of facial images is generally not covered by legislation. Facial images may be collected at a distance, without the person's consent or even their knowledge."
Questions have been raised about higher levels of misidentification among minorities. The Washington Post last December reported Asian and African American people were up to 100 times more likely to be misidentified than white men.
Lynch says identity matching happens in myriad ways, drawing on existing police databases, other state databases, images supplied by the private sector, or open-source data.
"As my research collaborators have found, the use of live automated FRT in public places has significant implications for privacy rights as well as concerns around a chilling effect on rights to freedom of expression and lawful protest," writes Lynch.
As FR use grows, it's easy to imagine a New Zealand where some of us decide not to turn out for an event or protest for fear our face will be used in ways we don't know about, understand or agree with.
We need laws protecting our likenesses from misuse and giving us redress if we feel our rights have been violated. Big Brother is watching. How are we going to stop him from taking surveillance too far?