Police are using facial recognition technology without the formal framework their own advisers urged them to adopt to avoid causing “significant harm” and being seen as “cavalier about its oversight of technology”.
The lack of oversight systems for facial recognition technology was “not trivial stuff”, said a civil liberties campaigner, who agreed with the police assessment that it appeared “cavalier”.
While the facial recognition technology is not used to scan live CCTV camera feeds, police have introduced the new tool to match unknown suspects and people of interest against the databases of photographs they hold - including those of people previously arrested and, in cases of firearms-related offending, firearms licence holders.
The upgraded technology can recognise not only facial features but also scars and tattoos.
A Privacy Impact Assessment carried out before the technology was introduced a year ago offered stark warnings about the risks and recommended a series of steps, including auditing how the system was being used.
Police headquarters has confirmed to the Herald that the technology is in use and that the steps highlighted in the PIA have yet to be implemented.
In two separate instances, police officers falsely recorded vehicles as stolen so as to track the people to whom they were linked. The misuse came after a police review of automatic number plate recognition technology that urged headquarters to build public confidence in the technology by developing accountable and transparent systems.
The discovery - revealed by the Herald last year - exposed the lack of those systems and led to the Minister of Police at the time, Chris Hipkins, calling for an audit into its use.
The audit was completed last year, provided to Hipkins’ successor Stuart Nash in late January and reviewed by the police governance group this month. The results will be released publicly after the Office of the Privacy Commissioner and others are briefed.
New Police Minister Ginny Andersen told the Herald: “I have been advised police are working on developing a technology framework for the assessment and management of new technologies and new uses of existing capabilities.
“I am expecting to be briefed on this shortly. I have the expectation that Police will continue to manage and audit the technologies they use appropriately.”
Police have been alerted to the risks involved in embracing high-tech solutions to crime fighting; an expert report on facial recognition provided in November 2021 - and updated in March 2022 - warned of the potential for problems with “accuracy and bias”.
The report “Facial Recognition Technology: Considerations For Use In Policing” was followed in February 2022 by an updated Privacy Impact Assessment focused specifically on upgrading the police’s Image Management System to allow for facial recognition.
The upgrade would allow police to scan the photograph databases they hold when searching for individuals suspected of committing, or being connected to, crime. Not only would it scan facial features, it was also developed to recognise scars and tattoos.
The PIA went on to make seven recommendations, urging police to build a system that limited access to those who needed to use it, with specific “rules and reporting tools” to ensure activity was “recorded and reportable for audit purposes” in a way that “acts as a deterrent to misuse”.
When asked if police had followed all recommendations, a spokeswoman said: “Yes, police are working to address the recommendations.”
However, when addressing each recommendation, the spokeswoman said the formal audit process intended to limit who could access the system was “being developed” across the five levels of access.
In relation to checks on how the system was being used, she said “ad hoc audits are underway” by the manager of the unit. The audit “capability is being built up” and the results of “ad hoc” audits were “not publicly available”.
NZ Council for Civil Liberties chairman Thomas Beagle said: “To quote the police, they sound unfortunately cavalier.
“It seems ridiculous they are rolling out these systems without setting up the oversight systems they need to monitor them. You shouldn’t be allowed to go live with a system without having the oversight and governance systems in place to monitor that.”
Beagle said it could speak to a disconnect between those who wrote the policy for the systems and those who put the systems in place.
Facial recognition and automatic number plate recognition was “not trivial stuff”, he said.
The PIA urged police to carry out proactive audits to “reduce the potential harm that may arise from misuse” and it should be “designed to detect misuse of data before the misuse causes harm”.
There was also no governance group established - as recommended - although it was planned. The spokeswoman said “discussions over the constitution of such a group are on-going”.
The PIA had said there needed to be “an appropriate governance group that receives regular reports detailing the effectiveness of the system and provides assurance that the operation of the system remained ethical and lawful”.
It said the lack of governance risked a lack of oversight “to ensure that controls remain fit for purpose, that the tool remains lawfully used and that the system continues to provide a benefit to policing and contributes to keeping the public safe”.
“Without ongoing governance oversight it is possible that the system may fail to deliver a safe and defensible service or its use is inadvertently widened beyond the current stated purpose, known as function creep.”
Scrutiny of the system from outside police could see “police … seriously criticised for not establishing governance over the system” which would be “unacceptable … as police might be seen as potentially cavalier about its oversight of technology, an unacceptable rhetoric for a law enforcement agency”.
The headquarters spokeswoman said there was also no communication plan, which had been recommended to reduce the overall risk. “These will be developed once the governance group are established,” she said.