Foodstuffs has said its system deleted images automatically and immediately unless the image matched an image in the same store’s record of 'offenders and accomplices'. Photo / Tero Vesalainen, Getty Images, File
Foodstuffs says it’s comfortable using facial recognition technology despite concerns raised by some consumer and civil liberties groups.
Consumer NZ said the Foodstuffs North Island trial raised issues about privacy, accuracy and racial discrimination.
It said the trial gathered no information on the ethnicity of individuals mistakenly identified by the technology, and raised concerns about the way statistics on mistaken identity were reported by the retailer.
Some studies abroad have raised questions about whether facial recognition (FR) is less accurate at identifying darker-skinned people.
But Foodstuffs, the New World and Pak’nSave operator, said FR was just one of the solutions it used to keep staff and customers safe.
“The software we’re using was designed and developed in Australia, by an Australian organisation. They use multiple datasets from around the world including locally acquired data from Polynesian, Indigenous Australian, Asian, African American and American groups,” Foodstuffs added.
“Where the system detects a match, two of our specially trained team members need to then agree it’s a match before an alert is acted on. Our people always have the final say, as is the case now in stores that are not using FR.”
A September study on FR by the US Commission on Civil Rights found it was likely to produce more “false positives for certain demographic groups, specifically Black people, people of East Asian descent, women, and older adults than over the entire population”.
Foodstuffs said its approach to deciding on matches was consistent with that US civil rights study, with FR acting as a lead generator and human intervention required as a follow-up.
On images of minors, Foodstuffs said: “The most recidivous minors are well known to our store teams. No minors (under the age of 18) will be entered into any FR system. If there’s any doubt regarding an offender’s age, our teams are trained to err on the side of caution and won’t enrol them.”
A spokesman added: “During the trial, the store datasets of enrolled offenders were reviewed as a part of the trial evaluation and no minors were identified.”
On whether FR carried risks of racial bias, Foodstuffs said that when the system detected a match, two specially trained team members would still need to agree a match had been made before any FR alert was acted on.
It said evaluation and research firm Scarlatti helped design the trial and was independently evaluating it.
It said the 25 stores in the trial would keep using FR with the same privacy protocols and processes used during the trial.
“The list of stores using FR will continue to be publicly available on our website and the stores will continue to have signage notifying of FR use at their entrances.”
New Zealand Council for Civil Liberties chair Thomas Beagle said even if FR accurately avoided false positives for ethnic minorities, it was still problematic.
“We’re actually highly concerned about it ... It captures who they are, where they are at a particular time, who they’re with.”
He said in contrast to old-fashioned CCTV, facial recognition captured far more data about people.
“Data is just so much easier to use, manipulate and abuse.”
Professor Alex Sims, a University of Auckland commercial law and blockchain expert, said FR systems broadly carried a risk of being hacked, so quick deletion of footage was preferable to long retention periods.
Foodstuffs has said its system deleted images automatically and immediately unless the image matched with an image in the same store’s FR system record of “offenders and accomplices”.
Sims said the lines between CCTV, facial recognition and artificial intelligence were increasingly blurred.
“People are sleepwalking into a lot of this.”
Sims said FR broadly raised issues around informed consent.
She said that in practice, most people who drove to a store, got out of the car, stepped inside the shop and saw a sign disclosing the use of FR would probably not turn away.
But if a company disclosed FR location with more prominent signage or on its website, it might be argued consumers had more choice.
“There’s a growing creep of surveillance,” she added.
Privacy Commissioner Michael Webster in April started his inquiry into Foodstuffs North Island’s FR trial.
“There is no known other current use of facial recognition technology in the retail sector in New Zealand,” Webster said at the time.
“Its use generally across New Zealand to scan and identify an individual in real-time and compare them against a database of faces is rare.”
Also in April, the Commissioner unveiled a draft for a new Biometric Processing Privacy Code for public consultation.
Law firm MinterEllisonRuddWatts said the code aimed to set rules on how businesses and organisations could collect, use, store, and disclose biometric information.
That included physiological and behavioural biometrics and biometric samples, such as fingerprints, facial recognition data, and voice recordings.