"It's very disturbing to know that no matter what I'm doing at that moment, someone might be scanning my picture to try to find someone who committed a crime." Photo / Sarah Blesener, New York Times
With little oversight, the N.Y.P.D. has been using powerful surveillance technology on photos of children and teenagers.
The New York Police Department has been loading thousands of arrest photos of children and teenagers into a facial recognition database despite evidence the technology has a higher risk of false matches in younger faces.
For about four years, internal records show, the department has used the technology to compare crime scene images with its collection of juvenile mug shots, the photos taken at an arrest. Most of the photos are of teenagers, largely 13 to 16 years old, but children as young as 11 have been included.
Elected officials and civil rights groups said the disclosure that the city was deploying a powerful surveillance tool on adolescents — whose privacy seems sacrosanct and whose status is protected in the criminal justice system — was a striking example of the Police Department's ability to adopt advancing technology with little public scrutiny.
Several members of the City Council as well as a range of civil liberties groups said they were unaware of the policy until they were contacted by The New York Times.
Police Department officials defended the decision, saying it was just the latest evolution of a long-standing policing technique: using arrest photos to identify suspects.
"I don't think this is any secret decision that's made behind closed doors," the city's chief of detectives, Dermot F. Shea, said in an interview. "This is just process, and making sure we're doing everything to fight crime."
Other cities have begun to debate whether law enforcement should use facial recognition, which relies on an algorithm to quickly pore through images and suggest matches. In May, San Francisco blocked city agencies, including the police, from using the tool amid unease about potential government abuse. Detroit is facing public resistance to a technology that has been shown to have lower accuracy with people with darker skin.
In New York, the state Education Department recently told the Lockport, New York, school district to delay a plan to use facial recognition on students, citing privacy concerns.
"At the end of the day, it should be banned — no young people," said City Councilman Donovan Richards, D-Queens, who heads the Public Safety Committee, which oversees the Police Department.
The department said its legal bureau had approved using facial recognition on juveniles. The algorithm may suggest a lead, but detectives would not make an arrest based solely on that, Shea said.
Still, facial recognition has not been widely tested on children. Most algorithms are taught to "think" based on adult faces, and there is growing evidence that they do not work as well on children.
The National Institute of Standards and Technology, which is part of the Commerce Department and evaluates facial recognition algorithms for accuracy, recently found that the vast majority of more than 100 facial recognition algorithms had a higher rate of mistaken matches among children. The error rate was most pronounced in young children but was also seen in those ages 10-16.
Aging poses another problem: The appearance of children and adolescents can change drastically as bones stretch and shift, altering the underlying facial structure.
"I would use extreme caution in using those algorithms," said Karl Ricanek Jr., a computer science professor and co-founder of the Face Aging Group at the University of North Carolina-Wilmington.
Technology that can match an image of a younger teenager to a recent arrest photo may be less effective at finding the same person even one or two years later, he said.
"The systems do not have the capacity to understand the dynamic changes that occur to a child's face," Ricanek said.
Idemia and DataWorks Plus, the two companies that provide facial recognition software to the Police Department, did not respond to requests for comment.
The New York Police Department can take arrest photos of minors as young as 11 who are charged with a felony, depending on the severity of the charge.
And in many cases, the department keeps the photos for years, making facial recognition comparisons to what may have effectively become outdated images. There are photos of 5,500 individuals in the juvenile database, 4,100 of whom are no longer 16 or younger, the department said. Teenagers 17 and older are considered adults in the criminal justice system.
Police officials declined to provide statistics on how often their facial recognition systems provide false matches, or to explain how they evaluate the system's effectiveness.
"We are comfortable with this technology because it has proved to be a valuable investigative method," Shea said. Facial recognition has helped lead to thousands of arrests of both adults and juveniles, the department has said.
Mayor Bill de Blasio had been aware the department was using the technology on minors, said Freddi Goldstein, a spokeswoman for the mayor.
She said the Police Department followed "strict guidelines" in applying the technology and City Hall monitored the agency's compliance with the policies.
The Times learned details of the department's use of facial recognition by reviewing documents posted online this year by Clare Garvie, a senior associate at the Center on Privacy and Technology at Georgetown Law. Garvie received the documents as part of an open records lawsuit.
It could not be determined whether other large police departments used facial recognition with juveniles because very few have written policies governing the use of the technology, Garvie said.
New York detectives rely on a vast network of surveillance cameras throughout the city to provide images of people believed to have committed a crime. Since 2011, the department has had a dedicated unit of officers who use facial recognition to compare those images against millions of photos, usually mug shots. The software proposes matches, which have led to thousands of arrests, the department said.
By 2013, top police officials were meeting to discuss including juveniles in the program, the documents reviewed by The Times showed.
The documents showed that the juvenile database had been integrated into the system by 2015.
"We have these photos. It makes sense," Shea said in the interview.
State law requires that arrest photos be destroyed if the case is resolved in the juvenile's favor, or if the child is found to have committed only a misdemeanor, rather than a felony. The photos also must be destroyed if a person reaches age 21 without a criminal record.
When children are charged with crimes, the court system usually takes some steps to prevent their acts from defining them in later years. Children who are 16 and younger, for instance, are generally sent to Family Court, where records are not public.
Yet including their photos in a facial recognition database runs the risk that an imperfect algorithm identifies them as possible suspects in later crimes, civil rights advocates said. A mistaken match could lead investigators to focus on the wrong person from the outset, they said.
"It's very disturbing to know that no matter what I'm doing at that moment, someone might be scanning my picture to try to find someone who committed a crime," said Bailey, a 17-year-old Brooklyn girl who had admitted guilt in Family Court to a group attack that happened when she was 14. She said she was present at the attack but did not participate.
Bailey, who asked that she be identified only by her last name because she did not want her juvenile arrest to be public, has not been arrested again and is now a student at John Jay College of Criminal Justice.
Recent studies indicate that people of color, as well as children and women, have a greater risk of misidentification than their counterparts, said Joy Buolamwini, the founder of the Algorithmic Justice League and graduate researcher at the Massachusetts Institute of Technology Media Lab, who has examined how human biases are built into artificial intelligence.
The racial disparities in the juvenile justice system are stark: In New York, black and Latino juveniles were charged with crimes at far higher rates than whites in 2017, the most recent year for which numbers were available. Black juveniles outnumbered white juveniles more than 15 to 1.
"If the facial recognition algorithm has a negative bias toward a black population, that will get magnified more toward children," Ricanek said, adding that in terms of diminished accuracy, "you're now putting yourself in unknown territory."