By Craig Timberg
A whiff of dystopian creepiness has long wafted in the air whenever facial recognition has come up.
Books, movies and television shows have portrayed the technology as mainly a tool of surveillance and social control - aimed by unseen others at you, for their purposes, not your own.
Apple sought to reverse that equation this week with the long-anticipated release of its 10th-anniversary smartphone, the iPhone X.
It replaces the fingerprint sensor that previous generations used to unlock a user's device with facial recognition technology, while still preventing others from unlocking the phone without the user's knowledge.
All users have to do, Apple said at the annual September event dedicated to touting its latest product updates, is look at the iPhone X, which recognises them as the registered user - even if they are wearing glasses or a hat or sporting a new beard.
Though not entirely new - several Android smartphones do something similar already - the technology remains novel.
Apple's embrace of it could mark a tipping point in the adoption of facial recognition technology across new areas of our lives - as we shop or communicate with friends, and, eventually, as we enter buildings or perhaps turn on our vehicles with a glance rather than a twist of the key.
Many forms of surveillance - cellphone location tracking, social media analytics and the CIA's reported ability to remotely activate the microphone on an individual's smart TV - were born of such popular consumer advances.
Only later, typically through leaked documents and investigative reports, did it become clear how popular technologies were turned on their users.
"The big danger with facial recognition is that we are targeted everywhere we go and in everything we do," said Jay Stanley, a senior policy analyst with the ACLU's Speech, Privacy and Technology Project. "The acceptable uses could soften up the terrain for less acceptable uses."
The potential for widely deployed facial recognition systems has particularly concerned privacy experts, who have warned about a future in which our faces and other biometrics are used to track our every movement, our political activity, our religious lives and even our romantic encounters.
Recent research at Stanford, meanwhile, contends that a range of private facts, including an individual's sexual orientation, could be read through sophisticated analyses of facial images with the help of artificial intelligence.
"We have only one face," said Clare Garvie, an associate at Georgetown University's Centre on Privacy & Technology and an author of the Perpetual Line-Up, a 2016 report on facial recognition databases collected by governments.
"The more comfortable we become with facial recognition, the more complacent we may become."
What Apple introduced this week was a version of facial recognition technology that iPhone X owners are supposed to use on themselves, for their own purposes and only when they want to. They can always type a numeric passcode instead.
Such caveats have earned the company cautious praise from some privacy experts. They noted that the iPhone X will keep its facial analysis data secure on the device rather than transmitting it across the internet (where it could potentially be intercepted) or collecting it in a database that might allow hackers, spies or law enforcement agencies to gain access to facial records en masse.
The Android devices that use facial recognition also keep the data on the device, although hackers have demonstrated that some of these systems can be tricked by photographs of users - something Apple says cannot happen with the iPhone X.
"I don't think we should reflexively reject facial recognition. The question should be, by what means and for whose benefit?" said Marc Rotenberg, executive director of the Electronic Privacy Information Centre.
Half of US adults already have their images in some federal, state or local facial recognition system through a combination of databases of people who have been arrested or convicted of crimes, along with ledgers of people who hold driver's licences, passports and visas, the 2016 Georgetown report found.
Privacy experts have fought to curb the expansion of such databases and to limit how and when the databases are used.
They have also sought to raise awareness of the huge commercial databases kept by Facebook and Google, both of which in some circumstances use facial recognition technology to identify people depicted in photos users upload.
Also slowing the spread of the technology have been the daunting technical challenges of accurately analysing faces in anything less than optimal circumstances.
Apple's system appears to solve this. Owners of the iPhone X are supposed to willingly "enroll" their faces from arm's length, turning their heads so facial contours are captured more fully. Opening the device later takes only a brief glance.
The facial recognition system, dubbed the TrueDepth camera system, includes a front-facing camera, a proximity sensor, an infrared camera and a dot projector that beams more than 30,000 invisible infrared dots onto a user's face to take measurements.
The device then combines all the available data to create what Philip Schiller, Apple's senior vice president of worldwide marketing, called "a mathematical model of your face".
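Apple has not published how that mathematical model works, but biometric matching systems generally follow a common pattern: reduce a scan to a compact feature vector, then accept only scans whose vector lies close to the enrolled one. A minimal sketch of that pattern follows; the `embed` function, the crude statistics it computes, and the distance threshold are all illustrative stand-ins, not Apple's actual method.

```python
import math

def embed(depth_points):
    """Reduce a cloud of (x, y, z) depth samples to a fixed-length
    feature vector. Real systems use a trained neural network; this
    toy version just takes coarse statistics of the point cloud."""
    n = len(depth_points)
    xs = [p[0] for p in depth_points]
    ys = [p[1] for p in depth_points]
    zs = [p[2] for p in depth_points]
    return [sum(xs) / n, sum(ys) / n, sum(zs) / n,
            max(zs) - min(zs)]  # crude "contour depth" feature

def matches(enrolled, candidate, threshold=0.1):
    """Accept the candidate scan if its embedding lies within
    `threshold` (Euclidean distance) of the enrolled embedding."""
    return math.dist(enrolled, candidate) <= threshold

# Enrolment: the owner's face, sampled as dots on a surface
# (a synthetic stand-in for the dot projector's measurements).
owner_scan = [(x * 0.01, y * 0.01, math.sin(x * y * 0.001))
              for x in range(10) for y in range(10)]
template = embed(owner_scan)

# A later scan of the same face should fall within the threshold.
print(matches(template, embed(owner_scan)))  # True
```

The threshold sets the security trade-off: tighter means fewer strangers falsely accepted, but more false rejections of the real owner in bad lighting or at odd angles.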
"The chance that a random person in the population could look at your iPhone X and unlock it with their face is about one in a million," Schiller said, presenting the new device at Apple's glitzy new Steve Jobs Theatre in Cupertino, California.
There is also the question of what power law enforcement agencies have to gain access to data on devices. The Supreme Court ruled in 2014 that authorities need a search warrant to examine the contents of a seized smartphone.
It would take a separate court order to require a device's owner to unlock it for police, said Nate Cardozo, a senior staff attorney at the Electronic Frontier Foundation, a civil liberties group in San Francisco.
Cardozo expressed less concern than some other privacy advocates. "People seem to understand that on a gut level when they use biometrics for their own purposes. That's very different than being part of a database that can be used against them."