Live facial recognition is causing concern among observers and civil rights activists. Photo / 123RF
On the last day of January, few of the shoppers and office workers who hurried through Romford town centre in east London, scarves pulled tight against the chill, realised they were guinea pigs in a police experiment.
The officers sitting inside a parked van nearby were watching them on screens, using a new technology that the police hope will radically reduce crime in London — live facial recognition. Cameras stationed near Romford train station picked up every face walking past and matched it against a police watchlist of wanted criminals. A successful match would result in an immediate arrest.
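In rough terms, the matching step converts each detected face into a numeric "embedding" (a fixed-length vector) and compares it with the embeddings of everyone on the watchlist. The Python sketch below illustrates the idea only; the function names, the 128-dimension vectors and the 0.6 threshold are illustrative assumptions, not details of the Met's actual system.

```python
import numpy as np

# Hypothetical watchlist: one embedding per wanted person, produced
# offline by a face-recognition model (names and vectors are stand-ins).
watchlist = {
    "suspect_a": np.random.rand(128),
    "suspect_b": np.random.rand(128),
}

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding, threshold=0.6):
    """Return the best watchlist match above the threshold, or None."""
    best_name, best_score = None, threshold
    for name, known in watchlist.items():
        score = cosine_similarity(face_embedding, known)
        if score > best_score:
            best_name, best_score = name, score
    if best_name is None:
        return None
    return best_name, best_score

# One freshly detected face from the camera feed (a random stand-in here).
print(match_against_watchlist(np.random.rand(128)))
```

The threshold is where questions of accuracy and bias enter: set it too low and innocent passers-by are flagged; set it too high and wanted suspects walk past unnoticed.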
For all the potential to fight crime, however, the trial quickly stumbled into the thorny issues that surround the technology. A bearded man in a blue baseball cap approached the surveilled area, with his grey jumper pulled up to cover his face. He had just been informed by a bystander that the police were testing facial recognition in the area and did not want to participate.
The police demanded that he comply and scanned his face with a facial recognition tool on a mobile phone. Although his face did not match that of any known criminals, a verbal altercation ensued, which resulted in the man being fined £90 ($166) for telling an officer to "piss off". The entire incident was caught on camera by journalists.
"The fact that he's walked past clearly masking his face from recognition. It gives us grounds to stop him," an officer says, defending his actions.
The incident — one of four stops that day involving people who tried to avoid the cameras in Romford — is one of the reasons that live facial recognition is causing such acute concern among observers and civil rights activists. Given that the technology is such an overt form of surveillance, many believe the explicit consent of citizens is fundamental — something the Romford man never gave.
"When people get stopped and searched in the street, or fined for avoiding cameras, when they don't consent to being observed by cameras, that is a problem," says Peter Fussey, a criminologist at the University of Essex who was present in Romford, as an independent police-appointed monitor.
"The most important thing in research ethics, above all else is . . . to be absolutely sure people consent to being part of that research . . . Yet what happened in these trials is that if people did not engage with it, police would intervene, stop them and search them."
London is now at the forefront of an escalating battle, playing out across many democratic countries, over the use of facial recognition by the authorities. As the technology has become commercially available in recent years through companies such as Apple and Facebook, the biggest uptake has been in countries with authoritarian political systems — most notably China, which uses facial recognition as part of its extensive and highly intrusive surveillance of Muslim Uighurs in Xinjiang province, a practice that has been denounced by human rights groups.
As police departments in democratic countries begin to investigate the technology, London has become one of the main test grounds because of the large network of CCTV cameras already operating in the city. The Romford operation was one of 10 such trials carried out around London by the Met Police over a period of three years, including twice at Notting Hill Carnival.
Over the course of the trials, police planned to gather evidence about the accuracy and bias of the system and to assess whether the use of the dragnet technology could be justified by its potential benefits — preventing or solving major acts of violent crime.
The use of facial recognition by two forces — London's Metropolitan Police and the South Wales force — has sparked a national debate about where people will draw the line to protect their right to privacy. The discussion centres on whether there is any legal basis for using live facial recognition on the general population, and whether blanket use of the technology fundamentally undermines the rights of citizens. It comes on the heels of US cities such as San Francisco and Oakland choosing to temporarily ban facial recognition use by public bodies until regulations are in place.
Many of these issues could come to a head in a legal case in the UK. In May, Ed Bridges, who lives in Cardiff, the Welsh capital, brought one of the first legal challenges to police use of facial recognition, on the grounds that it breaches the Human Rights Act 1998. The outcome of the case could set a precedent around the world, from the US to India and Australia, where facial recognition is being quietly tested.
"We are not aware of anywhere live facial recognition is being used for general public surveillance, except in China," says Silkie Carlo, executive director of Big Brother Watch, a civil rights campaign organisation that has brought a separate legal challenge against the Met Police's use of the technology in London.
"It's really alarming for Britain to go down this path and set this precedent not only for other democracies, but certainly for less liberal states. It's being used to track ethnic minorities in China, the possibilities are chilling."
London is an obvious test bed for visual surveillance technologies. An estimated 420,000 CCTV cameras operate in and around the city, making it the second-most monitored city in the world after Beijing, with its 470,000 cameras, according to a report by the Brookings Institution. (Washington DC, in third place, has just 30,000).
Many were put in place in the early 1990s in response to IRA bombings in the city, followed by waves of installations after the September 11 and London Underground terrorist attacks, and the 2012 Olympics.
For years, the cameras dotted around the city were "dumb" devices, peepholes that did not know what they were looking at. However, advances in artificial intelligence, along with the dropping cost of the cameras themselves, have transformed the business of visual surveillance. Machine learning algorithms trained to recognise specific people, objects or strange behaviours have supercharged these cameras, allowing them to effectively "see".
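What that "seeing" looks like in code is simple in outline. The sketch below (a rough illustration, not any vendor's product) uses OpenCV's bundled face detector, a classical model rather than the deep networks used in modern deployments, but the pipeline shape is the same: read a frame, find the faces, report them. The video file path is illustrative.

```python
import cv2  # pip install opencv-python

# OpenCV ships a classical frontal-face detector; live deployments use
# deep networks, but the detect-then-report pipeline looks the same.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(frame):
    """Return bounding boxes (x, y, w, h) for faces in one CCTV frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

capture = cv2.VideoCapture("cctv_feed.mp4")  # illustrative file path
while True:
    ok, frame = capture.read()
    if not ok:
        break
    for (x, y, w, h) in detect_faces(frame):
        print(f"face at ({x}, {y}), size {w}x{h}")
capture.release()
```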
Over the next five years, the number of so-called smart cameras in public and private spaces, from schools to public toilets to hospitals, is expected to increase exponentially across London, according to a report by Tony Porter, the UK's surveillance camera commissioner. The result would be a smart city that itself becomes the eyes and ears of an overburdened law enforcement system.
"AI . . . could analyse thousands of video-feeds to track and alert authorities of anomalies," writes Esther Colwill, Accenture's global lead on media and technology, along with a team of colleagues in a report on AI surveillance. "If enabled, cities could crowdsource commercial and residential security system data . . . to get a real-time picture of potential criminal activity."
Elements of this future are already in use by public bodies in London. Transport for London has used AI to analyse footage from cameras in areas such as Liverpool Street and Mile End to spot unusual behaviours, such as lingering pedestrians or suspicious baggage. Local councils, such as Newham, have trialled smart CCTV that sends automatic alerts to officials about events such as crowd build-up or suspicious objects.
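How does a camera "spot" a lingering pedestrian? The deployed systems are proprietary, but one common approach is dwell-time analysis: a tracker assigns each person an ID and a position in every frame, and an alert fires if someone stays within a small area for too long. A simplified sketch, assuming tracker output is available and with illustrative thresholds:

```python
from collections import defaultdict

DWELL_RADIUS = 2.0    # metres a person may drift and still count as lingering
DWELL_SECONDS = 120   # alert once someone has stayed put this long

# history[track_id] = list of (timestamp, x, y) from an upstream tracker
history = defaultdict(list)

def observe(track_id, timestamp, x, y):
    """Record one tracker observation; return True if the person lingers."""
    history[track_id].append((timestamp, x, y))
    points = history[track_id]
    if timestamp - points[0][0] < DWELL_SECONDS:
        return False  # this person has not been watched long enough yet
    # Consider only the most recent DWELL_SECONDS of movement.
    window = [(px, py) for t, px, py in points if timestamp - t <= DWELL_SECONDS]
    xs = [px for px, _ in window]
    ys = [py for _, py in window]
    # Lingering if the whole recent track fits inside a small bounding box.
    return (max(xs) - min(xs) <= DWELL_RADIUS
            and max(ys) - min(ys) <= DWELL_RADIUS)
```

A similar rule applied to object tracks (a bag that stops moving while the person who carried it walks away) is one way such a system could raise a "suspicious baggage" alert.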
In NHS hospitals, millions of patients are exposed to "ever increasing surveillance technology from drones and body-worn video to automated facial recognition", Mr Porter said in the January report.
Smart CCTV is also being pioneered by the private sector. Convenience stores such as Budgens and supermarkets including Tesco, Sainsbury's and Marks and Spencer all have cameras that are already, or will soon be, capable of facial recognition, used for applications ranging from crime prevention to estimating the age of customers buying alcohol or cigarettes.
Yoti, a British technology start-up, is rolling out its facial analysis software to more than 25,000 convenience stores over the next four months to estimate the age of customers; another London start-up, Facewatch, says its software, which can recognise known criminals, has been trialled by a number of high-street retailers in the past two years and will soon be installed in 550 stores across London.
Facewatch has been in talks to sign data-sharing deals with the Metropolitan Police and the City of London police.
Shaun Moore, chief executive of US-based facial recognition company Trueface.ai that provides its technology to UK casinos, says London is an advanced market. "The cameras are already there and have been there for decades. They were put in for safety and security, so there was never a big uproar about it," he says. Like Facewatch and Yoti, Trueface does not manufacture cameras, but provides its software to businesses looking to upgrade existing CCTV. The company is "putting our solution on existing infrastructure," explains Mr Moore.
Increasingly, the lines are blurring between the use of facial recognition technology by private firms and by the public sector. Many surveillance camera systems in public places are operated by private companies, which give law enforcement free access to their footage.
The police's own facial recognition systems are built by commercial organisations, which can raise other issues. Japanese technology company NEC provides cameras to the Metropolitan and South Wales Police.
Hannah Couchman at Liberty, a non-profit organisation that is supporting the case against the South Wales Police, says that any examination of the technology requires access to training data and algorithms, but the company sees that information as a trade secret. "That overlap between government and private companies leads to a lack of transparency that is inevitable," she says.
NEC declined to comment.
Some experts believe the close collaboration between public and private sectors is a growing problem because there is currently no ethical or regulatory framework for private use of surveillance technologies.
"All these companies have customers but they can't share who their clients are," said Stephanie Hare, a campaigner and researcher. "The private sector is where we have the greatest ignorance. We have no data about how companies are using it, nobody has oversight, it's a total free for all."
One reason for the legal vacuum is that UK laws governing the use of biometrics, including our facial data, have not been updated since 2012, and focus primarily on DNA and fingerprints.
"I think it's quite urgent that we enact new legislation to cover facial images, because the technology is developing apace and the police are already exploring it," says Paul Wiles, the UK's independently appointed biometrics commissioner. "Biometrics is emerging in both the public and private sector, and the important question is who will share what with whom, and who decides."
Mr Fussey, from the University of Essex, agrees, saying the current laws around facial recognition were "completely inadequate". "Parliament is so choked up with Brexit there is no appetite for law changes, but there isn't a legal basis. You're asking police to trial it to keep the public safe, but with no national leadership or guidance. It is unfair on the public and on the police."
MPs on the science and technology committee last week urged the Home Office to impose a moratorium on all facial recognition trials until regulations can be established. The call has been backed by independent experts including AI researchers at the Ada Lovelace Institute; the biometrics commissioner; and the UK's data protection authority, which said the use of facial recognition in public spaces represented "a real step change in the way law-abiding people are monitored as they go about their daily lives".
However, the police believe the potential benefits are significant. "Live facial recognition technology has the potential to help our officers locate criminals who are wanted for serious and violent offences, such as knife and gun crime, and the sexual exploitation of children," said deputy assistant commissioner Duncan Ball of the Met Police. "The public would expect the Met Police to use all available and proportionate means to catch violent offenders and it is right that we trial emerging technology that could help us to do so."

Before the end of the year, the Welsh courts will deliver a judgment in the case brought by Ed Bridges against the South Wales Police for scanning his face twice, including during a protest against an arms fair in Cardiff.
The judgment will set a historic legal precedent for the UK's use of facial recognition. In his crowdfunding appeal for the case, Mr Bridges laid out his view of the high stakes. "The inevitable result is that people will change their behaviour and feel scared to protest or express themselves freely — in short, we'll be less free."