A surveillance camera operated by the New Orleans Police Department fixed to a lamppost on Bourbon Street. Photo / William Widmer, The New York Times
Advances in artificial intelligence could supercharge surveillance cameras, allowing footage to be constantly monitored and instantly analysed, the ACLU warned in a new report.
Businesses and the government have spent years installing millions of surveillance cameras across the United States. Now, that technology is on the verge of getting a major upgrade, the American Civil Liberties Union warns in a new report.
Advancements in artificial intelligence could supercharge surveillance, allowing camera owners to identify "unusual" behaviour, recognise actions like hugging or kissing, easily seek out embarrassing footage and estimate a person's age or, possibly, even their disposition, the group argues.
"We face the prospect of an army of AI security guards being placed behind those lenses that are actually, in a meaningful way, monitoring us, making decisions about us, scrutinising us," said Jay Stanley, senior policy analyst at the ACLU and the author of the report, which was released Thursday.
The United States is, by various estimates, home to tens of millions of surveillance cameras. While many of those devices have been around for years, it has been widely understood that it would be infeasible, if not impossible, for each device to be constantly monitored and its footage carefully categorised and documented, Stanley notes in the report, titled The Dawn of Robot Surveillance. Even the Justice Department has said that watching such footage is "boring and mesmerising" and that attention fades after about 20 minutes.
But improvements to technology created to actively monitor such feeds, known by several names including "video analytics," are poised to change that, ensuring that every second of footage can be analysed.
"It honestly has both benefits and security consequences," said Carl Vondrick, a professor of computer science at Columbia University, where he leads a group focused on computer vision and machine learning.
The ability to constantly analyse and learn from a video feed could help self-driving cars understand their surroundings, retail stores track their products and health professionals monitor their patients, he said. It can also be used to scrutinise the routines and actions of individuals on an enormous scale, the ACLU warns.
In the report, the organisation imagined a handful of dystopian uses for the technology. In one, a politician requests footage of his enemies kissing in public, along with the identities of all involved. In another, a life insurance company offers rates based on how fast people run while exercising. And in another, a sheriff receives a daily list of people who appeared to be intoxicated in public, based on changes to their gait, speech or other patterns.
Analysts have valued the market for video analytics at as much as US$3 billion, with the expectation that it will grow exponentially in the years to come. Key players include smaller businesses as well as household names such as Amazon, Cisco, Honeywell, IBM and Microsoft.
At a recent retail industry conference, IBM showed how its video analytics software could be used to count customers and estimate their ages and loyalty status, all in real time. The software could monitor the length of a line, identify a manager as he walked through a crowd, and flag people loitering outside the store.
Amazon's Rekognition service, launched in 2016, can purportedly identify and track people, recognise celebrities, detect objects and read text. (The company drew criticism for pitching that service to law enforcement.) After employees protested, Google last year said it would not renew a contract with the Pentagon's Project Maven, for which artificial intelligence is used to interpret video and images, potentially to improve the targeting of drone strikes.
What video analytics can do
Video analytics providers boast a range of capabilities, according to the ACLU report, including detection of objects dropped or left behind; analysis of a person's direction, gait or movement; and even identification of attempts to enter a secure area by rushing in as another person enters or exits the space. Some companies say their services can discern demographic information or identify clothes and other objects, too.
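To make one of those claims concrete: a common building block behind "object left behind" detection is background subtraction, which flags pixels that differ from a learned model of the static scene. The sketch below, written in Python with OpenCV, is purely illustrative; the video path is a placeholder, and commercial products layer tracking and persistence logic on top of this step.

```python
import cv2

# Purely illustrative: background subtraction highlights pixels that differ
# from a learned model of the static scene. "camera.mp4" is a placeholder.
cap = cv2.VideoCapture("camera.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)  # foreground (changed) pixels in white
    # Regions that stay in the foreground mask for many consecutive frames
    # are candidates for newly introduced, stationary objects.
    cv2.imshow("foreground", mask)
    if cv2.waitKey(1) == 27:  # press Esc to stop
        break

cap.release()
cv2.destroyAllWindows()
```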
Software is also being trained to identify a wide range of activities, such as using a phone, shaking hands, punching something, drinking beer and walking toward or away from an object. (Amazon claims that Rekognition can already identify some such actions, including "blowing out a candle" and "extinguishing fire.")
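For a sense of how such label detection is exposed to developers, here is a minimal sketch calling Rekognition's DetectLabels operation through Amazon's boto3 SDK. The API call is real, but the bucket and file names are placeholders invented for this example.

```python
import boto3

# The DetectLabels API is real; the bucket and file names below are
# placeholders invented for this example.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-footage", "Name": "frame-0001.jpg"}},
    MaxLabels=10,        # return at most ten labels
    MinConfidence=75.0,  # drop low-confidence detections
)

for label in response["Labels"]:
    print(f'{label["Name"]}: {label["Confidence"]:.1f}%')
```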
One area of research that the ACLU described with particular concern is the movement to train software on "anomaly detection," which can single out an individual for unusual, atypical or deviant behaviour. Another is emotion recognition, which promises to discern a person's mood, though there is little evidence that emotions are universal or can be determined by facial movements alone.
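Stripped to its essentials, anomaly detection means flagging behaviour that deviates statistically from a learned baseline. The toy sketch below illustrates the idea with invented walking-speed numbers; commercial systems rely on far richer features and models.

```python
import numpy as np

# Toy example with invented walking speeds (metres per second) "learned"
# from tracked footage of a particular scene.
baseline_speeds = np.array([1.3, 1.4, 1.2, 1.5, 1.35, 1.45, 1.25, 1.4])
mean, std = baseline_speeds.mean(), baseline_speeds.std()

def is_anomalous(speed, threshold=3.0):
    """Flag a speed more than `threshold` standard deviations from the mean."""
    return abs(speed - mean) / std > threshold

print(is_anomalous(1.38))  # typical gait -> False
print(is_anomalous(4.0))   # sprinting through the frame -> True
```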
As the technology improves, users will be able to search videos by keyword, surfacing results for precise queries like "red car" or "man wearing hoodie," a capability that already exists for images stored on Google Photos and Apple Photos.
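Under the hood, such keyword search typically amounts to an inverted index mapping labels to the frames that contain them, with the labels supplied by a separate recognition model. A minimal sketch, with made-up frame names and labels:

```python
from collections import defaultdict

# Made-up frame names and labels, assuming a recognition model has
# already tagged each frame.
frame_labels = {
    "cam1_frame_0412": {"red car", "street"},
    "cam1_frame_0413": {"man wearing hoodie", "street"},
    "cam2_frame_0099": {"red car", "parking lot"},
}

# Inverted index: label -> set of frames containing it.
index = defaultdict(set)
for frame, labels in frame_labels.items():
    for label in labels:
        index[label].add(frame)

print(sorted(index["red car"]))  # ['cam1_frame_0412', 'cam2_frame_0099']
```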
The associated threat
The spread of such technology has a number of dangerous implications, the ACLU warns.
First, algorithms can be trained on biased data sets, leading to discriminatory results. The well-documented racial shortcomings of facial recognition technology, for example, have been linked to training data that skews heavily white and male.
Video analytics software is often trained on publicly available footage, such as YouTube videos, but there may be bias in the kinds of people who post them or in what such videos show.
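A toy experiment makes the mechanism plain: if a model is trained almost entirely on one group, and a second group follows a different pattern, accuracy collapses for the underrepresented group. The data below is entirely synthetic and exaggerated for clarity.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Entirely synthetic: group B's feature-label relationship is the inverse of
# group A's, and the training set is 95% group A, so the model learns A's
# pattern and systematically misclassifies group B.
rng = np.random.default_rng(0)

def make_group(n, flip):
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] > 0).astype(int)
    return X, (1 - y if flip else y)

Xa, ya = make_group(950, flip=False)  # overrepresented group A
Xb, yb = make_group(50, flip=True)    # underrepresented group B

model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

Xa_test, ya_test = make_group(500, flip=False)
Xb_test, yb_test = make_group(500, flip=True)
print("accuracy on group A:", model.score(Xa_test, ya_test))  # high
print("accuracy on group B:", model.score(Xb_test, yb_test))  # very low
```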
"There are reasons to fear this technology when it works, and there are reasons to fear this technology when it doesn't work," Stanley said.
The use of video analytics may also have a chilling effect on society, the ACLU warned. If individuals feel that their every movement is monitored, they may alter their behaviour. It may also lead to over-enforcement of minor crimes, a practice that has disproportionately affected minorities and other disadvantaged groups.
"We could find ourselves subject to constant petty harassment and the ignoring of common sense extenuating circumstances," the report warns.
And then there is the potential for abuse. Those who control such systems would wield great power. Without proper regulations, they could use that power for nefarious ends, the group warned.
What can be done
To prevent the worst outcomes, the ACLU offered a range of recommendations governing the use of video analytics in the public and private sectors.
No governmental entity should be allowed to deploy video analytics without legislative approval, public notification and a review of a system's effects on civil rights, it said. Individuals should know what kind of information is recorded and analysed, have access to data collected about them, and have a way to challenge or correct inaccuracies, too.
To prevent abuses, video analytics should not be used to collect identifiable information en masse or merely for seeking out "suspicious" behaviour, the ACLU said. Data collected should also be handled with care and systems should make decisions transparently and in ways that don't carry legal implications for those tracked, the group said.
Businesses should be governed by similar guidelines and should be transparent in how they use video analytics, the group said. Regulations governing them should balance constitutional protections, including the rights to privacy and free expression.
While the ACLU is ringing alarm bells about the use of video analytics now, it's anyone's guess how quickly the technology will advance.
"The joke in AI is that you ask a bunch of AI researchers, 'When are we going to achieve AI?' and the answer always has been, 'In 30 years,'" Vondrick said.