A gray-haired man walks through an office lobby, a cup of coffee in hand, staring straight ahead as he passes the entrance.
He seems unaware that he is being tracked by a network of cameras that can detect not only where he has been but also who has been with him.
Surveillance technology has long been able to identify you. Now, with the help of artificial intelligence, it is trying to work out who your friends are.
With a few clicks, this “co-appearance” or “correlation analysis” software can find anyone who has appeared in surveillance frames within a few minutes of the gray-haired man over the past month, weed out those who were near him only once or twice, and zero in on a man who has appeared with him 14 times. The software can instantly mark the potential interactions between the two men, now flagged as likely associates, on a searchable calendar.
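The matching step described above can be approximated in a few lines: given a log of (person, camera, timestamp) detections, count how often each person turns up near the target and drop anyone seen together only once or twice. This is a minimal sketch under assumed inputs; Vintra has not published its algorithm, and the function name, data layout, and thresholds here are hypothetical.

```python
from collections import Counter
from datetime import timedelta

def co_appearances(detections, target, window=timedelta(minutes=10), min_count=3):
    """Return {person_id: count} for people who repeatedly appear on the
    same camera within `window` of a sighting of `target`.

    `detections` is a list of (person_id, camera_id, timestamp) tuples,
    the kind of output a face- or appearance-matching pipeline might emit.
    """
    target_sightings = [(cam, ts) for pid, cam, ts in detections if pid == target]
    counts = Counter()
    for pid, cam, ts in detections:
        if pid == target:
            continue
        # A sighting counts if it overlaps any sighting of the target.
        if any(cam == t_cam and abs(ts - t_ts) <= window
               for t_cam, t_ts in target_sightings):
            counts[pid] += 1
    # Weed out people seen near the target only once or twice.
    return {pid: n for pid, n in counts.items() if n >= min_count}
```

Ranking the surviving counts and plotting each overlap on a calendar, as the demo shows, would be a straightforward layer on top of this dictionary.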
Vintra, the San José-based company that showcased the technology in an industry video presentation last year, sells the co-appearance feature as part of a suite of video analytics tools. The company boasts on its website about relationships with the San Francisco 49ers and a Florida police department. The Internal Revenue Service and additional police departments around the country have paid for Vintra’s services, according to a government contracting database.
Although the co-appearance technology is already used by authoritarian regimes like China’s, Vintra appears to be the first company to commercialize it in the West, say industry specialists.
But the company is one of many testing new artificial intelligence and surveillance applications with little public scrutiny and few formal guarantees against invasions of privacy. In January, for example, New York state officials criticized the firm that owns Madison Square Garden for using facial recognition technology to bar employees of law firms that have sued the company from attending events at the arena.
Industry experts and privacy watchdogs say that if the co-appearance tool is not in use now (one analyst expressed certainty that it is), it will likely become more reliable and more widely available as artificial intelligence capabilities advance.
None of the entities doing business with Vintra that were contacted by The Times acknowledged using the co-appearance feature in Vintra’s software package. But some did not explicitly rule it out.
China’s government, which has been the most aggressive in using surveillance and AI to control its population, uses co-appearance searches to spot protesters and dissidents by fusing video with a vast network of databases, something Vintra and its clients would not be able to do, said Conor Healy, director of government research at IPVM, the surveillance research group that hosted Vintra’s presentation last year. Vintra’s technology could be used to create “a more basic version” of the Chinese government’s capabilities, he said.
Some U.S. state and local governments restrict the use of facial recognition, especially in law enforcement, but no federal law applies. No law bars police from running co-appearance searches like Vintra’s, but “it’s an open question” whether doing so would violate constitutionally protected rights of free assembly and protections against unauthorized searches, according to Clare Garvie, a surveillance technology specialist with the National Association of Criminal Defense Lawyers. Few states have any restrictions on how private entities use facial recognition.
The Los Angeles Police Department ended a predictive surveillance program, known as PredPol, in 2020 amid criticism that it was not stopping crime and led to more stringent policing of Black and Latino neighborhoods. The program used AI to analyze vast amounts of data, including suspected gang affiliations, in an effort to predict in real time where property crimes might occur.
In the absence of national laws, many police departments and private companies are left to weigh security against privacy on their own.
“This is the Orwellian future come true,” said Sen. Edward J. Markey, D-Massachusetts. “A deeply alarming state of surveillance where you are tracked, marked and categorized for use by private and public sector entities, of which you are unaware.”
Markey plans to reintroduce a bill in the coming weeks that would stop the use of biometric and facial recognition technologies by federal law enforcement and require state and local governments to ban them as a condition of getting federal grants.
For now, some departments say reliability concerns mean they don’t yet have to make that choice. But as the technology improves, they will.
Vintra executives did not return multiple calls and emails from The Times.
But the company’s CEO, Brent Boekestein, was expansive about the potential uses of the technology during the video presentation with IPVM.
“You can go up here and create a target based on this guy, and then see who this guy is hanging out with,” Boekestein said. “You can really start to build a network.”
He added that “96% of the time, there is no event that security is interested in, but there is always information that the system is generating.”
Four agencies that share the San José transit station used in Vintra’s presentation denied that their cameras were used to make the company’s video.
Two companies listed on Vintra’s website, the 49ers and Moderna, the pharmaceutical company that produced one of the most widely used COVID-19 vaccines, did not respond to emails.
Several police departments acknowledged working with Vintra, but none said explicitly that they had run a co-appearance search.
Brian Jackson, deputy police chief in Lincoln, Neb., said his department uses Vintra software to save time analyzing hours of video, quickly searching for patterns such as blue cars and other objects that match descriptions tied to specific crimes. But the cameras his department can tap, including Ring doorbell cameras and those used by businesses, aren’t good enough at matching faces, he said.
“There are limitations. It’s not magic technology,” he said. “It requires precise inputs to get good results.”
Jarod Kasner, deputy chief in Kent, Wash., said his department uses Vintra software. He said he wasn’t aware of the co-appearance feature and that he would have to consider whether it was legal in his state, one of the few that restricts the use of facial recognition.
“We’re always looking for technology that can help us because it’s a force multiplier” for a department struggling with personnel problems, he said. But “we just want to make sure we’re within the limits to make sure we’re doing it right and doing it professionally.”
The Lee County Sheriff’s Office in Florida said it uses the Vintra software only on suspects and not “to track people or vehicles that are not suspected of any criminal activity.”
The Sacramento Police Department said in an email that it uses the Vintra software “sparingly, if at all,” but did not specify whether it had ever used the co-appearance feature.
“We are in the process of reviewing our contract with Vintra and whether to continue to use their service,” the department said in a statement, which also said it could not point to instances where the software helped solve crimes.
The IRS said in a statement that it uses Vintra software “to more efficiently review long video footage for evidence while conducting criminal investigations.” Officials did not say whether the IRS used the co-appearance tool or where it placed the cameras, only that it followed “established agency protocols and procedures.”
Jay Stanley, a lawyer with the American Civil Liberties Union who first highlighted Vintra’s video presentation in a blog post last year, said he’s not surprised that some companies and departments are cautious about discussing how they use it. In his experience, police departments often deploy new technology “without telling, let alone asking, permission from democratic overseers like city councils.”
The software could be abused to monitor personal and political associations, including with potential intimate partners, labor activists, anti-police groups or partisan rivals, Stanley warned.
Danielle VanZandt, who analyzes Vintra for market research firm Frost & Sullivan, said the technology is already in use. Because she reviewed confidential documents from Vintra and other companies, she is bound by confidentiality agreements that prohibit her from speaking about individual companies and governments that may be using the software.
Retailers, which already collect vast amounts of data about the people who walk into their stores, are also testing the software to determine “what else can it tell me?” VanZandt said.
That could include identifying the family members of a bank’s best customers to ensure they are treated well, a use that raises the possibility that those without wealth or family connections will receive less attention.
“Those bias concerns are huge in the industry” and are being actively addressed through standards and testing, VanZandt said.
Not everyone believes that this technology will be widely adopted. Law enforcement and corporate security officers often find they can use less intrusive technologies to obtain similar information, said Florian Matusek of Genetec, a video analytics company that works with Vintra. That includes scanning ticket entry systems and cell phone data that have unique characteristics but are not tied to individuals.
“There’s a big difference between product sheets and demo videos and things that get implemented in the field,” Matusek said. “Users often find that another technology can solve their problem just as well without going through all the hoops of installing cameras or dealing with privacy regulation.”
Matusek said he was not aware of any Genetec customers using co-appearance searches, a feature his company does not offer. But he couldn’t rule it out.