A gray-haired man walks through the office lobby, coffee cup in hand, staring straight ahead as he passes through the entrance.
He seems to be unaware that he is being watched by a network of cameras that can detect not only where he has been, but also who has traveled with him.
Surveillance technology has long been able to identify you. Now, with the help of artificial intelligence, it is trying to figure out who your friends are.
With a few clicks, this “co-appearance” or “correlation analysis” software can find anyone who has appeared on surveillance frames within a few minutes of the gray-haired man over the past month, filter out those who were near him only once or twice, and zero in on a man who has appeared alongside him 14 times. The software can then instantly mark the potential interactions between the two men, now deemed likely associates, on a searchable calendar.
Vintra, the San Jose-based company that demonstrated the technology in an industry video presentation last year, sells the co-appearance feature as part of a suite of video analytics tools. On its website, the company touts its relationships with the San Francisco 49ers and a Florida police department. The Internal Revenue Service and additional police departments across the country have also paid for Vintra’s services, according to a government contract database.
While co-appearance technology has been used by authoritarian regimes such as China’s, Vintra appears to be the first company marketing it in the West, industry experts say.
But the company is one of many experimenting with new AI-enabled surveillance applications with little public scrutiny and few formal safeguards against invasions of privacy. In January, for example, New York state officials criticized the company that owns Madison Square Garden for using facial recognition technology to bar employees of law firms that have sued the firm from attending events at the arena.
Industry experts and watchdogs say that even if the co-appearance tool is not being used now (and one analyst expressed certainty that it is), it will likely become more reliable and more widely available as artificial intelligence capabilities advance.
None of the entities doing business with Vintra that were contacted by The Times acknowledged using the co-appearance feature in Vintra’s software package. But some did not explicitly rule it out.
The Chinese government, the most aggressive user of surveillance and AI to control its population, uses co-appearance searches to spot protesters and dissidents by merging video with a vast network of databases, something Vintra and its clients would not be able to do, said Conor Healy, director of government research at IPVM, the surveillance research group that hosted Vintra’s presentation last year. Vintra’s technology could be used to create “a more basic version” of the Chinese government’s capabilities, he said.
Several state and local governments in the United States restrict the use of facial recognition, especially in policing, but no federal law applies. No law prohibits police from running co-appearance searches like Vintra’s, according to Clare Garvie, a specialist in surveillance technology with the National Association of Criminal Defense Lawyers, “but it’s an open question” whether doing so would violate constitutionally protected rights of freedom of assembly.
Few states have any restrictions on how private entities use facial recognition.
The Los Angeles Police Department ended its predictive policing program, known as PredPol, in 2020 amid criticism that it failed to prevent crime and led to heavier policing of Black and Latino neighborhoods. The program used artificial intelligence to analyze huge troves of data, including suspected gang ties, in an effort to predict in real time where property crimes might occur.
In the absence of federal law, many police departments and private companies are left to weigh the balance between security and privacy on their own.
Senator Edward J. Markey, a Massachusetts Democrat, said: “This is the Orwellian future come to life. A deeply alarming surveillance state where you are tracked, flagged, and categorized for use by public and private sector entities, without your knowledge.”
Markey plans to reintroduce a bill in the coming weeks that would block federal law enforcement from using facial recognition and biometric technologies and require local and state governments to ban them as a condition of receiving federal aid.
For now, some departments say they don’t have to make that choice because of reliability concerns. But as the technology advances, they will.
Vintra executives did not return multiple calls and emails from The Times.
But the company’s CEO, Brent Boekestein, touted the technology’s potential uses during the video presentation with IPVM.
“You can go up here and create a target, based on this guy, and then see who this guy hangs out with,” says Boekestein. “You can really start building a network.”
He added that “96% of the time, there are no events that security is interested in, but the system is always generating information.”
The four agencies that share the San Jose transit station featured in Vintra’s presentation denied that their cameras were used to make the company’s video.
Two companies listed on Vintra’s website, the 49ers and Moderna, which makes one of the most widely used COVID-19 vaccines, did not respond to emails.
Several police departments acknowledged working with Vintra, but none explicitly said they had run a co-appearance search.
Brian Jackson, assistant sheriff in Lincoln, Neb., said his department uses Vintra software to save time analyzing hours of video, quickly finding patterns such as blue cars and other objects matching descriptions tied to specific crimes. But the cameras his department has access to, including Ring cameras and those used by businesses, aren’t good enough for face matching, he said.
“There are limitations. It’s not a magic technology,” he said. “It requires correct input for good output.”
Jarod Kasner, an assistant chief in Kent, Washington, said his department uses Vintra software. He said he wasn’t aware of the co-appearance feature and would have to look into whether it was legal in his state, one of the few that restricts the use of facial recognition.
“We’re always looking for technology that can help us because it’s a force multiplier” for a department struggling with staffing issues, he said. But “we just want to make sure we stay within the boundaries and do it right and professionally.”
The Lee County Sheriff’s Office in Florida said it uses Vintra software only on suspects, not “to track people or vehicles that are not suspected of criminal activity.”
The Sacramento Police Department said in an email that it uses Vintra software “sparingly, if at all,” but did not specify whether it has ever used the co-appearance feature.
“We are in the process of reviewing our Vintra contract and whether to continue using their services,” the department said in a statement, adding that it could not point to a case the software had helped solve.
The IRS said in a statement that it uses Vintra software “to more efficiently review lengthy footage for evidence while conducting criminal investigations.” Officials would not say whether the agency has used the co-appearance tool or where it places cameras, saying only that it follows “established agency procedures and processes.”
Jay Stanley, an attorney for the American Civil Liberties Union who first highlighted Vintra’s video presentation in a blog post last year, said he was not surprised that some companies and departments were reluctant to discuss whether they use it. In his experience, police departments often deploy new technology “without telling, let alone asking, permission from democratic overseers like city councils.”
Stanley warned that the software could be misused to spy on personal and political associations, including potential romantic partners, labor activists, anti-police groups or partisan rivals.
Danielle VanZandt, who covers Vintra for the market research firm Frost & Sullivan, said the technology is already in use. Because she has reviewed confidential documents from Vintra and other companies, she is bound by non-disclosure agreements that prevent her from naming the companies and governments that may be using the software.
Retailers, which already collect huge amounts of data on people who walk into their stores, are also testing the software to determine “what else can it tell me?” VanZandt said.
That could include identifying the family members of a bank’s best customers to ensure they are treated well, a use that raises the likelihood that people who are neither wealthy nor well connected will receive worse service.
“Concerns about that bias are huge in the industry,” says VanZandt, and are being actively addressed through standards and testing.
Not everyone believes the technology will be widely adopted. Florian Matusek of Genetec, a video analytics firm that works with Vintra, said law enforcement and corporate security often find they can use less invasive technologies to gather similar information. Those include ticket scans at entry gates and mobile phone data that carry unique identifiers but are not tied to individuals.
“There is a big difference between, like, product sheets and demo videos, and what is actually being deployed in the field,” Matusek said. “Users often find that other technology can solve their problems just as well without having to jump through all the hoops of installing cameras or dealing with privacy regulations.”
Matusek said he is not aware of any Genetec customers using co-appearance searches, which his company does not offer. But he couldn’t rule it out.
© 2023 Los Angeles Times.
Distributed by Tribune Content Agency, LLC.
Citation: The next step in surveillance AI: Find out who your friends are (2023, March 3), retrieved March 3, 2023 from https://techxplore.com/news/2023-03-surveillance-ai-friends.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.