Artificial intelligence, or AI, is being used to secure sites from sports arenas to churches and schools. The technology scans for weapons, including rifles, knives and explosives, as people walk between standing panels. If a weapon is detected, a guard is alerted.
Evolv, a Massachusetts-based company, has used the technology to scan about 300 million people across the country since the system went live in 2019, second only to the TSA.
“Remember to walk straight into a site, school, or building without stopping,” said Peter George, Evolv’s CEO, who touts the technology as much less intrusive than traditional metal detectors. “And if you don’t have a weapon with you, you can walk right in. And if you do, we can identify it.”
Evolv’s technology is used in large sports stadiums, urban hospitals, schools, courts and major casinos, among other locations.
“It’s a free non-contact weapons screening system,” explains Steve Morandi, Evolv’s vice president of product management. “It really works with a combination of AI, advanced sensors and cameras in a really integrated way. And we actually detect weapons versus everyday metal objects that we all carry.”
Bay State-based Liberty Defense has combined AI with 3D imaging capable of detecting non-metallic threats such as powders, pipe bombs and ghost guns made of plastic.
“We are looking for any type of anomaly, any type of threat that could be hidden,” explains Bill Frain, CEO of Liberty Defense. “So whether that’s a gun or a knife or a plastic explosive that can do damage or maybe even drugs or liquids.”
The new HEXWAVE system will be tested this summer at a Hindu temple near Atlanta, the University of Wisconsin and Toronto Pearson International Airport.
The proliferation of AI technology in security has alarmed critics.
“What we don’t want to see is… America turned into a checkpoint society where we’re searched every time we go to a public gathering at a church or other place of worship, a Little League game or whatever,” said Jay Stanley, a senior policy analyst with the ACLU Speech, Privacy, and Technology Project.
Regarding privacy issues, Frain notes, “We don’t store any of the data. No images are stored.”
George says, “We use our artificial intelligence to distinguish between a phone and a firearm, but we don’t really look at the people at all. We’re just looking for weapons.”