The Mystery of AI Gunshot-Detection Accuracy Is Finally Unraveling

This week, New York City’s comptroller published a similar audit of the city’s ShotSpotter system showing that only 13 percent of the alerts the system generated over an eight-month period could be confirmed as gunfire. The auditors noted that while the NYPD has the information necessary to publish data about ShotSpotter’s accuracy, it does not do so. They described the department’s accountability measures as “inadequate” and “not sufficient to demonstrate the effectiveness of the tool.”

Champaign and Chicago have since canceled their contracts with Flock Safety and SoundThinking, respectively.

“Raven is over 90 percent accurate at detecting gunshots with around the same accuracy percentage at detecting fireworks,” Josh Thomas, Flock Safety senior vice president of policy and communications, tells WIRED in a statement. “And critically, Raven alerts officers to gun violence incidents they never would have been aware of. In the San Jose report, for example, of the 111 true positive gunshot alerts, SJPD states that only 6 percent were called in to 911.”

Eric Piza, a professor of criminology at Northeastern University, has conducted some of the most thorough studies available on gunshot-detection systems. In a recent study of shooting incidents in Chicago and Kansas City, Missouri, his team found that police responded faster to shooting incidents, stopped their vehicles closer to the scene, and collected more ballistic evidence when responding to automated gunshot alerts than when responding to 911 calls. However, there was no reduction in gun-related crime, and police were no more likely to solve gun crimes in areas with gunshot sensors than in areas without them. That study examined only confirmed shootings; it did not include false-positive incidents in which the systems incorrectly identified gunfire.

In another study in Kansas City, Piza found that shots-fired reports in areas with gunshot sensors were 15 percent more likely to be classified as unfounded compared to shots-fired reports in areas without the systems, where police would have relied on calls to 911 and other reporting methods.

“If you look at the different goals of the system, research shows that [gunshot detection technology] typically tends to result in quicker police response times,” Piza says. “But research consistently has shown that gun violence victimization doesn’t reduce after gunshot detection technology has been introduced.”

The New York City comptroller recommended the NYPD not renew its current $22 million contract with SoundThinking without first conducting a more thorough performance evaluation. In its response to the audit, the NYPD wrote that “non-renewal of ShotSpotter services may endanger the public.”

In its report, San Jose’s Digital Privacy Office recommended that the police department continue looking for ways to improve accuracy if it intends to keep using the Raven system.

Pointing to the report’s finding that only 6 percent of the confirmed gunshots detected by the system were also reported to police via 911 calls or other means, police spokesperson Sergeant Jorge Garibay tells WIRED that the SJPD will continue to use the technology. “The system is still proving useful in providing supplementary evidence for various violent gun crimes,” he says. “The hope is to solve more crime and increase apprehension efforts desirably leading to a reduction in gun violence.”
