- James Clayton
- BBC, North American Tech Reporter
The ShotSpotter Incident Review Room is like any other call center.
Analysts with headsets sit by computer screens, listening intently.
However, the people who work here have an extraordinary responsibility.
They make the final decision on whether a computer algorithm has correctly identified a shot and whether they should send the police.
Making an incorrect call has serious consequences.
ShotSpotter has received a lot of negative press in the last year. The allegations range from inaccurate technology to claims that it is fueling discriminatory policing.
In the wake of this negative news, the company gave BBC News access to its national incident review center.
ShotSpotter is trying to solve a real problem.
“We believe that what makes the system so attractive is that 80% to 95% of shots go unreported,” says company CEO Ralph Clark.
People don’t report gunshots for a variety of reasons: They may be unsure of what they have heard, think someone else will call 911, or simply not trust the police.
So the founders of ShotSpotter had an idea. What if they could skip the 911 process entirely?
They came up with a system that places microphones around a neighborhood. When a loud bang is detected, a computer analyzes the sound and classifies it as either a gunshot or something else. Then a human analyst steps in to review the decision.
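The two-stage process described above, a machine classification followed by a human analyst's final call, can be sketched roughly as follows. This is a simplified illustration; the function names, labels, and logic are assumptions for the sketch, not ShotSpotter's actual code.

```python
# Simplified sketch of the detection pipeline described in the article.
# All names and logic are illustrative assumptions, not ShotSpotter's code.

def classify_loud_bang(audio_clip):
    """Stand-in for the acoustic classifier: returns a label and a confidence.
    A real system would run a trained model over the recorded waveform."""
    if "gunshot" in audio_clip:
        return "gunshot", 0.9
    return "other", 0.8

def review_pipeline(audio_clip, human_review):
    """Machine classification followed by a human analyst's final decision."""
    label, confidence = classify_loud_bang(audio_clip)
    if label != "gunshot":
        return "dismissed"
    # The analyst listens to the clip and can overrule the algorithm.
    return "alert_police" if human_review(audio_clip) else "dismissed"

# Example: the analyst confirms the machine's classification.
decision = review_pipeline("gunshot_recording", human_review=lambda clip: True)
print(decision)  # alert_police
```

The point of the design is that the algorithm only filters candidates; a human makes the final call on whether police are dispatched.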
In the incident review room, former teacher Ginger Ammon lets me sit with her as she reviews these decisions in real time.
Every time the algorithm flags a potential gunshot, it makes a “ping” sound.
Ammon first listens to the recording herself and then studies the waveform it produces on her computer screen.
“We’re looking to see how many sensors picked it up and whether the sensors registered a directional pattern, because in theory a shot can only travel in one direction,” she says.
Once she is certain that a shot has occurred, Ammon presses a button that sends police officers to the scene.
Everything happens in less than 60 seconds.
“It feels like you’re playing a computer game,” I tell her.
“That is a comment we receive frequently,” she responds.
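The check Ammon describes, comparing which sensors heard the bang and when, is essentially time-difference-of-arrival localization. The toy sketch below illustrates the idea with a brute-force grid search; the sensor positions, spacing, and search method are invented for illustration and are not ShotSpotter's actual configuration.

```python
import math

# Illustrative sensor layout (metres): four microphones on a 400 m square.
SPEED_OF_SOUND = 343.0  # metres per second
sensors = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]

def locate(arrival_times):
    """Grid-search for the point whose predicted arrival-time differences
    best match the observed ones (a toy form of multilateration)."""
    observed = [t - min(arrival_times) for t in arrival_times]
    best, best_err = None, float("inf")
    for xi in range(41):
        for yi in range(41):
            x, y = xi * 10.0, yi * 10.0  # 10 m grid over the square
            dists = [math.hypot(x - sx, y - sy) for sx, sy in sensors]
            predicted = [(d - min(dists)) / SPEED_OF_SOUND for d in dists]
            err = sum((p - o) ** 2 for p, o in zip(predicted, observed))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Simulate a bang at (100 m, 200 m) and recover its position:
times = [math.hypot(100 - sx, 200 - sy) / SPEED_OF_SOUND for sx, sy in sensors]
print(locate(times))  # (100.0, 200.0)
```

With several sensors reporting consistent time differences, the arrival pattern points back to a single origin; inconsistent differences are one sign the bang was not what it seemed.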
There are clear examples that ShotSpotter works.
In April 2017, black supremacist Kori Ali Muhammad started a shooting in Fresno, California.
Trying to kill as many white men as possible, he walked through a residential neighborhood, choosing his targets.
Calls were coming in to 911, but they weren’t specific.
Nevertheless, ShotSpotter was able to indicate Muhammad’s path to police.
After three minutes and three murders, Muhammad was captured.
Fresno police believe that without ShotSpotter, the man would have killed more people.
“ShotSpotter showed us the path he took,” says Lt. Bill Dooley.
The company has been highly successful in convincing law enforcement to adopt its technology.
Its microphones are in more than 100 cities across the United States, and for years, the technology was free of controversy.
All that changed with the murder of George Floyd, as people became interested in the technology that so many police forces were using.
ShotSpotter is too expensive for the police to deploy throughout a city.
Instead, the microphones are typically placed in inner-city areas, which in practice are often the zones with the largest Black populations.
So if the technology is not as accurate as claimed, it could have a disproportionate impact on those communities.
Suddenly, ShotSpotter became the center of attention.
ShotSpotter claims to be 97% accurate. That would mean that law enforcement can be fairly confident that when a ShotSpotter alert occurs, they are almost certainly responding to a shot.
But that claim is exactly that: a claim. It’s hard to see how ShotSpotter can know it is so accurate, at least based on the public information it has released.
And if the figure doesn’t hold up, that could have far-reaching consequences for American justice.
The first problem with that accuracy claim is that it is often difficult to tell if a shot has occurred.
When the Chicago Inspector General investigated, he found that only 9% of ShotSpotter alerts had any physical evidence of a shot.
“It’s a low number,” says the city’s Deputy Inspector General for Public Safety, Deborah Witzburg.
That means that in 91% of police responses to ShotSpotter alerts, it’s hard to say definitively that a gun was fired. That does not mean that there were no shots, but it is difficult to prove that there were.
The shots sound a lot like the popping of a firecracker.
So how come ShotSpotter is so sure it is almost 100% accurate? I ask Clark.
“We rely on ground truth from the police to tell us when we miss detections or misclassify them,” he tells me.
But critics say this methodology has a fundamental flaw. If the police are not sure whether a shot has occurred, they will not tell the company that it was wrong.
In other words, critics say, the company has been counting “don’t know,” “maybe” and “probably” as “hits.”
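The critics' objection can be made concrete with a toy calculation. The numbers below are invented for illustration and are not ShotSpotter's or Chicago's actual figures; they only show how the choice of what counts as a "hit" changes the headline accuracy.

```python
# Toy illustration of how counting unconfirmed alerts as hits inflates
# accuracy. All numbers are invented for illustration.
alerts = 1000
confirmed_real = 90    # police found physical evidence of a shot
confirmed_false = 30   # police explicitly reported the alert was wrong
unconfirmed = alerts - confirmed_real - confirmed_false  # nobody followed up

# If only explicitly reported misses count as errors, every unconfirmed
# alert is silently treated as a hit:
vendor_style_accuracy = (alerts - confirmed_false) / alerts

# If only alerts with physical evidence count as hits, the picture differs:
conservative_accuracy = confirmed_real / alerts

print(f"{vendor_style_accuracy:.0%}")  # 97%
print(f"{conservative_accuracy:.0%}")  # 9%
```

The same underlying data can support a near-perfect accuracy figure or a very low one, depending entirely on how the large pool of unconfirmed alerts is counted.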
Brendan Max, a Chicago attorney, says the company’s accuracy claims are “marketing nonsense.”
“Customer feedback surveys (in ShotSpotter’s case, feedback from the police) are designed to decide whether people like Pepsi or Coca-Cola better,” he says. “They are not designed to determine whether a scientific method works.”
Conor Healy, who analyzes security systems for the IPVM video surveillance research group, is also deeply skeptical of the 97% accuracy figure.
“Putting the responsibility on the police to report every false positive means expecting them to file reports when nothing has happened, and they are unlikely to do that,” says Healy.
“It is fair to assume that if [ShotSpotter] had solid data to back up their claims, they would have every incentive to publish it,” he adds.
Rising gun crime
Back in Fresno, I join Police Officer Nate Palomino on an evening patrol.
Fresno has one of the worst firearm crime rates in California, and like many other cities in the United States, it has gotten worse in the last two years.
Sure enough, a ShotSpotter alert comes in. But when we arrive at the scene, police find no bullet casings and no other physical evidence of a gunshot.
Officer Palomino tells me that the audio recording sounds like a gunshot, and it seems more than possible that it is, but it is difficult to prove.
He also says that this scenario is typical.
ShotSpotter’s accuracy should be beyond question.
It has been used in courts across the country as evidence to both defend and prosecute someone.
The concern is that if it’s not as accurate as claimed, ShotSpotter is sending police into situations where they mistakenly expect shots.
Alyxander Godwin, who has been campaigning to get rid of ShotSpotter in Chicago, sums up the concern.
“The police expect these situations to be hostile,” he says.
“They expect there to be a weapon, and because of where it is deployed, they expect a black or brown person to have a weapon,” he adds.
But ShotSpotter says there is no data to support this theory.
“What that would describe is a situation where officers come to the scene and basically shoot unarmed people,” Clark says.
“That is simply not in the data, it is speculation,” he adds.
However, he seems to also accept that the company’s own precision methodology has its limitations.
“It might be fair criticism to say ‘hey look, you’re not getting all the feedback you could possibly get,'” says Clark.
“That could be a fair criticism.”
Max, the Chicago attorney, says the ShotSpotter reports should not be allowed as evidence in court until the company can better support its claims.
“In the last four or five months, I am aware of dozens of Chicagoans who have been arrested based on the ShotSpotter evidence,” he says.
“I’m sure that has happened in cities across the country,” he says.
He also says that the company should open up its systems to better review and analysis.
For instance: who is independently reviewing the quality of the analysts? And how often does the algorithm disagree with the human analyst?
Certainly, during the time I spent in the ShotSpotter incident review center, it was common for analysts to disagree with the computer’s classification.
“It’s just filtering what we see,” says Ammon.
“But honestly, I don’t even look at it [the classification]. I’m busy looking at the sensor patterns,” she explains.
It is an interesting admission. Technology is sometimes seen as all-seeing and all-knowing: the computer masterfully detects a shot.
But in practice, analysts play a much more important role than expected.
Attorneys like Brendan Max are interested in learning more about how technology works in court.
ShotSpotter has received a lot of criticism during 2020, not all of it fair.
And much of the coverage casually omits the fact that police forces often give glowing reviews of the technology’s effectiveness.
The company is keen to highlight cases where ShotSpotter alerts have led police to shooting victims, for example, saving lives.
In several cities across the United States, activists are trying to persuade cities to terminate ShotSpotter contracts.
But elsewhere, ShotSpotter is expanding.
In Fresno, Police Chief Paco Balderrama wants to expand its coverage, at a cost of $1 million a year.
“What if ShotSpotter only saves one life in a given year? Is it worth a million dollars? I’d say yes,” he says.
The debate surrounding ShotSpotter is enormously complex and has significant potential ramifications for law enforcement in the United States.
The discussion is unlikely to disappear until the accuracy of the technology and data is independently verified.