Gunshot detection technology company voluntarily submitted itself for an audit after privacy concerns
Placing microphones around a city for the police can make residents uneasy, which makes life tough if that’s your business model.
ShotSpotter is a gunshot detection technology company that places microphones in high gun-crime areas. At the request of local governments, the company has deployed its technology in almost 100 American cities.
Its sensors can detect a gunshot quickly, allowing police to be dispatched without the need for a service call.
While ShotSpotter had privacy protections in place, they weren’t enough for some. The Toronto Police Service, for example, was considering a contract with the company until Ontario’s Ministry of the Attorney General and privacy/legal experts raised concerns. Toronto police decided against a ShotSpotter contract in February 2019.
What happened in Toronto was not a new experience for ShotSpotter. Believing that concerns about its microphones enabling eavesdropping on private conversations stemmed largely from a misunderstanding of the product, the company did something radical: It opened itself up to an independent privacy audit conducted by the Policing Project at the New York University School of Law.
ShotSpotter had been through financial and security audits, but a privacy audit was something new. The process gave researchers at the Policing Project access to ShotSpotter’s technology, contracts, employees and procedures to hunt for possible privacy issues.
Concluding that ShotSpotter created a “low privacy risk,” the report, published in July 2019, found no egregious privacy violations but recommended 11 areas for possible improvement, including keeping the precise location of sensors from local police, more vigorously denying data requests and challenging subpoenas, and updating its policy for sharing data with third parties.
ShotSpotter has adopted or is in the process of adopting all of the recommendations except one, which sought to avoid placing sensors near sensitive locations such as health clinics and schools. The company found that such a ban would be impractical—especially if one of those locations was in or near a high gun-crime zone. “You never know what’s going to happen, so there was definite nervous anticipation about what they would say,” says Sam Klepper, senior vice president for marketing and product strategy at ShotSpotter, “but it was all well worth it.”
As policymakers and local communities grapple with democratic oversight of surveillance technology, ShotSpotter’s privacy audit could be a new way forward. Instigated by the company itself, the audit not only led to internal changes but allayed the concerns of government officials and privacy advocates. Now experts think this could be a step taken by other police technology companies, one that could shape the adoption of surveillance technologies nationwide.
“I think the ShotSpotter audit is a really welcome development,” says Catherine Crump, director of the Samuelson Law, Technology & Public Policy Clinic at the University of California at Berkeley School of Law. “There has been far too little attention paid to the details of how surveillance technologies operate: what data they collect, how that data is shared, how that data is kept.”
ShotSpotter isn’t alone in exploring its ethical implications. Axon Enterprise, a police body camera company, had been weighing whether to adopt facial recognition technology. In a report last year, the company’s AI & Policing Technology Ethics Board, which included members of the Policing Project, decided against it, stating the technology “is not currently reliable enough to ethically justify its use on body-worn cameras.”
Best for business
While ShotSpotter’s approach to privacy has been largely lauded, it isn’t without risk. Specifically, accreditation and certification have the potential to whitewash a technology or company, says Farhang Heydari, executive director of the Policing Project. However, he notes that the Policing Project is not providing a “stamp of approval.” He adds that issues dogging ShotSpotter, like the product’s efficacy, are not addressed by the audit.
Crump at Berkeley adds that people need to interrogate who is undertaking and funding such audits. In the case of NYU Law’s work, ShotSpotter paid for the audit, and its CEO is on the Policing Project’s advisory board. (Disclosure: A Journal story by this author is cited in the audit.)
From a business perspective, the privacy audit already benefited ShotSpotter by helping gain approval from the Oakland, California-based Privacy Advisory Commission.
Founded in 2015, the commission is tasked with protecting citizens’ privacy rights, which includes reviewing government technology, contracts and procedures.
When its existing contract with the city was being reviewed by the commission, ShotSpotter came with the audit in hand.
“We haven’t had a vendor that’s gone so far out of its way to do everything correctly,” says Brian Hofer, a privacy advocate and chair of the commission. “They didn’t just do a privacy audit or just talk to the ACLU or just talk to experts. After, they amended their practices and really made these significant steps in the right direction.” Hofer hopes that other companies see a privacy audit as a competitive advantage.
By contrast, the commission was less forgiving in 2016 when it looked at cell site simulators, a law enforcement technology that prompts nearby phones to reveal identifying data.
Cell site simulators had previously been used by Oakland law enforcement, but according to a recent report, they were not used at all in 2018.
Meanwhile, Berkeley’s Crump argues as privacy audits become the industry standard, “it can also feed into local efforts to regulate surveillance technologies.” She points out that company-supported, independent privacy audits can have a multiplying effect across governments, regardless of whether they have privacy commissions. “I think what you are starting to see is an emerging system for helping communities take control of how surveillance technology is being used,” she says. “We’re very much at the beginning.”
This article was originally published in the April/May 2020 issue under the headline, “Open Book: Amid privacy concerns, a gunshot detection technology company voluntarily submitted itself for an audit—and reaped the benefits.”