The National Pulse

Predictive policing may help bag burglars, but it may also be a constitutional problem

Illustration by Viktor Koen

Imagine a police officer at roll call. He gets a printout stating that at a certain time, on a particular city block, there’s a certain percentage chance that a burglary will take place. Motivated by the odds, the officer heads over to that neighborhood around that very time. While there, he spots a man carrying a black bag.

Does the printout, combined with the officer’s observations, amount to reasonable suspicion such that the man could be appropriately stopped and searched? That’s just one example of the constitutional questions raised by the new, data-based approach to public safety known as predictive policing.

Named by Time magazine as one of the 50 best inventions of 2011, predictive policing uses crime-mapping software to leverage analytics, deploy personnel and take other targeted measures—such as installing lights or establishing a community patrol program—all in an effort to reduce crime and recidivism. The idea is to compile past crime details, run them through algorithms and identify future hot spots of specific crimes, such as burglary, down to individual blocks or even smaller areas.
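At its core, that process can be as simple as counting where and when a crime type has recently clustered and ranking the results. The sketch below is a deliberately minimal illustration of that counting step, not any vendor's actual product; the block names, dates and the rank_hot_spots helper are invented for the example, and real systems layer far more sophisticated statistical models on top.

```python
# Toy illustration only: rank city blocks by recent burglary counts.
# The records and the rank_hot_spots helper are invented for this sketch;
# commercial predictive-policing tools use far richer, proprietary models.
from collections import Counter
from datetime import date

# Hypothetical historical burglary reports: (date, block identifier)
incidents = [
    (date(2013, 5, 1), "Elm St 400 block"),
    (date(2013, 5, 2), "Elm St 400 block"),
    (date(2013, 5, 2), "Oak Ave 1200 block"),
    (date(2013, 5, 4), "Elm St 400 block"),
]

def rank_hot_spots(records, top_n=3):
    """Rank blocks by raw incident count, a crude stand-in for the
    algorithms the article describes."""
    counts = Counter(block for _, block in records)
    return counts.most_common(top_n)

print(rank_hot_spots(incidents))
# [('Elm St 400 block', 3), ('Oak Ave 1200 block', 1)]
```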

“Crime is extremely ordinary behavior,” says George Mason University professor Cynthia Lum, a former Baltimore police officer who is now studying evidence-based policing. “It’s predictable. It’s most likely to occur tomorrow where it occurred yesterday. We know that about offenders too: People who commit crimes are likely to commit them again.”

Predictive policing analyzes criminal cycles, patterns, trends, behavior and relationships to aid law enforcement in making policy decisions and effectively establishing priorities for personnel and other scarce resources. Although crime analysis has long been used for offenses such as fraud and identity theft, predictive policing’s technology-based methods are now being applied to property crimes. As the software and its application grow more sophisticated, the practice is expected to spread to drug crimes, gang violence and even terrorism.

FOURTH AMENDMENT CONCERNS

Yet some caution that predictive policing raises troubling privacy and civil liberties questions. For example, it could be used to stigmatize neighborhoods or stop and frisk people in areas identified by something as unintuitive and inflexible as software. “No court has yet ruled on the impact of predictive policing,” says Andrew Guthrie Ferguson, a law professor at the University of the District of Columbia and a leading academic on predictive policing. “Is it like an informant tip? Or profiling? Very soon, I predict we’ll see a Fourth Amendment case before the court.”

By most accounts, the term itself was coined by Los Angeles Police Chief Charles Beck in a 2009 issue of The Police Chief magazine. Since then, municipalities as diverse as Charleston, S.C., Memphis, Tenn., and Santa Cruz, Calif., have implemented predictive policing software.

Adopting this state-of-the-art technology often results in a public relations boost for police departments and, not surprisingly, the practice has spawned a multimillion-dollar industry. Many big players are creating customizable software. IBM, for example, acquired the popular SPSS software for statistical analysis. The software company PredPol, meanwhile, uses the same algorithms that seismologists use to predict earthquake aftershocks.
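The earthquake comparison refers to self-exciting point-process models, in which each past crime briefly raises the expected rate of similar crimes nearby before the effect fades. PredPol's actual algorithm is proprietary; the sketch below is a generic, simplified version of that idea, and the parameter values (MU, ALPHA, OMEGA) and event times are invented for illustration.

```python
# Illustrative sketch of a self-exciting ("aftershock"-style) rate estimate:
# each past event adds a boost that decays exponentially with elapsed time.
# All numbers here are assumptions chosen only to make the example run.
import math

MU = 0.1      # assumed baseline rate (events per day) in one map cell
ALPHA = 0.5   # assumed boost each past event contributes right after it occurs
OMEGA = 0.8   # assumed decay speed of that boost, per day

def intensity(t, past_event_times):
    """Estimated event rate at time t (in days), given times of past events."""
    boost = sum(
        ALPHA * math.exp(-OMEGA * (t - s))
        for s in past_event_times
        if s < t
    )
    return MU + boost

# Two recent burglaries (days 9.0 and 9.5) raise the predicted rate on day 10.
print(round(intensity(10.0, [9.0, 9.5]), 3))
```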

But George Mason’s Lum doesn’t think predictive policing is all that different from conventional crime-prevention strategies. When asked to define it, she says: “That in and of itself is the question. Before predictive policing, there was crime analysis—which was a good, common tool—through which analysts studied patterns, trends, repeat offenders, modus operandi. But it remains to be seen whether the algorithms and the software used in predictive policing result in being able to better prevent and reduce crime compared to the use of crime analysis techniques more generally.”

Lum attended the National Institute of Justice’s first predictive policing symposium in 2009, at which she observed “a lot of energy and a lot of people,” including academics, police, crime analysts and community members. “Frankly, it was surprising that it was being introduced as something new,” she recalls. “Crime analysts had already been doing this. The main tool is similar: statistical analysis. Predictive policing is a buzzword for the use of mathematics, statistics and algorithms to find out where and when crime might occur next. But it remains to be seen whether it’s any more useful than tools already used.”

Yet according to Brigham Young University law professor Shima Baradaran, a co-chair of the ABA Criminal Justice Section’s Crime Prevention, Pretrial and Police Practices Committee, great advances in predictive policing have resulted from improved empirical techniques and refined application. The practice can also be applied to offender monitoring.

“Judges can better predict who will commit crimes when released on bail,” which means judges can release more people while still reducing crime rates, says Baradaran, who co-wrote a 2012 Texas Law Review article that addresses the application of predictive analysis to pretrial detention considerations. “Police can be used more efficiently if crime rates and recidivism are tracked. For instance, if particular parts of a certain city are more burdened with crime during certain times of the day or week or year, police can plan and try to avoid this.”

If the predictive algorithms work, they’re unquestionably beneficial for police departments, says the UDC’s Ferguson. But that’s the primary concern: whether they actually work. So far, predictive policing has been applied to property crimes such as burglary, car theft and thefts from cars—which, he explains, come with “decades of social science research.” But it hasn’t yet been routinely applied to violent crimes like gang violence or shootings.

MINDING THE DATA

Then there’s the tricky issue of oversight. Predictive policing is only as good as the officers and analysts who handle the data, Ferguson says. And differences in data collection can affect the results, projecting a certain outcome or tipping priorities toward certain crimes, for example. “Who polices the predictive policing?” Ferguson asks. “There must be transparency and accountability.” But the software and its application can become proprietary, “so we don’t know how data is inputted or analyzed” or whether it’s being manipulated.

In Memphis, for example—an early adopter of Blue CRUSH (Crime Reduction Utilizing Statistical History), which uses IBM’s technology—an audit revealed that reams of data were never inputted into the software. The discrepancy was discovered long after the city had claimed the practice helped reduce serious crime by more than 30 percent since 2006.

To prevent this, the National Institute of Justice insists that those supervising predictive policing must be diligent in overseeing the information-gathering process and determining what is retained in databases. They must also “distinguish intelligence from information, which determines what is and is not protected under privacy laws,” according to an NIJ paper, which also recommends developing policies about what information can be shared with other agencies.

For its part, the LAPD, the predictive policing pioneer, conducts double-blind testing of its technologies to prevent flawed data, according to Ferguson. “They’ve been careful. They use rigorous discipline. But there’s a concern that other jurisdictions might not use the same cautionary steps.”

And then there are related constitutional concerns, such as whether the officer can stop and search the guy carrying the black bag on the corner he’s patrolling because of the printout he got at roll call. Some argue that predictive policing can result in a reduced reasonable-suspicion standard, which can trigger racial or class profiling or other discriminatory approaches. In the worst cases, predictive policing can be used to target minorities, including juveniles, resulting in minorities being run out of neighborhoods or disproportionately arrested—all equal-protection concerns, Baradaran adds.

From Lum’s perspective, what’s important is not just predicting and preventing crime, but police departments building their capacity to achieve those goals within constitutional limits. “It isn’t just about predicting the hot spot and getting there,” she explains. “What also matters is what they do while they are there. To be effective, officers not only have to be proactive and preventative; they also have to do so while protecting rights and treating people with dignity and respect. That balance is at the heart of American criminal justice.”
