Legality of using predictive data to determine sentences challenged in Wisconsin Supreme Court case
In Wisconsin, 34-year-old Eric Loomis was arrested in February 2013 for driving a car that had been used in a shooting. Loomis, a registered sex offender, was later sentenced to six years in prison in part because a risk-assessment score labeled him a “high risk” to the community.
Loomis challenged the use of the score during his appeal in April, arguing that it violated his right to due process because he was unable to review the algorithm and raise questions about its validity. The Wisconsin Supreme Court has yet to rule.
This test is called Compas, or Correctional Offender Management Profiling for Alternative Sanctions. Developed by private company Northpointe Inc., the Compas assessment is a secret algorithm used by authorities in Wisconsin that predicts the likelihood of criminal conduct and suggests the types of supervision given to inmates, the New York Times explained.
The news organization also reported Compas assessments are different for men, women and juveniles. However, the details of the test “are kept secret.” The Wall Street Journal described the test as having 137 questions covering topics that include criminal and parole history, social life, drug use, beliefs, and community ties.
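Because the Compas methodology is proprietary and kept secret, its actual scoring cannot be shown. Purely as a hypothetical illustration, a questionnaire-based risk score could work by weighting answers, summing them, and bucketing the total into a coarse risk category. Every number, weight, and cutoff below is invented for the sketch:

```python
# Hypothetical sketch only: Compas's real method is proprietary and secret.
# This shows one generic way a questionnaire-based risk score *could* work:
# answers are weighted, summed, and bucketed into low/medium/high categories.

def risk_score(answers, weights):
    """Weighted sum of numeric questionnaire answers."""
    return sum(a * w for a, w in zip(answers, weights))

def risk_category(score, low_cut=10, high_cut=20):
    """Map a raw score to a coarse risk label (cutoffs are made up)."""
    if score < low_cut:
        return "low"
    if score < high_cut:
        return "medium"
    return "high"

# Example: five answers on a 0-3 scale with arbitrary weights.
answers = [3, 2, 0, 1, 3]
weights = [2.0, 1.5, 1.0, 0.5, 3.0]
print(risk_category(risk_score(answers, weights)))  # prints "medium"
```

The due-process objection in the Loomis appeal is precisely that none of the real system's equivalents of these weights and cutoffs can be inspected or challenged by the defendant.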
This case raises questions about the use of predictive data in the criminal justice system and its reliability.
The use of algorithms is not limited to sentencing. Police in Chicago have used data to identify people who are likely to shoot or be shot, the New York Times recently reported, while authorities in Kansas City, Mo., have used data to identify possible criminals.
Some supporters of data use in the criminal justice system see algorithms as a way to free up space in crowded prisons by identifying those who pose little threat to communities. The New York Times said these proponents believe justice can be served more efficiently by using numerical evidence rather than “personal judgments.”
Similarly, since 2010, Pennsylvania’s sentencing commission has been developing algorithms to predict whether a defendant is likely to be arrested again and to recommend sentencing ranges, among other outputs. Because they are being developed by a public agency, the proposed algorithms (PDF) are public.
Still, Ezekiel Edwards, the director of the Criminal Law Reform Project at the American Civil Liberties Union, told the New York Times that data from the criminal justice system was frequently unreliable, and predictive data could heighten inequalities in the system if prejudices are present.
Edwards said: “We are kind of rushing into the world of tomorrow with big-data risk assessment without properly vetting, studying and ensuring that we minimize a lot of these potential biases in the data.”