Midyear Meeting

Pretrial risk-assessment tools should only be used if they're transparent and unbiased, warns ABA House


A pair of handcuffs made of numbers. Image from Shutterstock.

The ABA House of Delegates passed a resolution Monday urging governmental entities to refrain from using pretrial risk-assessment tools unless “the data supporting the risk assessment is transparent, publicly disclosed and validated to demonstrate the absence of conscious or unconscious racial, ethnic or other demographic, geographic or socioeconomic bias.”

Resolution 700, which was approved as revised at the 2022 ABA Midyear Meeting by a vote of 322-30, also urges governmental bodies to recognize that an individual’s criminal history and other criteria reflected in risk-assessment tools or pretrial release evaluations “may reflect structurally biased application of laws, policies or practices.”

Additionally, this information may reflect “conscious or unconscious racial and ethnic or other demographic, geographic or socioeconomic bias on the part of law enforcement, prosecutor offices, judges and all other personnel utilizing risk-assessment tools in connection with pretrial release,” according to the resolution.

“The problem is the mathematical models, the risk assessments are only as good as the data that goes into them,” said Stephen Saltzburg, introducing the resolution on behalf of the ABA Criminal Justice Section. “This resolution recognizes that these pretrial assessments can be dangerous, although well intentioned.”

The resolution was co-sponsored by the Criminal Justice Section, the Section of Civil Rights and Social Justice and the National Bar Association.

Judge Denise Langford Morris spoke in favor of Resolution 700 on behalf of the National Bar Association.

“I believe the entire law enforcement community within the criminal justice system, including judges, prosecutors, defense attorneys, pretrial services agents and law enforcement will benefit from transparency in the process, policies and practices as these important tools are utilized and relied upon more and more here in America,” said Langford Morris, a Michigan judge.

The Resolution 700 report says the factors used to create algorithms reflect systemic racism in the criminal justice system. These factors include age at first arrest, number of prior convictions, prior incarcerated sentences, employment or education status and prior missed court appointments.

“Research has also proven that at every step of the pretrial (before adjudication) process, from the deployment of police resources, the decision to arrest, to the amount of money bail set, to changes in charges and to the use of diversion programs, the system disproportionately harms Black people and other marginalized people,” the resolution report says.


According to a 2019 review of pretrial practices by the Pretrial Justice Institute, two out of three U.S. counties reported using pretrial assessment tools. Meanwhile, just 45% of counties with pretrial assessment tools reported having “validation studies to ensure continuing accuracy with regard to court appearance and public safety,” according to the Pretrial Justice Institute.

Resolution 700 urges governmental bodies to require that pretrial risk-assessment tools and pretrial release evaluations “undergo ongoing independent and objective evaluation and monitoring to determine whether they have had an adverse racial, ethnic or other demographic, geographic or socioeconomic impact and, if so, to require modifications to address such impact.”

“Our modern system for making pretrial release evaluations must be transformed to reflect a system that honors the presumption of innocence, the right to pretrial liberty and equal justice for all people,” the resolution report says.
