Artificial intelligence predicted case outcomes with 79% accuracy by analyzing fact portrayal
Researchers were able to predict the results of human rights cases with 79 percent accuracy by using artificial intelligence to analyze the factual sections of published human rights judgments.
The study, published in PeerJ Computer Science, found that the outcomes were best predicted by analyzing the “circumstances” section of a case—which includes factual background—along with the topics covered by the case and the language used, according to a press release. Publications covering the findings include the Wall Street Journal Law Blog, Law.com (sub. req.), the Guardian and Motherboard.
The researchers applied a machine-learning algorithm to 584 cases decided by the European Court of Human Rights. They found that the court’s judgments correlated strongly with the facts of a case rather than with the legal arguments.
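The general approach can be illustrated with a toy text classifier. This is only a rough sketch, not the researchers’ actual method: it assumes n-gram features extracted from the text of a judgment, and it uses a simple stdlib-only nearest-centroid classifier in place of the study’s model. All training snippets below are invented stand-ins for real “circumstances” sections.

```python
# Hedged sketch: bag-of-words n-gram features plus a nearest-centroid
# classifier, standing in for the study's actual model. The snippets are
# invented examples, not real case text.
from collections import Counter
import math

def features(text):
    """Count word unigrams and bigrams in a piece of text."""
    words = text.lower().split()
    grams = Counter(words)                 # unigrams
    grams.update(zip(words, words[1:]))    # bigrams as tuples
    return grams

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def centroid(texts):
    """Sum the feature counts of all training texts for one label."""
    total = Counter()
    for t in texts:
        total.update(features(t))
    return total

# Invented training data (assumption: real inputs would be full fact sections)
violation = ["applicant detained without judicial review for months",
             "prolonged detention no effective remedy available"]
no_violation = ["complaint examined promptly remedy provided by domestic courts",
                "domestic courts provided prompt effective review"]

centroids = {"violation": centroid(violation),
             "no-violation": centroid(no_violation)}

def predict(text):
    """Label a new text by its most similar class centroid."""
    f = features(text)
    return max(centroids, key=lambda label: cosine(f, centroids[label]))

print(predict("applicant held in detention with no judicial review"))
# prints "violation"
```

The design point is the same one the study makes: nothing here encodes legal doctrine. The classifier keys entirely off surface patterns in the factual language, which is why a correlation between fact wording and outcomes is enough for it to predict well.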
Ideally, the researchers said, they would have tested their algorithm on applications lodged with the court rather than on published judgments, but they didn’t have access to that data. Assuming the text of applications and briefs is similar to the corresponding sections of published judgments, the approach could be used to predict outcomes before a judgment is issued, the study says.
The findings could help prioritize cases and identify which are most likely to involve violations of the European Convention on Human Rights, according to the researchers. “We don’t see AI replacing judges or lawyers,” said University College London computer scientist Nikolaos Aletras in the press release. Also working on the study were academics from the University of Sheffield and the University of Pennsylvania.
The researchers acknowledge that the circumstances section of a case is not a neutral statement of the facts. It may reflect the court’s own judgments about what is important, and could be written to support a particular outcome. It’s also possible, the researchers say, that judges appeared to react to the facts because the cases had been selected precisely for their indeterminate legal issues.