Litigation

Fishing for Bias: Wildlife research techniques find gaps in court record

Edward Cheng. Photograph by David Mudd

Richard Leo is one of a handful of nationally recognized experts in the psychological science of how people are influenced and make decisions under duress. When defense attorneys began to call on him to testify in cases in which a confession might have been coerced, he says, prosecutors often fought vigorously to exclude his testimony.

“A few of us who were around in the early days can tell you even though the overwhelming majority of experts in our field were admitted to testify, prosecutors ... used their influence to try to prevent this kind of testimony,” Leo says.

Now Vanderbilt University law professor Edward Cheng says he’s discovered that publication bias in court records can conceal how often experts are allowed to testify, and he’s found a way to prove it.

“The published case law suggested that these experts were not legitimate, when in fact they were involved in a lot of cases,” Cheng says.

He began researching the question more than a decade ago after moderating a panel discussion in New York City with scholars who work in the field of wrongful convictions. He told the panel he could find no reported New York criminal case in which an expert in the field had been admitted to testify. The panelists said they testified regularly in such matters.

“At first, I was embarrassed that I was talking to these nationally recognized experts, and I had failed to adequately research their work,” he says. “But later, I realized that the real problem may be with the data.”

Publication bias is an insidious problem in scientific, medical and social science journals. Academics have found that scientific literature often ignores research unless it has a statistically significant result. Cheng claims a similar bias is found in legal opinions, especially around evidentiary rulings in criminal cases.

“I began to see that court opinions were over-reporting one type of outcome while failing to reflect others,” he says. “The problem is trying to prove something exists when there are no good records.”

To find out what’s going on, Cheng borrowed a statistical method from ecological studies, where researchers compare overlapping wildlife samples to estimate the size of an animal or fish population. Using databases including Westlaw, LexisNexis, Bloomberg BNA, Fastcase and Daubert Tracker, he compared which reported cases involving such expert testimony turned up in each source.

By comparing the various data sets, he was able to estimate how many times these experts actually testify in court.
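
In ecology, this kind of comparison is known as capture-recapture: the overlap between two independent samples reveals how much of the population each sample missed. Below is a minimal two-list sketch of the idea in Python; the source names and counts are hypothetical, and a five-database comparison like Cheng’s calls for a more elaborate multi-list model.

    def chapman_estimate(n1, n2, both):
        """Chapman's version of the Lincoln-Petersen capture-recapture estimator.

        n1:   cases found in source A (say, Westlaw)
        n2:   cases found in source B (say, LexisNexis)
        both: cases appearing in both sources
        """
        return (n1 + 1) * (n2 + 1) / (both + 1) - 1

    # Hypothetical counts: 120 cases in one source, 90 in the other, 60 shared.
    total = chapman_estimate(120, 90, 60)
    print(f"estimated total cases: {total:.0f}")  # about 180, vs. 150 observed

The less the two sources overlap, the more cases the estimator infers were never captured by any database at all.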

“You only need fancy statistics like this when the data is really bad,” he says. “If we had more information, like lists of all expert witnesses or transcripts of their testimony, we wouldn’t need to resort to this approach.”

When Cheng crunched the numbers, the real admissibility rate for experts in this area was significantly higher than what the case law shows: his model estimates that an observed 16 percent admissibility rate may actually be closer to 28 percent.

“When there is a fight over testimony, it is more likely to generate a written opinion, so if another judge or lawyer is looking at the body of the case law, they see more cases of exclusion rather than admission,” he says. “That means all the factors cut the same way, toward exclusion.”
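
To see how that skew produces the gap Cheng measured, consider a toy calculation with invented publication probabilities, in which a true 28 percent admission rate surfaces as roughly 16 percent in the published record:

    # Invented numbers: 250 rulings actually occur, 70 admitting the expert
    # and 180 excluding, and an exclusion is twice as likely as an admission
    # to produce a written, published opinion.
    true_admits, true_excludes = 70, 180
    p_publish_admit, p_publish_exclude = 0.4, 0.8

    pub_admits = true_admits * p_publish_admit        # 28 published admissions
    pub_excludes = true_excludes * p_publish_exclude  # 144 published exclusions

    observed = pub_admits / (pub_admits + pub_excludes)   # about 16 percent
    actual = true_admits / (true_admits + true_excludes)  # exactly 28 percent
    print(f"observed: {observed:.0%}, actual: {actual:.0%}")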

MORE DATA, PLEASE

The heart of the problem is that evidentiary rulings are rarely recorded in trial court opinions. When such a ruling is challenged, however, it often will be noted in a subsequent appellate opinion.

“There might be a handful of cases in which the court tackles the question of whether exclusion was an abuse of discretion, which gets reported,” Leo says. “But it means hundreds of other cases in which experts are allowed to testify are never recorded.”

Cheng thinks more court records must be made available to the public. Leo suggests a national database of expert witnesses but wonders whether any group has the impetus, funds or other means to correct the issue. “There seems to be a clear imbalance in the way the legal system publishes records,” Leo says. “Unfortunately, I don’t see anyone within the system who is motivated enough to correct it.”

This article originally appeared in the December 2016 issue of the ABA Journal with this headline: "Fishing for Bias: Wildlife research techniques can help find gaps in the court record."
