The End of Reality? How to combat deepfakes in our legal system
There’s nothing fake about it: The legal industry is facing a big problem with deepfakes. Courtrooms are not yet flooded with deepfake evidence, but with this artificial intelligence-generated media succeeding so well on social media and in fraud schemes, it’s only a matter of time before deepfakes regularly turn up on exhibit lists.
Cases with a deepfake component already include:
- United States v. Reffitt: A defendant’s counsel argued that the prosecution’s evidence could be a deepfake.
- Sz Huang v. Tesla: A defendant’s counsel argued that video evidence of a party’s principal’s statements could be a deepfake.
- United States v. Doolin: A court allowed introduction of video evidence that the defense argued could have been a deepfake.
- Al-Qarqani v. Chevron Corp.: An attorney submitted an exhibit reciting an allegedly false timeline of facts published in a newspaper that does not exist.
According to a March 2023 article in the New York Times, many of the apps and tools used to create deepfakes are available to anyone with a smartphone and are free or inexpensive, making it far too easy to create or alter digital evidence.
Amplifying the problem is the fact that “right now, truth is starting to become a matter of degree,” says Maura R. Grossman, a research professor at the University of Waterloo and an attorney and e-discovery special master. Grossman points to examples, such as touching up an exhibit photo to turn a frown into a smile. This small change may be immaterial to a case—or not.
What about changing a few pixels to make it impossible to tell whether someone is holding a phone or a gun in their hand? Materially altered photo and video evidence will be thrown out when it is caught, but misinformation based on reconfigured reality will undoubtedly make its way into the courtroom and before juries.
“So much of the justice system relies on interpreting evidence and deciding how much weight to give it,” Grossman says. “And we’re now moving into a world where not only can we no longer rely on our senses to do that, but we may need experts, and that changes the cost. And for the judge, that creates delays and adds a whole new layer.”

Is the most efficient solution technical or legal?
The legal world is grappling with how to handle deepfake evidence, with procedural solutions currently getting public attention while technical solutions develop more slowly. Here are three approaches to handling evidence suspected of being a deepfake in the legal system.
1. Technical experts
Digital forensic experts use machine-learning capabilities in AI-based detection systems to assess the authenticity of digital media. Deepfake videos with audio are the most difficult to identify because of the human tendency to overlook small discrepancies in a video and focus on the main idea.
Digital forensic experts can apply multimodal analysis to examine multiple data sources and combine techniques. These capabilities range from artifact detection, to frame-by-frame analysis and blink analysis, to luminance gradient analysis and pixel error analysis. After conducting the necessary analyses, the expert can render an opinion, based on any irregularities found, as to whether the evidence is authentic or has been altered.
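To make one of those techniques concrete, here is a minimal sketch of luminance gradient analysis over video frames, assuming Python with the OpenCV (cv2) and NumPy packages installed. The scoring and threshold are illustrative only, not a forensic-grade detector.

```python
# A minimal sketch of luminance gradient analysis, one of the techniques
# mentioned above. Assumes OpenCV (cv2) and NumPy are installed; the
# scoring and threshold are illustrative, not forensic-grade.
import cv2
import numpy as np

def luminance_gradient_scores(video_path: str) -> list[float]:
    """Return a per-frame score summarizing luminance gradient energy.

    Spliced or generated regions often show gradient statistics that differ
    from the camera's native imaging characteristics; abrupt jumps in this
    score between frames can flag spans worth closer inspection.
    """
    cap = cv2.VideoCapture(video_path)
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)  # horizontal luminance gradient
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)  # vertical luminance gradient
        scores.append(float(np.mean(np.hypot(gx, gy))))
    cap.release()
    return scores

def flag_anomalous_frames(scores: list[float], z_thresh: float = 3.0) -> list[int]:
    """Flag frames whose gradient energy deviates sharply from the video's norm."""
    arr = np.array(scores)
    z = np.abs(arr - arr.mean()) / (arr.std() + 1e-9)
    return [i for i, v in enumerate(z) if v > z_thresh]
```

A real examination would combine this with the other techniques the experts describe; a single statistic like this only points to frames that merit a closer look.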
Hiring a digital forensic expert can cost from a few hundred dollars for hourly consulting to several thousand dollars per project. For high-profile cases with significant legal implications, fees can be much greater.
Jerry Bui of Texas-based Right Forensics is also a consultant to Interpol. He tells us: “Deepfakes force us to confront an uncomfortable truth: Seeing is no longer believing. As forensic experts, we’re not just authenticating evidence—we’re trying to safeguard the integrity of the justice system in an era where digital manipulation can rewrite reality.”

2. Court rules
U.S. courts are slowly moving to address deepfake evidence. At the Nov. 8, 2024, meeting of the Advisory Committee on Evidence Rules, a committee of the Judicial Conference of the United States, the committee considered proposed Rule 901(c), authored by Grossman and Judge Paul Grimm, a retired federal judge and professor at the Duke University School of Law.
The rule, if adopted, would govern “potentially fabricated or altered electronic evidence,” reading: “If a party challenging the authenticity of computer-generated or other electronic evidence demonstrates to the court that a jury reasonably could find that the evidence has been altered or fabricated, in whole or in part, by artificial intelligence, the evidence is admissible only if the proponent demonstrates that its probative value outweighs its prejudicial effect on the party challenging the evidence.”
Grimm and Grossman’s proposed rule places burdens on the challenging and the offering parties, as well as the courts, helping to reduce the risk of exposing juries to deepfakes. It is one of several being considered by various judicial committees. Some experts think that no changes are needed to the rules of evidence.
Given the speed at which deepfake technology evolves and improves, changes to rules or procedures risk becoming outdated before they take effect. In the meantime, courts will decide individual challenges as they arise.
3. Procedural approaches
For the foreseeable future, courts will use the existing rules and hold hearings on challenged evidence. Judges will have to agree to scrutinize digital evidence, putting the burden on litigants to prove the legitimacy of the evidence in question rather than placing the onus on the judge to decide whether the evidence is genuine or a deepfake, admissible or not.
One of the critical issues that arises when considering the legal impact of deepfakes is cost—who pays to prove whether evidence is real or fake?
“This becomes an access-to-justice issue,” says Rebecca Delfino, the associate dean of clinical programs and experiential learning at the Loyola Law School at Loyola Marymount University. “In a perfect world, it would be taken care of in a criminal case. If the government wants to prove an audio-visual image is of the defendant robbing a bank, and the defendant claims it’s a deepfake, the government should have to pay for expert analysis because the burden of proof is on the prosecution. That may or may not happen; it depends on the available resources.
“But in the civil context, it’s going to be a significant problem,” Delfino continues. “Even for a simple example—such as the expression in a photograph being digitally altered from a frown to a smile—the individual will need to retain some type of expert.”
In family court, with its many pro se litigants, yet another reality exists: A photo showing bruises could change someone’s life. Is it real or a deepfake? Who pays the expert to analyze it?
It’s just one of a multitude of questions that remain to be answered about how the legal world will adapt to a rising tide of deepfake evidence. We may not be at the end of reality, but deepfakes are definitely going to rock the legal world as we know it.
Practice tips
- Look for items of evidence that are too good or too damaging to be true.
- Deepfakes in social media tend to be video, audio and pictures. You can easily get automated optical character recognition to find text in pictures. You can also get machine transcription of audio and video. Each of these yields searchable text that you can use as a starting point to uncover suspicious material (see the sketch after this list).
- Plan your deposition or trial so that you have your exhibit lists ready earlier than you do now. Implore the court to require your adversaries to do the same.
- Evaluate the exhibits, paying close attention to those with audio, video or picture formats. Interview your witnesses, and challenge suspicious content. Don’t wait until the day of testimony.
- Be prepared to engage a computer forensic examiner to evaluate the evidentiary quality of an item suspected of being a deepfake.
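For the text-extraction tip above, here is a minimal sketch assuming Python with the pytesseract package (which requires the Tesseract OCR engine) and the openai-whisper package (which requires ffmpeg). The file names are hypothetical placeholders.

```python
# A minimal sketch of turning pictures, audio and video into searchable
# text, per the practice tip above. Assumes pytesseract and openai-whisper
# are installed, along with Tesseract and ffmpeg; file paths are
# illustrative placeholders, not real exhibits.
from PIL import Image
import pytesseract
import whisper

def text_from_image(image_path: str) -> str:
    """Run OCR over a picture so any embedded text becomes searchable."""
    return pytesseract.image_to_string(Image.open(image_path))

def text_from_audio_or_video(media_path: str) -> str:
    """Transcribe speech from an audio or video file into searchable text."""
    model = whisper.load_model("base")  # small general-purpose model; cache it in real use
    return model.transcribe(media_path)["text"]

if __name__ == "__main__":
    # Build a searchable starting point for reviewing suspicious exhibits.
    print(text_from_image("exhibit_photo.png"))
    print(text_from_audio_or_video("exhibit_clip.mp4"))
```

The output is only a starting point for review: searchable text helps surface suspicious material quickly, but authenticating it still calls for the forensic techniques described earlier.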
See also:
Is the legal system ready for AI-generated deepfake videos?
Chuck Kellner is a strategic discovery adviser at Everlaw. Kellner has worked as an expert on e-discovery protocols, proportionality and cost of e-discovery, findings on computer forensic examination, and requirements for defensible search and review.
Mind Your Business is a series of columns written by lawyers, legal professionals and others within the legal industry. The purpose of these columns is to offer practical guidance for attorneys on how to run their practices, provide information about the latest trends in legal technology and how it can help lawyers work more efficiently, and strategies for building a thriving business.
Interested in contributing a column? Send a query to [email protected].
This column reflects the opinions of the author and not necessarily the views of the ABA Journal—or the American Bar Association.