Judge discusses dangers of deepfakes at ABA Techshow
Judge Scott Schlegel was once a domestic violence prosecutor and saw all sorts of horrific things, but he’s never been more terrified than he is today. Speaking at the ABA Techshow on Thursday, Schlegel pressed play on a recording that immediately drew horrified looks from the room.
“Think carefully before you cross me,” came Schlegel’s threatening voice, loud and clear. “Do you think anyone will believe you over me? I’m the judge.”
Schlegel, a 2021 ABA Journal Legal Rebel, stopped the recording and looked at his audience, smiling. This, he explains, is why every attorney and every judge should fear deepfakes. Anyone with access to someone’s voice can use it to create compelling evidence, either for or against them. If his wife were upset with him, Schlegel says, she could ask AI to generate a threatening script in his voice, complete with details like his children’s and pets’ names.
“This is what scares me and keeps me up at night,” Schlegel says, explaining that deepfakes could be used in sexual harassment claims, wrongful termination claims and domestic violence claims. The list is endless, he adds.
Schlegel also presented a video of himself apparently talking on a busy street, then quickly admitted that it was a deepfake. On its own, he says, the fake video wasn’t very realistic. So he recorded the sound of a busy street and asked AI to combine the two, producing a seemingly perfect video. The more distraction a deepfake includes (in this case, the background noise), the more real it looks.
Few, if any, AI-based validation tools currently exist to verify authenticity, Schlegel says, and the audio recordings and videos are so compelling that even Katy Perry’s mother fell for a fake photo of her own daughter.
It’s an issue that attorneys are aware of but don’t know how to stop.
“I worry about this possibility overwhelming my colleagues, who are older,” says Sharon Silberman-Hummels, a Maine attorney who works mostly with older attorneys. “One guy withdrew from a case because he didn’t know how to e-file. Their heads are going to explode.”
So what’s an attorney to do?
Schlegel recommends asking specific questions when presented with a video, photo or voice recording: When did you take it? How? Do you still have the device?
If any of these answers are unknown or appear to be falsified, Schlegel says to pass on the case, or at least on that evidence.
AI errors are even affecting medical malpractice, as many hospitals now use AI to transcribe patients’ stories. It happened, Schlegel says, to his wife. After a doctor’s visit, she reviewed her medical records on a whim, only to notice that the summary of her story was incorrect: The AI had hallucinated a few details. Errors like that could affect insurance claims and health outcomes going forward.
From a legal perspective, Schlegel notes, such errors create doubt about the veracity of the evidence. “How much weight can we give to medical records?” he says.
Still, deepfake technology has a few positive uses, Schlegel says. It’s great for informational videos, training videos and similar tools.
“This is the brave new world,” he says, shuddering.
See also:
Is the legal system ready for AI-generated deepfake videos?
The End of Reality? How to combat deepfakes in our legal system