The judicial system needs to learn how to combat the threat of 'deepfake' evidence
Lincoln Mead. Photo by Saverio Truglia.
"Deepfakes" of politicians and celebrities have become commonplace, and the justice system needs to adapt in the face of fake content that often targets women.
That was the central theme of an ABA Techshow panel Friday at the Hyatt Regency Chicago. During the panel, titled “Red Pill vs. Blue Pill: How Deepfakes are Defining Digital Reality,” Sharon Nelson of the technology firm Sensei Enterprises spoke about the evidentiary challenges facing lawyers in an era when distinguishing what is real from what is fabricated has become increasingly difficult.
Artificial intelligence-generated synthetic videos, text and audio—also known as deepfakes—commonly target celebrities and politicians. But Nelson noted the technology might have the most significant impact on women in revenge porn cases.
“It’s a huge problem,” Nelson told an audience of about 50 people. “We’re really not doing a good job of corralling, nor are we doing a good job with the deepfakes made to attack an ex-lover and ex-wife.”
Pornographic content accounts for 96% of deepfake content online, according to a report by the company Deeptrace, which is developing tools to unmask fake content.
Virginia and California have enacted laws against deepfakes, Nelson said, but more needs to be done to ensure that people’s images are not used without their consent.
“We’ve got a lot of work to do with protecting the rights of women and men who have been subjected to some of this,” she said.
Nelson was joined on the panel by Lincoln Mead, a project manager with Canon Business Process Services. Mead said one of the challenges facing courts is discerning between what is real and what is not—and proving it.
Several tools are under development to unmask deepfakes, he said, including tools created or sponsored by the University of California at Berkeley and Google.
“These are tools or items that are being used to help evaluate the authenticity of a given video,” Mead said.
Criminals have seized on deepfakes to commit fraud, Nelson said. She cited the case of an energy company executive in the United Kingdom who paid 220,000 euros to someone he believed was the head of his parent company. It turned out that fraudsters had used deepfake audio to dupe him.
Nelson also described a custody battle in which manipulated audio was used to make it appear that a father had directed threatening language at another party in the dispute. The mother, Nelson said, had doctored the audio using a cheap app.
Deepfakes pose challenges for the legal community, Nelson said in an interview Thursday, and educating judges and the broader legal profession will take time. Even though deepfakes can often be funny, she said, they can also be upsetting or disturbing.
“I believe in civics, and I believe in the rule of law,” Nelson said during the panel. “And deepfakes, if nothing else, do threaten the rule of law, because people no longer know what the truth is.”