Lawyers warned of AI pitfalls, cybersecurity attacks and deepfake threats
If lawyers and their clients make use of artificial intelligence, they should consider the ethical and legal implications and protect against the emerging threat of deepfakes and cybersecurity attacks, legal experts said Monday at ABA Techshow.
Brig. Gen. R. Patrick Huston and attorney Natalie Pierce, a partner with Gunderson Dettmer Stough Villeneuve Franklin & Hachigian, led the session titled “Ethical AI: Playbook for a Rapidly Changing World.” From the outset, Huston and Pierce said artificial intelligence, or AI, could make attorneys’ lives easier and widen access to justice.
“AI can sift through terabytes of information in just minutes,” said Huston, assistant judge advocate general for the Army, which has over 5,000 attorneys. “It does this all without getting bored or complaining.”
Pierce cited a LawGeex study in which AI reviewed contracts for 30 known issues in seconds with 95% accuracy. Lawyers, working for an average of 90 minutes, had an 85% accuracy rate, she said.
“It’s not going to replace lawyers, but it can outperform us in a number of given tasks,” Pierce said of AI. “The secret is to quickly adopt technologies but make sure we’re understanding the risks.”
To guard against some common pitfalls, Pierce said lawyers should be mindful that AI can also rely on datasets that include biases. Huston said AI “gets better and improves over time,” but lawyers should advise clients to “build in periodic checks and balances” to ensure their use of the technology is ethical and legal.
“This requires us to stay informed about the basic risks, benefits of AI and understand how it works conceptually,” Huston said.
The general said lawyers should guard against ransomware attacks and corporate espionage and theft by making sure they have advanced cybersecurity software and a cyber response plan. He also suggested that attorneys take a closer look at their insurance coverage.
“What we’re seeing is some insurers out there are denying coverage for cyberattacks, essentially claiming that cyberattacks are acts of war,” Huston said.
The panelists also talked about the growing threat of deepfakes: video and audio manipulated so that one person can be made to resemble another. Deepfakes are often associated with entertainment, politics and pornography, but experts are concerned about their potential for mischief in the judicial system.
“Here’s my prediction: coming soon to a courtroom near you, deepfake evidence that’s in both civil and criminal cases,” Huston said, adding that authentication technology could help prevent and unmask deepfake evidence.