Should AI companies be held liable for deepfakes? Plaintiffs in new suits say yes

Amid the increasing spread of deepfakes, plaintiffs in recent litigation are shifting their focus to artificial intelligence companies.
In January, one plaintiff filed a class action lawsuit in the Northern District of California against X.AI alleging that its AI chatbot Grok created and shared highly sexualized deepfake images of women, Law.com reports.
According to the complaint, the plaintiff posted a fully clothed photo of herself, which Grok allegedly altered, without her consent, into an image of her wearing a revealing bikini.
Law.com, citing the complaint, also reports that Grok posted more than 4.4 million images to X, formerly known as Twitter, in just nine days from December 2025 to January 2026. Of those images, the complaint says about 41% to 65% contained sexualized content.
Among its allegations, the suit claims that X should be held liable for design and manufacturing defects that harmed the plaintiffs, as well as negligence for failing to use standard safety measures, Law.com reports.
In a similar suit recently filed in federal court in New York, Ashley St. Clair, the mother of one of Tesla CEO Elon Musk’s children, claims that Grok is “unreasonably dangerous as designed,” Law.com reports.
In the past, technology companies have successfully argued that they were hosting users’ content, not generating it, David Himelfarb, a managing partner at the Toronto-based firm Himelfarb Proszanski, told Law.com.
“That defense is falling apart,” Himelfarb said, adding that when plaintiffs claim that AI platforms are “unreasonably dangerous as designed,” they are using product liability principles in a novel way.
See also:
The End of Reality? How to combat deepfakes in our legal system

