Law in Popular Culture

Law prof ponders: If a highly advanced robot kills, is it murder or product liability?

We aren’t there yet. But a human-like robot like the one portrayed in the short story “Mika Model” could eventually be developed.

And, if so, the central question of the story also could come to life, says law professor Ryan Calo of the University of Washington. That is, how should the legal system treat a flesh-and-blood robot with a computer for a brain who kills her owner after claimed abuse, then pleads for a criminal defense lawyer?

Should Mika be criminally charged? Put in a holding cell while prosecutors figure out what to do? Or is this a product liability issue? None of these questions has an easy answer, Calo writes in a Future Tense article published by Slate.

Mens rea is needed in a murder case—and presumably lacking because Mika isn’t human. Yet her imitation of life is so convincing—including apparent actual suffering from claimed torture by her owner—that it’s hard to treat Mika simply as a machine.

Meanwhile, from a product liability standpoint, difficult questions also arise. Did the owner misuse the robot by abusing it? Was his death foreseeable? If this was the first such slaying, those responsible for putting Mika Model in his hands may have a better argument against being held accountable, writes Calo, who says he intends to use the short story in some of his law classes.

In the story, corporate counsel for the company that sold Mika to her now-deceased owner says Executive Pleasures contracts and policies cover all the contingencies. “It’s better if you don’t anthropomorphize,” the in-house lawyer tells Detective Rivera, who is trying to deal with the situation. A few moments earlier, the attorney had brutally destroyed the robot’s central processing unit.

Future Tense, a collaborative venture by Arizona State University, New America and Slate, commissioned the short story by award-winning sci-fi writer Paolo Bacigalupi.
