'Westworld' story lines connect with real-world legal issues
Evan Rachel Wood and James Marsden as “hosts” Dolores Abernathy and Teddy Flood on “Westworld.” Photo from Facebook.
A fictional theme park with a wild west theme is the setting for the HBO series Westworld, where guests pay to live out their fantasies. Those familiar with the darker side of the human condition won’t be surprised to learn that these fantasies often entail patrons killing as many of the amusement park hosts as they please.
You see, there are no recriminations, as the “hosts” are technologically advanced androids programmed to allow the high-paying guests to indulge in what would otherwise be criminal behavior outside of the amusement park. There’s a curious side note to the series though (no, not the murderous or pathological indulgences of the guests): Many of the issues explored throughout each episode have parallels to recent “real world” headline-news stories.
As hosts are killed and otherwise maimed and abused throughout the series, the writers and producers have the opportunity to delve into increasingly relevant notions regarding artificial intelligence and personal privacy. What are the legal implications if a host injures or kills a guest? How much personal data about their guests can the operators of the amusement park collect and retain?
ROBOTS HURTING PEOPLE: PEDESTRIAN KILLED BY AUTONOMOUS VEHICLE
The fictional terms of service that guests must accept upon entering Westworld include a clause protecting the company that owns the park from liability if and when a guest is injured or killed by a malfunctioning host. However, the intersection at Mill Avenue and Curry Road in Tempe, Arizona, is not part of a Westworld set. It’s a real intersection where a pedestrian was killed as she was attempting to cross the street. What makes this tragic event pertinent to a discussion about artificial intelligence and the legal rights and obligations of robots and androids, you ask? The machine that killed the pedestrian was a self-driving Uber vehicle.
The deadly collision happened a year after another autonomous vehicle being tested by Uber (also in Tempe) was involved in a collision with a separate automobile. Police determined the human driver of the other automobile was at fault in causing that accident, so the question of fault and responsibility when driverless vehicles injure people or cause damage did not arise. Still, it is sadly only a matter of time before a similar accident occurs, and sooner or later the liability question will have to be litigated.
Some testing of self-driving vehicles on public streets and highways involves a person in the vehicle who can take over in case of a system malfunction. That is not always the case, though, as many autonomous vehicles operate without a driver in the “backup” position. The ultimate goal is to take the human element away once the vehicles are produced and sold to the general public. This poses a potentially huge issue for personal injury lawyers, because the law of negligence is inherently dependent on a human element.
MAKING AUTOMAKERS LIABLE
The solution to holding someone responsible when a robotic vehicle causes injuries might be found in the law of product liability, though. I recently wrote about the popular NBC series “This Is Us” as it relates to liability. If the manufacturer of the slow cooker that killed Jack Pearson can potentially be held liable for selling a product that failed to operate safely, legislation could certainly make the manufacturer of an autonomous vehicle liable when accidents occur.
Some states have already enacted legislation making the manufacturers of autonomous vehicles liable for damages from accidents caused by such vehicles. This may be the complete opposite of the position taken by the owners of Westworld in their terms of service, but it is consistent with the position taken by some real-world manufacturers. Volvo, Mercedes-Benz and other makers of driverless vehicles have announced they will accept responsibility for damages caused by their vehicles when in autonomous operation.
And it makes sense, doesn’t it? If a company creates the intelligence contained in the chassis, shouldn’t that company be responsible for what it has created? It’s the classic “Dr. Frankenstein” argument: if you create something, you should be responsible for your creation. The pursuit of knowledge is dangerous, and those who take on that task must be willing to bear the consequences and ramifications of their creations.
DATA COLLECTION AND PERSONAL PRIVACY
A clause in the Westworld terms of service should also alert viewers to the recent news reports involving data mining and Facebook. Under the terms of service in the HBO series, the fictional company operating the park, Delos Destinations, has the right to collect, control, and use guest saliva, blood and other “secretions and excretions.” If that sounds fairly ominous, it’s because it most certainly is.
Facebook’s terms of service (thankfully) pale in comparison to those of Delos as far as what the company can and can’t do with the personal data it collects on its users. Regardless, recent news reports about Cambridge Analytica collecting data on Facebook users are definitely disturbing. Cambridge Analytica was apparently able to circumvent Facebook’s controls to collect and use data on 50 million users of the social media website. That data was used to launch a campaign designed to influence users’ thoughts and perceptions about issues and candidates in the 2016 U.S. presidential election.
Facebook, to its credit, has announced sweeping changes in its operations in hopes of providing better protection for the information it collects and retains on its users. But Facebook is not alone in collecting personal data from its users. According to a survey by the Pew Research Center, more than 80 percent of people who report using social media platforms are concerned about how their personal data is being used. Even in the face of these concerns, individuals still do not appear ready to give up their accounts: the same survey reveals that roughly 69 percent of adults in the U.S. use social media platforms on a regular basis.
So, how accurately have the writers and producers at HBO captured human nature? The analogy is frightening, to say the least. Many individuals use social media simply to connect with friends and family, but far too many people employ their accounts to find some solace, some escape from the constraints of reality. The enticement of living out one’s fantasies—the ability to disconnect from the real world and be someone else—makes guests at Westworld give away rights to their DNA and bodily fluids. They subject themselves to harm and potential death. It’s not a stretch to say the situation mirrors the lure of social media platforms and the willingness of people to share personal data even while harboring reservations about the platforms’ ability to protect that information—and potentially more.
Adam R. Banner is the founder and lead attorney at the Oklahoma Legal Group, a criminal defense law firm in Oklahoma City. Mr. Banner’s practice focuses solely on state and federal criminal defense. He represents the accused against allegations of sex crimes, violent crimes, drug crimes, and white collar crimes.
The study of law isn’t for everyone, yet its practice and procedure seems to permeate pop culture at an increasing rate. This column is about the intersection of law and pop culture in an attempt to separate the real from the ridiculous.