Legislation and Lobbying

The dangers of digital things: Self-driving cars steer proposed laws on robotics and automation

Melinda Giftos: “It’s difficult to foresee a standard under which drivers in fully automated cars and manufacturers would share liability.” Photograph by Husch Blackwell.

LIABILITY IN THE FAST LANE

Arruda of Ross Intelligence says cybersecurity is no more and no less of a problem for self-driving cars than it is for any other industry that relies on networks and computers.

“Of course, there should be standards, and I will be surprised if the government isn’t already working on them,” he says. “Cybersecurity will always be a threat. It is par for the course. It’s a constant battle between white hat and black hat programmers. It will never disappear as long as we have computers, but I don’t think anyone wants to go back to a time before them.”

When it comes to liability, meanwhile, the law seems unsettled. Perhaps that’s because lawyers, car manufacturers and designers aren’t sure who is at fault when an autonomous vehicle crashes.

Musk of Tesla made headlines in October 2016 when he declared his company would not be liable if one of its driverless cars got into a crash. When a Florida driver died in May of that year while using the Autopilot function, Tesla and investigators maintained that the driver was at fault because the system had warned him several times to put his hands back on the steering wheel.

In September, however, the National Transportation Safety Board announced “the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened.” A Tesla spokeswoman said the company would “evaluate [the NTSB’s] recommendations as we continue to evolve our technology.”

Volvo, on the other hand, said it would accept liability for its autonomous vehicles.

A 2016 RAND Corp. study predicted that manufacturers’ liability would increase as personal liability decreased. This might stifle innovation, the study argued, so some states might move to limit manufacturers’ liability or shield them from lawsuits. Michigan law, for example, provides that a manufacturer can’t be sued if a third party modified its driverless car without the manufacturer’s permission.

Melinda Giftos, managing partner at the Madison, Wisconsin, office of Husch Blackwell and co-leader of the firm’s internet of things group, says it’s difficult to foresee a standard under which drivers in fully automated cars and manufacturers would share liability. If the car does not operate properly, it is not reasonable to expect a human to make a split-second decision to correct it.

“There’s more hype right now for driverless cars than anything else,” says Giftos, adding that she was supposed to inspect one car in Madison, but it never arrived because it got into a crash. “There are a lot of kinks to work out. But I figure once they get it all figured out, the cars will be extremely safe.”

Walters argues the cost should be borne by whoever can fix the problem at the lowest cost. However, he also admits that there are no easy answers.

“If there’s an accident or worse, who is responsible? Is it the auto manufacturer? The software designers? The person who owns the car? The person who started up the car? It’s a simple question if you have someone sitting at the wheel and driving. It’s much harder when there’s no one. It could be a lot of people. It could be no one,” he says.

One of the few bills in Congress with bipartisan support involves driverless cars. Late last July, the House’s Committee on Energy and Commerce unanimously approved a bill, sponsored by two Republicans and two Democrats, that would allow states to retain jurisdiction over traditional responsibilities, including insurance and registration; allow manufacturers to deploy a limited number of vehicles that do not currently meet auto standards; and allow the federal government to prohibit states from passing certain rules relating to autonomous vehicles. In September, the full House approved the bill on a voice vote.

The following month, the Senate’s Committee on Commerce, Science and Transportation approved a bipartisan bill that would waive traditional automobile regulations, such as those governing a steering wheel or brake pedals, in the interest of speeding up deployment of driverless cars. The bill, which has not been taken up by the full Senate, does not apply to trucks or buses.

Manufacturers can take a cue from their counterparts in the medical-device industry, says Melinda Giftos of Husch Blackwell. Shutterstock.

NOT JUST CARS

Outside transportation, automation could play a larger role in our everyday lives. The International Bar Association has estimated that about one-third of graduate-level jobs around the world eventually will be performed by robots.

“The faster the process of the division of labor and the more single working or process steps can be described in detail, the sooner employees can be replaced by intelligent algorithms,” the IBA wrote in its April 2017 report. “Individual jobs will disappear completely, and new types of jobs will come into being.” The report predicted that the use of machines will obviate the need for outsourcing or offshoring, and production costs will decrease.

The IBA reported that governments would have to decide whether and how to protect human workers. One possibility is for governments to declare certain sectors, such as child care or certain types of weapons systems, off-limits to robots. Governments might also “introduce a kind of ‘human quota’ in any sector,” and decide “whether it intends to introduce a ‘made by humans’ label or tax the use of machines.”

Tully of Akerman says many areas of the law will require updating. For example, artificial intelligence already can make predictions about a wide variety of things, including suggesting routes on GPS devices; recommending music, movies and books based on individuals’ likes and dislikes; and evaluating the strength of a potential lawsuit. As predictive tools start to make decisions regarding hiring and firing, Tully predicts there will be an uptick in discrimination lawsuits.

“Whether or not these suits are successful will depend on what the law says, as well as what kinds of data the AI is looking at,” Tully says. “When things change that rapidly, we tend to relate things back to what we’re familiar with.”

The legal industry, meanwhile, has already skirmished with do-it-yourself legal services providers such as LegalZoom and Rocket Lawyer, accusing them of the unauthorized practice of law. Tully predicts the increased use of legal chatbots will open yet another theater in that ongoing war. “There will always be a contingency that doesn’t want technology,” he says.

Meanwhile, Stephen Reynolds, a partner at Ice Miller in its Indianapolis office, says there is an existing federal framework for regulating the internet of things. For example, companies or individuals using big data and machines to make decisions about creditworthiness, employment or housing still have to comply with federal anti-discrimination laws.

As for the myriad smart devices, appliances, services and goods, Reynolds argues that the Federal Trade Commission has broad authority to regulate these devices, and there might not be a need for additional legislation.

Giftos of Husch Blackwell thinks manufacturers can take a cue from their counterparts in the medical-device industry. Smart medical devices, such as Google’s smart contact lenses that measure glucose levels through a user’s tears and smartwatches that monitor people with sleep apnea, have access to troves of confidential health information that they are required by law to protect.

“The medical industry is definitely developing products that are more secure,” Giftos says. “They have to be more secure—that’s the nature of their industry. So they have the right framework in place. It’s the same with the financial industry.”

Dominique Shelton: The private sector is “already incentivized to have this conversation about privacy and cybersecurity.”

Dominique Shelton, a partner at Alston & Bird’s Los Angeles office, thinks the private sector in general has an overriding incentive to secure its data. “There’s already a focus from companies on privacy and cybersecurity in light of everything we’ve read in the news,” says Shelton, adding that international companies must also comply with newly enacted laws such as China’s cybersecurity law and the European Union’s General Data Protection Regulation. “When a company experiences catastrophic data loss, it leads to CEOs resigning. As such, they are already incentivized to have this conversation about privacy and cybersecurity.”

But Walters argues that there has to be more clarity in terms of what information smart devices track and what they share. “Smart speaker systems such as Google Home and Amazon Alexa or Echo, in particular, should disclose what they share,” he says. “In addition, we need stronger Fourth Amendment protections for information gathered by these devices.”

This issue has already reared its head, as prosecutors in Arkansas tried to retrieve data from an Amazon Echo found near a dead body. Amazon initially refused to turn over the data, citing privacy concerns. The defendant, James Bates, eventually agreed to the release of the information (and the case was dismissed in November).

Walters says that under the third-party doctrine, personal information loses legal protection once it is shared with a third party. “People would be surprised to know that information picked up and shared by these devices might be obtained without a search warrant,” he says. “With more sensors and more people trading off privacy for convenience, we’ll need a different kind of Fourth Amendment to keep people ‘secure in their persons, houses, papers and effects.’ ”


This article was published in the March 2018 issue of the ABA Journal with the title "The Dangers of Digi-things: Writing the laws for when driverless cars (or other computerized products) take a wrong turn."
