The dangers of digital things: Self-driving cars steer proposed laws on robotics and automation
In September 2016, the NHTSA, with the Department of Transportation, released a series of guidelines for best practices in developing, deploying and testing driverless cars, as well as a model state policy and a list of current and potential regulations. The guidelines were updated one year later, with the safety regulations simplified from a 15-point assessment to 12 points, according to Forbes. Another major update is planned for this year.
“Our basic premise is to stay out of the car,” Steudle says. “Let’s step back and let NHTSA do its job. When your car is certified to be able to go out on the road in the U.S., then it’s OK for Michigan.”
To that end, he is excited about the possibilities that autonomous vehicles present. In the spring, the University of Michigan plans to use driverless shuttles to take up to 15 students to and from class. Additionally, the state has approved a plan for driverless trucks to platoon while on public highways. Under current state law, large trucks have to stay at least 500 feet away from one another. With driverless technology, trucks can drive closer together, accelerating and braking at the same time as the lead vehicle, allowing them to save fuel and move faster.
“We’ve been doing testing with the U.S. Army on Interstate 69,” Steudle says. “If a trucking company wants to do this, they have to send us a plan and we’ll either approve it or not.”
According to Steudle, areas in which the state’s laws are silent—including liability, insurance and cybersecurity standards—will be dealt with later. “The important thing is that we put a framework in place to enable things to move forward,” he says.
Cybersecurity, insurance and liability are obviously important issues that have to be resolved, Steudle says. But if Michigan legislators had waited to include them in the bills, they’d still be waiting for the finished law.
To that end, he says, the state created the Michigan Council on Future Mobility to address issues arising from the driverless car industry—with a mandate to come up with specific recommendations by March 31 of every year. The council consists of people from the automobile technology, insurance and legal sectors, Steudle says.
Besides the aforementioned areas, they will consider how to apply the technology to help the elderly and people who have disabilities. They’ll also consider how to train students at technical schools to fix and maintain autonomous vehicles, he says.
Similarly, California has relied on a framework of letting experts come up with rules and regulations on issues related to driverless cars following passage of its statute in 2012. As the home state of many Silicon Valley companies developing automated cars, California tasked its Department of Motor Vehicles with creating rules and regulations concerning the testing and deployment of such vehicles. Officials had hoped to have them approved by the end of 2017. At press time, the rules were still under consideration, with approval expected by the end of February.
The state seems content to defer some issues. On liability, Brian Soublet, deputy director and chief counsel at the California DMV, says it will be for the state courts to decide. “I don’t think you can lay out those intangibles in advance,” Soublet says. “Is it a design defect if the car runs a red light and there’s an accident? Is it on the owner? I think those are issues that will play out in court.”
As for cybersecurity, the California DMV’s proposed rules obligate manufacturers to meet industry best practices and require cars to be able to detect, respond to and alert the operator about cyberattacks and other breaches, giving the operator a chance to override the automated technology.
“The industry has best practices, and cybersecurity is something car manufacturers and the government are taking seriously,” Soublet says. “You want to make sure that, at the very least, they are following some form of regime to address how to handle spurious commands coming into the vehicle.”
Some states are operating in a legal gray area. Pennsylvania, for example, is a training ground for Uber’s collaboration with Carnegie Mellon to deploy autonomous vehicles throughout Pittsburgh. At press time, Pennsylvania had no statute speaking to the legality of driverless cars.
However, Roger Cohen, policy director at the Commonwealth of Pennsylvania Department of Transportation, says the state has long operated under the assumption that autonomous cars are allowed on public roadways—as long as a human driver is at the steering wheel ready to take over. PennDOT has taken the lead in promulgating policies relating to autonomous vehicles with the goal of their formal adoption into law.
“That policy was deemed to be a more effective tool for the public oversight of testing operations because of its ability to be flexible and nimble and rapid in responding to what are fast-moving, unpredictable, hard-to-anticipate new developments,” Cohen says.
As in Michigan, Cohen says time is of the essence, adding that although Pennsylvania’s regulatory structure serves an important purpose, it generally takes one to two years to process feedback and review the rules. “That was deemed to be ineffective for emerging technology,” Cohen says.
Instead, PennDOT has been freed up to develop policies while collaborating with a wealth of stakeholders—including academics, sister agencies, lawyers, technology companies and members of the automotive industry. Cohen says bills are pending in both state legislative houses, and he is optimistic that they’ll be passed.
“When it comes to car accidents, we must drive down the death rate toward zero, which is our goal,” Cohen says. “We have a technology that gives us our best chance to do that. I think there are real issues concerning data ownership, data privacy and cybersecurity. But there’s every reason to be optimistic.”
GOING THE WRONG WAY?
Shamla Naidoo, global chief information security officer at IBM, has nothing but praise for Michigan’s framework. Calling on other states to look to Michigan as a model, Naidoo says the state has a vested interest in ensuring that car manufacturers build cars correctly, and that they are compliant with the law the moment they leave the lot. Additionally, she says, the state has existing laws relating to hacking and liability that can easily be applied to autonomous cars.
“I don’t know how many other states have such laws,” Naidoo says. “They cover pretty much the entire range of driverless cars. It’s very clear to me that this state has thought things through pretty well. It’s very valuable to have that kind of framework for all states.”
Walters of Fastcase, however, thinks legislators are not focusing on the areas they should. “It’s not surprising that there’s a lot of action,” Walters says. “My concern is it’s the wrong action.” He says that while companies like Google, Tesla and GM convey a certain standard of quality, as more companies enter the space, the standards could be all over the map.
“The kinds of things we aren’t deciding are things like: How good should the vehicle’s software be before it gets on the road?” Walters says. Also, how should cars respond to law enforcement? “As far as I know, there is no law in any state that requires a driverless car to pull over for police.”
He also cites cybersecurity as a major area in which there hasn’t been enough focus. “These systems are very new and, ultimately, they’re just a network of computers,” Walters says. “They have all of the vulnerabilities that computers do.”
Walters says state officials’ hands might be tied when it comes to cybersecurity. Two proposed driverless car bills in Congress contain language pre-empting state standards for self-driving cars.
“It’s hard to imagine that state officials would have the expertise to regulate something as technical as cybersecurity for autonomous cars, at least not yet,” he says. “It’s a delicate balance. I wouldn’t want to see 51 different standards for cybersecurity; on the other hand, we could benefit from states creating best practices on security.”
Riehl of risk management company Stroz Friedberg agrees, adding he hasn’t seen any laws that address the security of software relating to automated vehicles. “The difficulty is that we don’t even know what the biggest risks are, much less how to protect against those risks,” Riehl says. “The law could be obsolete before the ink is even dry.”
This article was published in the March 2018 issue of the ABA Journal with the title "The Dangers of Digi-things: Writing the laws for when driverless cars (or other computerized products) take a wrong turn."