Should robots be civilly liable for damage they cause? EU may consider the idea
The European Union may need to revise its rules of civil liability to address damages caused by autonomous robots, according to a report endorsed last week by a committee of the European Parliament.
The report raises the idea of creating a legal status for robots as they become more sophisticated, the Wall Street Journal Law Blog reports. Under that approach, robots could be given “the status of electronic persons with specific rights and obligations, including that of making good any damage they may cause.”
The concept of “electronic personality” could be applied in cases where future robots make autonomous decisions or otherwise interact independently with third parties, the report says.
“The more autonomous robots are, the less they can be considered simple tools in the hands of other actors,” the report says. “This, in turn, makes the ordinary rules on liability insufficient and calls for new rules which focus on how a machine can be held—partly or entirely—responsible for its acts or omissions.”
The report also recommends that all robots have a kill switch to shut them down if necessary, according to BBC News. It further says robots should be programmed not to harm humans and to obey human orders unless those orders would conflict with the rule against harming humans, an approach inspired by science-fiction author Isaac Asimov’s Three Laws of Robotics.