Researchers discuss self-driving car knob settings for ethical choice

The finalized prototype of the Google self-driving car. Credit: Google

(Tech Xplore)—Learning what the technology in your driverless car of the future will do is not the most daunting task to think about. The really difficult question is the what-if: any scenario in which the car must choose between sacrificing the people in the car or the people in the street in an unavoidable, serious accident.

Abigail Beall in New Scientist said this is one of the major problems confronting manufacturers—the moral decisions.

If you are not in a driverless car, the answer lies with you and your ethics. If you are in an autonomous vehicle, though, the question appears to rest with your car, which has no ethics, only the work of its engineers. There is, however, another option being suggested for such questions, thanks to a team from Italy. They have considered a way to put the decision in the hands of the human passenger in the AV.

Their suggestion is in the form of a knob. Their paper is titled "The Ethical Knob: Ethically-Customisable Automated Vehicles and the Law." Their study was published in Artificial Intelligence and Law. The authors are Giuseppe Contissa, Francesca Lagioia and Giovanni Sartor.

"We wanted to explore what would happen if the control and the responsibility for a car's actions were given back to the driver," said Guiseppe Contissa at the University of Bologna in Italy, in New Scientist.

It has been argued, they noted, that self-driving cars should be equipped with pre-programmed approaches to the choice of what lives to sacrifice when losses are inevitable.

"Here we shall explore a different approach, namely, giving the user/passenger the task (and burden) of deciding what ethical approach should be taken by AVs in unavoidable accident scenarios. We thus assume that AVs are equipped with what we call an 'Ethical Knob.'"

"The knob tells an the value that the driver gives to his or her life relative to the lives of others," said Contissa in New Scientist. "The car would use this information to calculate the actions it will execute."

How would the knob provide an answer? It would offer settings. Egoistic would mean preference for the passenger(s); Altruistic, preference for third parties. A third setting, Impartial, would allow the car to act in a utilitarian way, giving equal importance to passenger(s) and third parties.
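To make the weighting idea concrete, here is a minimal sketch, assuming (purely for illustration) that the knob is a single value between 0 and 1 that the car uses to weight expected harm to passengers against expected harm to third parties. The names `Action` and `choose_action`, the `knob` parameter and the harm figures are hypothetical and are not taken from the paper's own formalization.

```python
# Illustrative sketch of a knob-weighted choice; not the authors' actual model.
from dataclasses import dataclass


@dataclass
class Action:
    name: str
    passenger_harm: float    # expected harm to the vehicle's occupants (hypothetical units)
    third_party_harm: float  # expected harm to people outside the vehicle


def choose_action(actions: list[Action], knob: float) -> Action:
    """Pick the action with the lowest weighted expected harm.

    knob = 0.0 -> full altruist  (only third parties count)
    knob = 0.5 -> impartial      (equal weight, utilitarian)
    knob = 1.0 -> full egoist    (only the passengers count)
    """
    def weighted_harm(a: Action) -> float:
        return knob * a.passenger_harm + (1.0 - knob) * a.third_party_harm

    return min(actions, key=weighted_harm)


if __name__ == "__main__":
    scenario = [
        Action("swerve into barrier", passenger_harm=0.8, third_party_harm=0.0),
        Action("stay on course", passenger_harm=0.0, third_party_harm=0.9),
    ]
    for setting in (0.0, 0.5, 1.0):
        print(setting, "->", choose_action(scenario, setting).name)
```

In this toy scenario the full-altruist and impartial settings swerve into the barrier, while the full-egoist setting stays on course; the point is only to show how a single dial could shift the trade-off, not to suggest how real harms would be estimated.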

Cheyenne MacDonald in the Daily Mail wrote: "With a so-called 'ethical knob,' riders could tune a car's settings so it operates as 'full altruist,' 'full egoist,' or 'impartial' – allowing it to decide based on the way you value your own life relative to others."

Beall, meanwhile, quoted Edmond Awad of the MIT Media Lab, a researcher on the Moral Machine project: "It is too early to decide whether this would be a good solution," Awad said, but Beall added that he welcomed a new idea in an otherwise thorny debate. Moral Machine describes itself as a platform for gathering a human perspective on moral decisions made by machine intelligence.

More information: The Ethical Knob: ethically-customisable automated vehicles and the law, Artificial Intelligence and Law, link.springer.com/article/10.1 … 07/s10506-017-9211-z

© 2017 Tech Xplore

