The Moral Choices of Self-Driving Vehicles

In March 2018, a self-driving Uber Volvo XC90 operating in autonomous mode struck and killed a woman named Elaine Herzberg in Tempe, Arizona. The crash raised a number of suddenly pressing questions about the testing of autonomous vehicles on public roads.

In fact, every time a driver slams on the brakes to avoid hitting a pedestrian crossing the road illegally, he or she is making a moral decision that shifts risk from the pedestrian to the people in the vehicle.

Self-driving cars might soon have to make such ethical judgments on their own. But according to a remarkably large survey on machine ethics, recently published in Nature, setting a universal moral code for these vehicles may be no easy task.