With the reality of a Google car just a few years away, people are looking forward to kicking back and letting the robotics handle the road rage. While the fantasy is a pleasant one, self-driving cars raise a philosophical dilemma.
GOOGLE SELF-DRIVING CAR CHOOSES TO KILL PEOPLE
A new research study indicates that passengers in a self-driving car are concerned with safety, but in a selfish way. They want the autonomous car to put the safety of its passengers above all else. If that means hitting a pedestrian instead of crashing into a gas station, then so be it. The future could be an interesting one. Imagine a Google car going to trial for vehicular manslaughter while the innocent passenger takes the stand as a witness. Who will be to blame in an autonomous future?
GOOGLE SELF-DRIVING CARS FACE MORALITY ISSUES
As autonomous cars come closer to reality, businesses are asking philosophical questions. It's possible manufacturers will create cars with various levels of programmed morality, depending on what the consumer wants. Maybe the government will even step in, mandating that all self-driving cars share the same value of protecting the greater good. However, who gets to decide what that greater good is? That seems like a question hard enough for humans to answer, let alone a machine.
TECHNOLOGICAL RABBIT HOLE OPENED BY GOOGLE SELF-DRIVING CARS
These researchers have opened up a complicated rabbit hole that will be hard to climb out of. It's going to take years to formulate answers, and one can expect a great deal of controversy in the process. Let's say Google creates an array of autonomous cars with various morality settings. Since consumers get to choose these morals, will they become responsible for the car's actions? The U.S. military is dealing with the same problems: armed drones are on the brink of making their own killing decisions, and if a drone happens to make the wrong one, assigning blame will be a complex task. Until these questions are answered, autonomous technology remains in a philosophical limbo. Who would have thought the movie I, Robot would so accurately predict the future?