Self-Driving Car Dilemma: Artificial Intelligence May Be Forced To Choose Between Saving Passengers Or Pedestrians


Self-driving cars may be the new direction for the automotive industry, even though realistically it could be decades before we start seeing them all around us. That isn’t stopping some from pondering a moral dilemma these cars may pose.

When you take control away from the driver, you give the motor vehicle the power to make judgment calls, and in certain scenarios the power to choose your fate. This is the very thing physicist Stephen Hawking warned us about, as he believes artificial intelligence could lead to a very real Terminator scenario.

Of course, that scenario depends on the machine becoming self-aware and deciding that humans are expendable.

The scenario that has recently come to public attention is one in which the car "sees" more potential fatalities outside the vehicle than inside it. The self-driving car then has to choose between saving its passengers and saving those pedestrians. For some, the answer is easy, summed up by Spock (Leonard Nimoy) in the second Star Trek film.

“The needs of the many outweigh the needs of the few or the one.”

According to a recent survey, that is the most popular opinion: minimizing casualties should be the priority. Although the question is based on a rare scenario, the ethical answer is still up for debate.
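To make that trade-off concrete, here is a minimal sketch of what a purely utilitarian decision rule might look like in code. Everything in it is hypothetical: real autonomous vehicles expose no such function, and the maneuver names and casualty estimates are invented purely for illustration.

```python
# A hypothetical illustration of a purely utilitarian decision rule:
# given the estimated casualties of each available maneuver, pick the
# one that minimizes total expected loss of life. The names and
# structure here are invented for the sake of the example.

from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    expected_passenger_casualties: float
    expected_pedestrian_casualties: float

    @property
    def total_casualties(self) -> float:
        # The utilitarian rule weighs every life equally,
        # whether inside or outside the vehicle.
        return (self.expected_passenger_casualties
                + self.expected_pedestrian_casualties)


def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Return the maneuver with the fewest total expected casualties."""
    return min(options, key=lambda m: m.total_casualties)


if __name__ == "__main__":
    options = [
        Maneuver("stay the course",
                 expected_passenger_casualties=0.0,
                 expected_pedestrian_casualties=3.0),
        Maneuver("swerve into barrier",
                 expected_passenger_casualties=1.0,
                 expected_pedestrian_casualties=0.0),
    ]
    # The utilitarian rule swerves, sacrificing one passenger
    # to spare the three pedestrians.
    print(choose_maneuver(options).name)
```

Note that weighting passenger casualties more heavily than pedestrian casualties in that sum would flip the choice, which is exactly the self-protective policy the survey found people want for their own cars.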

Dr. Iyad Rahwan, from the Massachusetts Institute of Technology Media Lab, created the survey to see which side people stood on.

"Most people want to live in a world where cars will [minimize] casualties, but everybody wants their own car to protect them at all costs."

This is the central moral issue facing self-driving cars, and either choice carries a negative consequence. It may be one of the biggest reasons it could be decades before driverless cars go mainstream. As long as the driver is responsible and aware of their surroundings, it might be safer to keep control in human hands.

However, if a person has been drinking and has no option of having someone else drive, the self-driving car could be the lesser of two evils.

The technology is already in place in Maryland just outside Washington, D.C., with a 3D-printed car you can hail with a smartphone app. It hasn’t been in operation long enough to offer comprehensive statistics, but it does already exist.

Google has long been sending camera cars around the world to take photos for Google Street View, and its self-driving car project builds on similar detailed street-mapping technology.

Other manufacturers are working on driverless cars that can also fly, and that concept only raises the stakes further. Not only would the car control itself, but airspace restrictions would need to be in place to keep it from becoming a flying death trap. You can imagine the insurance costs for something like that would be astronomical.

Harvard University Professor Joshua Greene says there is no right or wrong answer to the moral question surrounding self-driving cars.

“Life-and-death trade-offs are unpleasant, and no matter which ethical principles autonomous vehicles adopt, they will be open to compelling criticisms.

“Manufacturers of utilitarian cars will be [criticized] for their willingness to kill their own passengers. Manufacturers of cars that privilege their own passengers will be [criticized] for devaluing the lives of others and their willingness to cause additional deaths.”

What do you think? Are the passengers more important than pedestrians if the driverless car has to choose between the two?

[Image via Chesky/Shutterstock.com]
