If A Deadly Crash Is Unavoidable, Who Should A Driverless Car Save?

It's a moral dilemma that's proving to be a real sticking point.

In a deadly crash, driverless cars should sacrifice the few to save the many, according to the majority of people questioned in a survey about autonomous driving.

The poll, carried out by Dr Iyad Rahwan from the Massachusetts Institute of Technology, showed some truly conflicting views around the concept of driverless cars.

Google has been trialling driverless cars in California for over a year now. (Photo: Eric Risberg/AP)

In a scenario where a crash was unavoidable, the public argued that the driver should be sacrificed if it would result in more lives being saved.

Yet many of those same respondents said they would never want to travel in such a car themselves.

Dr Rahwan explains: “Most people want to live in a world where cars will minimise casualties, but everybody wants their own car to protect them at all costs.”

Autonomous vehicles are advancing at such a rate that they could eliminate up to 90 per cent of traffic accidents, but for many the ethical cost could be too high.

Autonomous vehicles would be the first real-world example of humans handing over life or death decision making to a non-organic entity.

Current 'self-driving' cars, such as Tesla's Model S or even Google's fully autonomous vehicle, are designed to assist a driver in making these decisions rather than taking control away from them entirely.

Dr Rahwan's team investigated the moral dilemmas associated with self-driving cars by questioning 2,000 people across six online surveys.

The results, published in the journal Science, revealed a fundamental conflict of opinion.

While people put public safety first as a general rule, they did not want to risk their own lives or those of their loved ones in driverless cars programmed to make sacrifices.


One survey found that 76 per cent of those questioned thought it would be more moral for an autonomous vehicle to sacrifice one passenger rather than kill 10 pedestrians.

At the same time, there was a strong reluctance to own or use autonomous vehicles programmed to avoid pedestrians at the expense of their own occupants.

One question asked respondents to rate the morality of a driverless car capable of crashing and killing its own passenger to save 10 pedestrians.

The rating dropped by a third when people considered the possibility of being the sacrificial victim.

Participants were also strongly opposed to the idea of government regulation of driverless cars to ensure they are programmed with utilitarian principles - or the “greater good” - in mind.

Writing in the same journal, psychologist Professor Joshua Greene, from Harvard University, said the design of ethical autonomous machines was “one of the thorniest challenges in artificial intelligence today”.

He added: “Life-and-death trade-offs are unpleasant, and no matter which ethical principles autonomous vehicles adopt, they will be open to compelling criticisms.

“Manufacturers of utilitarian cars will be criticised for their willingness to kill their own passengers. Manufacturers of cars that privilege their own passengers will be criticised for devaluing the lives of others and their willingness to cause additional deaths.”
