Hit oncoming pedestrians or swerve and sacrifice yourself? The dilemma facing self-driving car software
Imagine riding in a self-driving car that does the driving for you. You feel comfortable, sipping your morning coffee and enjoying the view, free from the need to monitor the traffic. But this idyllic moment is interrupted by a situation in which the car's software must decide about your life.
Sitting inside a self-driving car, you turn a corner and are on course for an unavoidable collision. There are 10 people in front of the car and solid walls on both sides. Should the car swerve into one of the walls? In that scenario you would very likely be seriously injured or killed in the crash, but the 10 people would be saved from being hit by your car. Or should the car brake as hard as it can, keeping you safe even though it would very likely hit the people?
A research team from the Toulouse School of Economics conducted a study in which 75% of respondents said that the driver dying to save the group of people would be the moral solution. Most respondents would sacrifice themselves to reduce the number of victims. But what if there is a child in the back seat? Should the car still behave ‘morally’ and save the greater number of potential victims?
Scenarios like these create a grey area both in the moral debate and in the legislation governing self-driving cars. Researchers and psychologists have started to confront this, and they are clear on one point: before self-driving cars enter everyday use, people must know how the cars will ‘behave’ in such situations.