Will Self-Driving Car Sacrifice the Driver?

An AI moral dilemma for you today: how will automated cars perform when they have to make a quick, important decision? Does the passenger or driver of the car come first, not to mention the vehicle itself? Or will the car risk destroying itself and the people inside rather than endanger people or animals in the road? It may not even be people; it could be a large animal like a moose or a deer.

How should the designers of the vehicle set it up? There wouldn't be enough time for the human driver to step in and take over, so it would be a decision the car makes on its own. But based on what? What input would be programmed into the car?

An interesting question.

The moral dilemma is whether the car would calculate that it should swerve into a barrier or off the road to minimize casualties when there are more people in danger ahead than there are inside the vehicle.
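To make that calculation concrete, here is a minimal, purely hypothetical sketch of the kind of rule being described: a utilitarian comparison that simply counts how many people each possible maneuver puts at risk and picks the lowest total. Nothing here reflects how any real vehicle is actually programmed; the Scenario fields, the maneuver names, and the numbers are invented for illustration only.

```python
from dataclasses import dataclass


@dataclass
class Scenario:
    """One candidate maneuver and its estimated consequences (hypothetical)."""
    name: str
    people_at_risk_outside: int  # pedestrians or other road users endangered
    people_at_risk_inside: int   # occupants of this vehicle endangered


def choose_maneuver(scenarios: list[Scenario]) -> Scenario:
    """Pick the maneuver with the fewest total people at risk.

    A purely utilitarian rule: every person counts equally,
    whether inside or outside the vehicle.
    """
    return min(
        scenarios,
        key=lambda s: s.people_at_risk_outside + s.people_at_risk_inside,
    )


if __name__ == "__main__":
    # Invented example: a group of pedestrians ahead vs. a lone occupant.
    options = [
        Scenario("continue straight", people_at_risk_outside=3, people_at_risk_inside=0),
        Scenario("swerve into barrier", people_at_risk_outside=0, people_at_risk_inside=1),
    ]
    print(choose_maneuver(options).name)  # prints "swerve into barrier"
```

Under this toy rule the car sacrifices its own occupant, which is exactly the outcome many buyers would object to; weighting occupants more heavily, or refusing to swerve at all, are equally easy to encode, which is the whole point of the dilemma.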

Source: Will Self-Driving Car Sacrifice the Driver if a Group of Pedestrians Suddenly Appears?
