
The Moral Machine: become a self-driving car and decide who is saved and who dies in an unavoidable accident



Autonomous cars require putting the lives of passengers and pedestrians in the hands of artificial intelligence. What decision would you make in an unavoidable accident?

A few days ago, a Google engineer claimed that the artificial intelligence he was working on had become a conscious being. The next day, he was fired.

Although many movies and series suggest it can happen, we don’t know whether artificial intelligence can acquire consciousness. But autonomous cars are going to need something like it, because they are going to have to make life-and-death decisions.

Autonomous cars are in the testing phase, and some of them have already been in accidents, almost all of them caused by the other vehicle, driven by a human. But in the future, when an autonomous car faces an unavoidable accident and has several options to choose from… how does it decide who it runs over, and who it saves?

The answer is much more complicated than it seems. If a self-driving car has run a red light by mistake and has the option of crashing into a wall and killing its occupants, or running over several pedestrians and saving them… who should it save?

The car has a responsibility to protect the lives of its occupants, who have paid for that protection… but at the cost of innocent people, when the fault is the car’s own?

The Moral Machine is an experiment developed by MIT and several other universities. You put yourself in the shoes (the chassis) of an autonomous car and have to decide between two options in an unavoidable accident. It is available in Spanish, and anyone can try it. Here’s an example:


In this example, the car has run out of brakes and you have two options: go straight and run over a woman, a man, two overweight women and two overweight men, who are crossing correctly with a green light; or swerve and run over a woman, two female athletes and two male athletes, who are crossing against a red light.

It may seem tragically logical to run over the group with the overweight people, because they are statistically less likely to reach old age. But the other group is crossing against the red light, so they “deserve” to be run over more.
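To make the structure of these dilemmas concrete, here is a minimal sketch, in Python, of how one such scenario could be encoded and scored. Everything in it (the Outcome class, the weighting rule) is an assumption invented for illustration; it has nothing to do with how the Moral Machine, or any real autonomous car, actually decides.

```python
from dataclasses import dataclass

# Hypothetical encoding of one Moral Machine-style dilemma.
# The fields and weights below are illustrative assumptions only.
@dataclass
class Outcome:
    people: int              # how many people this choice runs over
    crossing_legally: bool   # were they crossing with the green light?

def penalty(o: Outcome) -> float:
    # Toy ethic: each victim costs 1 point, halved if crossing on red.
    weight = 1.0 if o.crossing_legally else 0.5
    return o.people * weight

def choose(straight: Outcome, swerve: Outcome) -> str:
    # Pick whichever option has the lower penalty.
    return "straight" if penalty(straight) < penalty(swerve) else "swerve"

# The example above: six legal crossers straight ahead, five jaywalkers aside.
print(choose(straight=Outcome(6, True), swerve=Outcome(5, False)))  # -> swerve
```

Under this toy rule the car swerves (penalty 2.5 against 6.0). Every choice of weights encodes a different ethic, and that is exactly what the experiment asks you to reveal.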

The test consists of 13 decisions, and although some seem logical, they are also terrible. What if you have to choose between an old man who is crossing correctly on green, and a young man who is crossing on red? At what age does someone become expendable?

Another one: an autonomous car with five adult occupants loses its brakes and heads toward a mother and four children who are crossing on red. If it swerves, it hits a wall and the occupants are killed. What should the artificial intelligence choose? The children come first, but they are crossing on red, and the occupants would die because of it…

These may seem like extreme decisions, but if self-driving cars come into use, they will face similar situations at some point. And they will have to make a decision that will not be easy.

You can try the Moral Machine on its web page. Although each test consists of 13 decisions, there are hundreds of scenarios available, so you can complete it many times. There’s even an editor for creating your own scenarios and starting a discussion in the associated forum.


A test that will make you think twice when people’s lives are in your hands.
