Not long ago, self-driving cars were a science-fiction fantasy, the stuff of the 1980s TV show “Knight Rider” or of “Transformers”. Yet in the past two decades, autonomous cars have advanced significantly. Like all new technology, however, they raise serious moral questions that must be addressed.
There is no easy answer to these questions, since morals are not shared uniformly across all cultures. After discussing a few scenarios from the MIT Moral Machine, an online “platform for gathering a human perspective on moral decisions made by machine intelligence,” two MVHS students and one teacher shared their opinions on how an autonomous car should act in various ethical dilemmas. For example, junior Ditto Rajpal believes that the car should always choose the option that kills the fewest people.
“Saving the most amount of people would be in the best interest,” Rajpal said. “But there’s always going to be some sacrifice.”
While this option may seem obvious, the situation becomes more complicated when the same number of people are at risk in either scenario, no matter what the car does. Is there a difference if one group is made up of elderly people and the other of college-aged adults? For some, the decision is easy; others believe there are many more factors to consider than the pedestrians’ ages alone.
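To make the tension concrete, the rule Rajpal describes can be written out as a few lines of code. The sketch below is a hypothetical illustration, not any manufacturer’s actual logic; the function name, outcome labels, and numbers are invented for this example. It also shows exactly where the rule breaks down: a simple head count cannot resolve a tie.

```python
# Hypothetical sketch of the "save the most people" rule Rajpal describes.
# Illustration only, not any real vehicle's decision logic.

def choose_outcome(outcomes):
    """Return the outcome(s) putting the fewest people at risk.

    `outcomes` is a list of (label, people_at_risk) pairs. Ties, the
    hard cases posed by the Moral Machine, are left unresolved: every
    tied option is returned, because a head count alone cannot decide.
    """
    fewest = min(count for _, count in outcomes)
    return [label for label, count in outcomes if count == fewest]

# Swerving endangers two pedestrians; staying the course endangers five.
print(choose_outcome([("swerve", 2), ("stay", 5)]))  # ['swerve']

# A tie: the count cannot choose between the elderly and the young.
print(choose_outcome([("swerve", 3), ("stay", 3)]))  # ['swerve', 'stay']
```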
This dissension of ideals makes it challenging for society to reach a consensus on how to handle these situations. However, English teacher David Clarke thinks that a communal agreement must be reached, rather than leaving the decision to the autonomous cars’ manufacturers.
“Whether or not the decision should be the old people or a woman pushing a baby carriage or something like that, those are decisions that we should be able to make collectively,” Clarke said. “I don’t think they necessarily should be made by a private company if those machines are traveling on a public roadway. That ought to be a decision made by the society.”
Another scenario on the Moral Machine asks what the car should do if one of the pedestrians or riders in question is an animal. This raises a whole new set of arguments and counterarguments, and the only way to support a position is through one’s own moral and ethical values. Sophomore Rukmini Banerjee believes that no one has the right to prioritize one form of life over another.
“I don’t care if it’s an animal life or a human life,” Banerjee said. “You’re basically saying the life of the cat isn’t worth as much as the life of the humans. I think life is life no matter what and I won’t choose to prioritize one over the other.”
With so many conflicting beliefs, our society may never truly agree on how to program these self-driving cars. But autonomous cars are already starting to roam the streets, as seen with Waymo, Google’s self-driving car project, and it won’t be long before they become a regular commodity, regardless of society’s moral opinions.
“I think part of the fear that people have of this is not even a technological fear,” Clarke said. “I think it’s a fear of waking up and understanding that resources are limited, that lives are at stake, and that there are going to be negative consequences no matter what.”