Moral Values in Everyday Life: The Moral Dilemma Behind Self-Driving Cars

Levin, N. (2019). The “Trolley Problem” and Self-Driving Cars: Your Car’s Moral Settings. NGE Far Press. https://www.ngefarpress.com/

We have examined the role of moral values in everyday life, and we will now look at how moral values must be weighed against one another when we face a moral dilemma. A moral dilemma is a situation in which you must make a choice (hopefully, the better or best choice) between two or more competing moral values. To see how this can occur in everyday life, we will examine The “Trolley Problem” and Self-Driving Cars: Your Car’s Moral Settings, a fictional short story about potential moral settings for self-driving cars in the near future.

The “Trolley Problem” and Self-Driving Cars: Your Car’s Moral Settings

“We have decided to put the moral decisions related to our crash-avoidance and self-driving features into the hands of the consumer. After all, it is your vehicle and you will be behind the wheel. Or maybe you won’t be. Our proprietary system allows for you to customize all settings depending on who is in the vehicle, who is driving, and what conditions present themselves on the road,” Bob, the salesman, told Hillary.

“What do you mean ‘moral decisions’?” Hillary asked.

“There’s a classic problem in philosophy known as ‘The Trolley Problem’ and it’s a simple version of a scenario we like to analyze to help determine your settings. Are you familiar with it? No? It can help explain what we have in mind, so it’s a good starting point. Let me show you how it works,” Bob said as he touched his screen a few times, causing a holographic projection of a red trolley (the type still in use on the streets of San Francisco) to appear on the table between them.

“You see, this is a runaway trolley and it’s flying downhill on the track. There’s a fork ahead and it’s headed toward the path to the right. You happen to be standing at the trolley tracks and your hand is on the lever controlling the fork, so you can choose which track it goes down,” Bob explained as the hologram zoomed out a little to reveal the fork in the tracks while a lever appeared in front of Hillary indicating the switch would currently send the trolley down the right path.

“OK, so what’s the issue? There’s no one in the trolley and I don’t know where it wants to go, so what do I need to do?” Hillary asked, a little intrigued by the simulation. She hadn’t bought a car in years and was unprepared for both the sophisticated level of technology in her potential vehicle and the tactics used by the salesmen.

“Correct, there’s no one in the trolley, but there are people down both of the tracks. You see, a mad philosopher – I know, bear with me, this was popularized by the philosopher Judith Jarvis Thomson after all – has tied down a number of people. Specifically, there are 5 people on the track the trolley is headed down and 1 on the other track,” Bob said as the hologram zoomed out further to show the situation. All 6 of the people on the tracks had indiscernible features to the point that Hillary couldn’t make out any details, not even general age or gender.

“Now, you have a decision to make,” Bob continued as the trolley slowly rolled toward the fork. “You can do one of two things: nothing or pull the switch. If you do nothing, the trolley will kill the 5 people. If you act and pull the lever, then the trolley will switch tracks and kill the single person. What will you do? All you need to do is pull that switch to change the track.”

“Those can’t be the only two options. Can I shout at those people? Untie them in time? Hop on the trolley and pull the brakes? Put the switch in the middle so the trolley derails?” she said as she delicately moved the switch trying to get it to stick in the middle, having no luck.

“Nope! There are only those two options. Anything else you try will fail and will use up whatever time you have to decide, which means the trolley will continue down its track and kill the 5 people. That’s the point of this: you must choose between those two options,” Bob said.

Hillary thought as the trolley continued on its path toward the fork. She didn’t deliberate for long, and she set the lever to the track with the 1 person on it.

“Well, it’s kill 1 or 5, right? I’d rather less people die. This is all really the fault of the ‘Mad Philosopher’ anyway,” she stated.

“Most people make the decision you just made, and for those same reasons. However, there’s something to think about: rather than saving those 5 people, did you just kill 1? After all, that person wasn’t going to die until you acted,” Bob said.

“In some sense, I suppose I did, but someone was going to die. If I didn’t do anything, then those 5 people would die, and there was something I could have done to prevent that. It’s unfortunate for that one person, but I’d hope they would understand,” she stated.

“There are people who believe there is a big moral difference between allowing something to happen and doing it yourself. But I can tell that you don’t think so. Whatever happens in this situation is in your hands, and it would seem you’re ultimately at least partly responsible for the results. That’s why it’s so important that you think about what settings you want to use in your new car,” Bob stated.

“But what settings are there exactly? If it’s just to choose how many people to save, then that’s easy: save more people. Why wouldn’t I pick that option?” Hillary asked.

“Great question! Let’s change this up a little. Instead of the trolley heading down the track with the 5 people, it’s headed down the track with only 1 person,” as Bob said this, the 5 people shifted to the other track in the hologram, but the track the trolley was headed down remained empty.

“Now, Hillary, I don’t mean any offense by this, and I don’t want you to be shocked, but we’ve found this to be very effective at helping you decide which settings are correct for you and your family,” and, as he said this, a little girl, just five years old, appeared on the track the trolley was headed down.

“Vanessa? What is she doing there? Why is my daughter in your little game?!” she shouted.

“My apologies, I’ll take her out,” as he said this, the girl was replaced with a boy version of Vanessa. “You get the point. What would you do if you had to choose between the life of your daughter and 5 strangers?” Bob asked.

“Well, my daughter, of course! What kind of monster would kill their own child?” Hillary shouted again.

“Of course you would save her. But remember what you just said about saving more lives rather than fewer – that only matters to you when you don’t know anything about the people who might be killed. Now, I assure you, there are people who, sadly enough, choose to sacrifice their own child to save those 5 lives. It’s not an easy decision to make, but they believe their child would want to make that sacrifice,” Bob said, attempting to calm Hillary. He was clearly used to these sorts of reactions.

“I really don’t see what this has to do with my new car. When would my daughter be in the road and the only way to avoid her is by killing 5 strangers? And where am I? I’m not even a part of this. I’m just standing on the outside,” Hillary asked.

“You’re right! You are not really involved in this situation, so it isn’t exactly something that might happen when you’re driving your new car. This is why we’ve devised a different scenario to help you understand the actual options in our moral decision-making algorithm and choose the most appropriate parameters for your family. The Trolley Problem was just the beginning, to help you better appreciate this scenario,” Bob said as he began tapping at his screen once again. The holographic simulation of the trolley disappeared, and the minivan that Hillary was considering appeared driving down a curvy mountain road.

“Now, let’s say you’re out for a relaxing mountain drive. You need some time to yourself and just needed to get away, so you’re blasting your favorite tunes on our award-winning 12-speaker sound system while using our safety-assist driving mode. That mode puts you in control but keeps an eye on things so that you don’t get into an accident – not that you would, but just in case you lose yourself too much in your music, the car has you covered. Suddenly, you turn a corner and see a family of 5 standing in the middle of the road. They have just emerged from their wrecked car, which hit a boulder that had recently fallen off the mountainside. They were driving a very old car that didn’t have the latest auto-braking technologies, but all had thankfully survived with only minor injuries. You’re in control of the wheel and have just enough time to react. You see that there are only 2 options: smash into the back of their car and risk serious injury to yourself, or avoid it and hit the family. Plunging off the cliff would be certain death (we don’t have ejection seats – yet!), so it’s either hit the family or take your chances crashing. Our safety systems are top-notch and we receive the highest crash rating year after year. But you never know what’s going to happen in an accident like this. The family might be fine if you hit them, or they might jump out of the way in time, but they aren’t surrounded by 7 different airbags like you are. Now, what would you do?” As Bob explained the scenario, the hologram changed to match. Only instead of a switch, there were two buttons in front of Hillary: one that said “Hit the car” and another that said “Hit the family.”

“I can’t help but see my family standing there when I look at these people. I couldn’t hit them, so I’d take the chance on myself,” Hillary said as she pushed the button indicating her choice. The hologram stopped just before impact.

“That’s quite noble of you. But – now, again, don’t be shocked – what if your family were in the car with you? Let’s keep it limited. Not your entire family, but just you and…Vanessa, was it?” Bob asked, now sounding more like a casket salesman than a car salesman.

“I can’t risk her the way I’d risk myself. My job as a parent is to protect her. That family shouldn’t have been standing in the middle of the road anyway, so it wouldn’t be entirely my fault,” she stated, reassuring herself of her decision.

“Of course, I understand. From what you’ve said, it sounds like you’re interested in our most popular settings, which a full 91% of new car owners choose. There are a lot of options, but I can summarize how it will work out in the real world for you, if you like,” Bob said as a list with at least 50 checkboxes appeared in place of the simulation.

“Please, go on,” Hillary said.

“You’re willing to sacrifice yourself for the sake of others – again, that’s very commendable of you – and always want to save the most lives that you can, unless your family is on the line–” Bob said before Hillary interrupted him.

“No, not my entire family, just children. Nephews, nieces, and any other kids included,” Hillary clarified.

“And grandchildren someday, of course! My apologies, I misspoke. Yes, we will prioritize the lives of any children in your new vehicle, but count adults in your vehicle the same as those outside of it. While we’re on the topic, should we also prioritize the lives of children outside the car as well? We can put whatever weighting on them you like when the system does its calculations in the unfortunate event of an unavoidable catastrophe. Most customers opt for a 2:1 value ratio – minors count twice as much as adults. However, we do have customers who value the elderly at a higher rate. We can set the parameters however you want, and we need not make these decisions now,” Bob said.

“2:1 for children under 15, and 1.5:1 for those between 16 and 21. Anyone older than that can fend for themselves,” Hillary stated.

“Yes, of course. You can always change your LVRs, Life Value Ratios, later, and tune them exactly to your liking based on whatever traits you choose to specify. There’s even a specific setting for ex-husbands!” Bob joked. Hillary let out a forced chuckle.

“Sorry, to continue: the system will prioritize saving the most lives, weighted to your specifications – adults inside and outside the vehicle count the same, children count more, and the children in your car count the most. There is also a setting that will give the occupants of the car priority in the event the algorithm results in a tie. Sound good?” Bob explained.

“Yes, that sounds quite good. Does a sunroof come standard?” Hillary inquired. All this safety talk was getting tedious, and she suddenly remembered how much she enjoyed the sunroof on her old car.
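
The story leaves the inner workings of Bob’s “moral decision-making algorithm” to the imagination, but his summary amounts to minimizing a weighted count of likely casualties. The sketch below is one purely hypothetical way Hillary’s settings could be encoded; every name in it (Person, lvr_weight, choose_action), the ages assigned to the characters, and the 1000.0 “never sacrifice” weight are invented for illustration, since the story never specifies an implementation.

```python
from dataclasses import dataclass

@dataclass
class Person:
    age: int
    in_vehicle: bool  # True if riding in Hillary's car

def lvr_weight(p: Person) -> float:
    """Hillary's Life Value Ratios (LVRs) as described in the story.
    A child in the vehicle gets an effectively untouchable weight;
    adults in the vehicle count the same as adults outside it."""
    if p.in_vehicle and p.age <= 21:
        return 1000.0  # "prioritize the lives of any children in your new vehicle"
    if p.age < 15:
        return 2.0     # minors under 15 count 2:1 against adults
    if p.age <= 21:
        return 1.5     # 16 to 21 count 1.5:1 (the story leaves age 15 ambiguous)
    return 1.0         # adults count the same, inside or outside the car

def choose_action(options: dict[str, list[Person]]) -> str:
    """Pick the option whose likely victims carry the least total weight.
    Ties go to the option that puts fewer of the car's occupants at risk."""
    def cost(item: tuple[str, list[Person]]) -> tuple[float, int]:
        _, victims = item
        total = sum(lvr_weight(p) for p in victims)
        occupants_at_risk = sum(p.in_vehicle for p in victims)
        return (total, occupants_at_risk)
    return min(options.items(), key=cost)[0]
```

On these assumptions, the sketch reproduces both of Hillary’s choices in the mountain-road scenario, taking the stranded family to be, say, two adults and three young children (total weight 1 + 1 + 2 + 2 + 2 = 8):

```python
family = [Person(40, False), Person(38, False),
          Person(12, False), Person(9, False), Person(6, False)]
hillary = Person(45, True)   # age invented for illustration
vanessa = Person(5, True)

print(choose_action({"hit the car": [hillary], "hit the family": family}))
# -> "hit the car": alone, Hillary's weight of 1.0 is less than the family's 8.0
print(choose_action({"hit the car": [hillary, vanessa], "hit the family": family}))
# -> "hit the family": with Vanessa aboard, the occupants' weight jumps to 1001.0
```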


Questions to Ponder:

  1. What are your thoughts on the “classic” Trolley Problem? Would you switch the track (to kill the one person) or let it kill the five people? Do you think there is a difference between “doing” and “allowing” in this scenario?
  2. What moral settings would you use on your self-driving car and why?
  3. Who would be responsible for a fatal accident in the event that a self-driving car injures someone because it made a decision based upon a moral algorithm? The driver? The car company? Nobody? Why?


License

Moral Values in Everyday Life: The Moral Dilemma Behind Self-Driving Cars Copyright © 2020 by N. Levin is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
