
Ethics dilemmas may hold back autonomous cars: study


The requested article has expired and is no longer available. Any related articles and user comments are shown below.


17 Comments

There is an old saying: You will die of old age if you are waiting on perfection.

3 ( +4 / -1 )

One thing I can't believe is that drivers will really give up their right to "be in a hurry". Or that the "impatient" gene will somehow be completely phased out of human DNA?

If automated cars become a reality, will humans all of a sudden be willing to go the same speed as everyone else? Not have the urge to pass a car? Not be annoyed at being stuck behind a bus? Not want to make a yellow light? Not be late to work?

I can't see a world where the aggressive driver will willingly give up their automotive freedoms and enjoy feeling stuck in a uniform moving line.

Can you override an automated car's computer in an emergency, or must you just sit and take it like a roller coaster, watching your possible death happen with no ability to stop it? Talk about horror.

2 ( +4 / -2 )

Difficult question... even more so if you take out the AI and put a person in its place... what is that person going to choose? Before requiring the AI to make a decision, humans should do the same exercise... all else being equal (two people, same sex, same age, same social/physical/psychological condition), the only difference being that one is inside the car, the other is outside it, and you are the car. The situation: you save one, but the other dies... which one do you choose?

If I had to make that decision (let's say I am the AI, or the human remotely controlling the car)... I would probably choose to save the life of the passenger(s), since it is my responsibility to "take them safely from point A to point B"... but that's just me.

-1 ( +1 / -2 )

To ask an autonomous vehicle to make life-and-death decisions regarding who should live or die is absolutely ludicrous and totally unnecessary. THIS IS A NON-ISSUE! Here are the reasons:

In the first place, all cars are required to have seat belts, air bags and a myriad of other safety features, which are in place to save the lives of drivers and passengers alike. As time goes by and autonomous cars become more prevalent, these features will be further enhanced and will save even more lives if the car is involved in an accident.

Unfortunately, pedestrians, bike riders, and animals do not have these protective features. So, in an accident with a car (autonomous or otherwise), they will undoubtedly suffer catastrophic injury and/or death if struck.

There is no justifiable reason, moral or otherwise, for a vehicle to decide that there is a choice to be made between hitting another living being and crashing into an inanimate object or going off the road. You are the one surrounded by the car and its safety features; they are not!

When we get in a car as a driver or passenger we readily accept that something bad may happen. This is our conscious decision, and yes, we may be in an accident. But in accepting this decision we must also recognize that we do not have the right to consciously take away the rights of others. Just because someone decided to walk to the store, ride their bike or walk their dog on the sidewalk or crosswalk does not give us, or the car, the right to decide their fate. Our decision was made when we turned the ignition. So take your lumps, hit that concrete bunker or go off the road, but do so with the knowledge that it was your free wheel that made the decision to drive.

-3 ( +1 / -4 )

Why do there have to be clear and concise rules on the ethical actions of autonomous vehicles? We don't have clear and concise rules on the ethical actions of humans!

3 ( +4 / -1 )

so programmers are now in charge of your life. FFS... get rid of the programmers... we didn't need this solution to a problem that wasn't there.

AI should have been nothing more than a driving assistant, not your boss

The only people who need ethics training are the programmers, who are about to put millions out of work just because they can.

-6 ( +2 / -8 )

So if you program a device to kill people, that is not murder?

Who are these experts on ethics who are advising the developers?

Are they not subject to law?

0 ( +2 / -2 )

It's ludicrous that people would rather choose a solution that will result in more deaths (human drivers) than one that will result in significantly fewer (self-driving cars), simply because they would rather someone die at a human's hand than at a machine's.

Can't see the forest for the trees.

-1 ( +2 / -3 )

sf2k, Jun. 27, 2016 - 12:16PM JST: "so programmers are now in charge of your life. FFS... get rid of the programmers... we didn't need this solution to a problem that wasn't there."

I propose to you a thought experiment:

You're walking in the meadow when you hear a rumbling in the distance and you see a herd of buffalo stampeding straight toward you. You have three options. A. Get out of the way, do your best to minimize the injury you suffer, and adapt to the world in their wake. B. Using your knowledge of the buffalo, try to climb on top of them and steer them so that they aren't destructive and they are maybe even useful. C. Hold up your hands and say, "Hey guys, you need to stop."

Which of these options is the best option for you is going to depend on your knowledge and skill set, but only one of them is very definitely going to get you killed 100% of the time. Let's take a few minutes to contemplate which one that is and how this thought experiment relates to automation in the workplace.

0 ( +1 / -1 )

"Not all crashes will be avoided, though, and some crashes will require AVs to make difficult ethical decisions in cases that involve unavoidable harm,”

If everyone used autonomous vehicles and they were limited to 5 MPH (8 km/h), then they would reduce harm to almost zero.

But that is not a realistic option. Who would want to go that slow? Even 15 MPH would be too slow for most, if not everyone.

But if the limit were 50 MPH on motorways and 20 MPH in urban areas, survival rates would be higher, especially for pedestrians.
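
To put rough numbers on that claim, here is a minimal Python sketch (the constant-deceleration model and the 0.7 friction coefficient are my own illustrative assumptions, not figures from the article):

# Braking distance and impact energy both scale with the square of speed,
# which is why lower urban limits matter so much for pedestrians.
# Assumptions (mine): constant deceleration, dry-road friction mu = 0.7.
MU, G = 0.7, 9.81          # friction coefficient, gravity (m/s^2)
MPH_TO_MS = 0.44704        # miles per hour -> metres per second

def braking_distance_m(speed_mph):
    v = speed_mph * MPH_TO_MS
    return v * v / (2 * MU * G)

for mph in (5, 15, 20, 50):
    print(f"{mph:>2} mph -> ~{braking_distance_m(mph):.1f} m to stop")
# 5 mph stops in well under a metre; 50 mph needs roughly 36 m, and
# carries (50/20)^2 = 6.25 times the kinetic energy of 20 mph.

The exact figures depend on road and tyres; the square law is the point.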

But there will be those who do not like the idea of not being in control of a device that could fail and kill. Nothing is foolproof.

You go round a bend and come face to face with an idiot (a manual driver) overtaking someone. The dangerous overtaker's car has 2+ people in it, the car being overtaken has 1, and your car has 1. Would your car swerve into the one-person car to avoid hitting the overtaker with 2? Two lives harmed instead of three, and the dangerous overtaker then does a disappearing act.
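
Reduced to arithmetic, that dilemma is just a minimization over occupant counts. A toy Python sketch (occupant numbers taken from the scenario above; nothing here resembles what a real AV actually computes):

# Toy harm-minimization for the overtaking scenario described above.
options = {
    "stay in lane, hit the overtaker head-on": 2 + 1,  # their 2+ occupants plus you
    "swerve into the one-person car":          1 + 1,  # their 1 occupant plus you
}
choice = min(options, key=options.get)
print(f"Minimum-harm option: {choice} ({options[choice]} lives at risk)")

Written out this way, the unsettling part is plain: the "ethical" choice is one line of code, and the driver who caused it all does not appear in the calculation.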

0 ( +3 / -3 )

Why do there have to be clear and concise rules on the ethical actions of autonomous vehicles? We don't have clear and concise rules on the ethical actions of humans!

Autonomous vehicles need rules for what to do in any circumstance - they don't care about ethics, but they must have rules to operate. The rules are provided by programmers. When an object moves in front of a car, there are options such as brake, swerve, or brake and swerve, with various intermediate degrees such as brake sharply, brake gently, etc. So imagine an old lady steps in front of your car 15 meters ahead when you are doing 50 kph. You brake and are about to swerve, but then see a bunch of kids to your side. Generally, we humans don't have enough time to make a rational decision in such circumstances, so we are perhaps excused any ethical concerns. But we do need to program an autonomous car.
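
As a sketch of what such "rules to operate" might look like, here is a deliberately simplistic decision function in Python (all thresholds are invented for illustration; a real AV stack uses probabilistic planning, not a lookup like this):

# Simplistic rule table for an obstacle appearing ahead.
def choose_action(distance_m, speed_kph, side_clear):
    # Rough rule-of-thumb emergency stopping distance in metres,
    # ignoring reaction time: (speed/10)^2 / 2.
    stopping_est = (speed_kph / 10) ** 2 / 2
    if distance_m > 2 * stopping_est:
        return "brake gently"
    if distance_m > stopping_est:
        return "brake sharply"
    if side_clear:
        return "brake and swerve"
    return "brake hard, stay in lane"   # no safe escape path

# The old-lady example above: 15 m ahead at 50 kph, kids to the side.
print(choose_action(distance_m=15, speed_kph=50, side_clear=False))

The point is not the thresholds but that every branch, including the ethically loaded ones, has to be written down by someone in advance.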

let alone account for different cultures with various moral attitudes regarding life-life tradeoffs

I've read that attitudes to such moral dilemmas are fairly universal, irrespective of culture, religion, etc. (e.g. most people think it wrong to actively kill someone in order to save ten others, but may think it morally acceptable to allow one person to die in order to save ten others.) Does anyone have further information?

0 ( +0 / -0 )

More trouble than they are worth, probably expensive to buy and to service or fix when the electronics go wrong. Maybe good for old folks in remote communities.

Nice theoretical brain exercises though...

0 ( +0 / -0 )

katsu78

still not your problem

-2 ( +0 / -2 )

If a human driver swerves to avoid an oncoming car in the described scenario, is that person responsible for what happens? Isn't the responsibility with the oncoming car that has caused the accident?

Replace the human driver with a machine and the oncoming car is still in the wrong.

In a situation two seconds from an accident, very few humans will be able to react in a manner that ensures the best outcome (whatever that may be) in every situation, so why should machines be held to that standard?

1 ( +1 / -0 )

I can't see a world where the aggressive driver will willingly give up their automotive freedoms and enjoy feeling stuck in a uniform moving line.

Yes, there will be people who still prefer to drive their own vehicles, but cost will probably dictate which way they go. Driverless vehicles will be less accident-prone than human drivers, so insurance premiums will probably become very expensive for human drivers compared to driverless vehicles. As for the moral debate, I think it would be immoral not to introduce a technology that will drastically reduce the road toll and save lives.

0 ( +0 / -0 )

I think it would be immoral not to introduce a technology that will drastically reduce the road toll and save lives.

I agree with that, but am not convinced driverless vehicles are the answer.

0 ( +0 / -0 )

I agree with that, but am not convinced driverless vehicles are the answer.

Even though they will reduce the road toll and save lives?

-2 ( +0 / -2 )
