
Uber self-driving car kills Arizona pedestrian

29 Comments
By Glenn CHAPMAN

The requested article has expired, and is no longer available. Any related articles, and user comments are shown below.

© 2018 AFP



They shouldn’t be allowed on the road until there is evidence that they are as good as a human (and better than the 'back up' driver in this car).

The ultimate goal as far as safety is concerned is that they will be much better than humans and that they will significantly reduce accidents and fatalities.

-1 ( +2 / -3 )

They shouldn’t be allowed on the road until there is evidence that they are as good as a human (and better than the 'back up' driver in this car).

That evidence already exists. Self-driving cars have a much lower accident rate than human drivers.

The ultimate goal as far as safety is concerned is that they will be much better than humans and that they will significantly reduce accidents and fatalities.

They already do. But everyone is freaking out because of this one fatality, ignoring the thousands and thousands of other fatalities that have been a result of human drivers during the time that self-driving cars had no fatalities.

4 ( +7 / -3 )

There isn't enough data yet on a broad variety of driving situations and types of roads. Most of what there is comes from Western states, highway driving, and good weather. It isn't possible to assess thoroughly until self-driving cars can cover as many miles in a year, and in as many circumstances, as human drivers presently do.

2 ( +5 / -3 )

It isn't possible to assess thoroughly until self driving cars can cover as many miles in a year and in as many circumstances as human drivers presently do.

Nice catch-22 you've created here. They shouldn't be allowed on the roads until they've been shown to be safe, and they can't assess it as safe until they have been allowed on the roads.

4 ( +8 / -4 )

I've long maintained that the chief problem with these cars is going to be liability. Yes, people get killed by cars on a regular basis, but a driver is pretty much always responsible and typically is held accountable for incidents.

But what happens when it's automated? There is no driver, only a faceless company. Is the whole company liable? The vehicle manufacturer? The manufacturer of whatever sensor should have prevented it? The people who wrote the code? Seems like incidents will just lead to a whole lot of finger pointing.

10 ( +10 / -0 )

Nice catch-22 you've created here.

Before even talking about various levels of certification, there first needs to be a universal understanding of what safe driving operations and parameters actually are. At present, every company working on the software gets to set its own subjective guidelines. Without a more rigorous mathematical framework for determining fault, liability really is going to be arbitrarily determined by the legal department, or state regulators, or the engineers themselves.

Uber should definitely put their trials on hold, because something unexpectedly entering the vehicle’s path is pretty much the first emergency event that self-driving car engineers look at. The entire car has essentially been designed around preventing exactly this situation from occurring.

2 ( +3 / -1 )

Nice catch-22 you've created here. They shouldn't be allowed on the roads until they've been shown to be safe, and they can't assess it as safe until they have been allowed on the roads.

Except that's not what Lizz said. This is what Lizz said:

It isn't possible to assess thoroughly until self driving cars can cover as many miles in a year and in as many circumstances as human drivers presently do.

In other words, your claim that self-driving cars being safer is a proven fact is woefully erroneous. There hasn't been nearly enough testing done, as everyone knows.

-7 ( +0 / -7 )

Except that's not what Lizz said. This is what Lizz said:

It isn't possible to assess thoroughly until self driving cars can cover as many miles in a year and in as many circumstances as human drivers presently do.

You must not have read her comment previous to that:

They shouldn’t be allowed on the road until there is evidence that they are as good as a human (and better than the 'back up' driver in this car).

So yes, it is what she said. She first claimed they should not be allowed on the road without evidence, then she declared that they need to be on the road to get that evidence.

4 ( +5 / -1 )

But says nothing as to the reliability of driverless cars.

There have been tens or hundreds of thousands of road-hours of driving of self-driving cars. And one pedestrian fatality. Human drivers have caused many, many more in that time.

1 ( +3 / -2 )

There have been tens or hundreds of thousands of road-hours of driving of self-driving cars. And one pedestrian fatality.

This car also presumably passed basic safety tests before the accident. And self-driving cars do not have to be thoroughly tested for there to be evidence of safety. They do need to be engineered to respond safely to software malfunctions, near-crashes, loss of traction, and other risks of the road and of technology. Self-driving vehicles need to be able to do more than just avoid causing accidents. They also need to be programmed to take the kind of common-sense steps human drivers would take to prevent accidents, even minor fender-benders, and even when they're technically the fault of another driver.

3 ( +3 / -0 )

I am very interested to see what the cause of this crash was. Was the pedestrian jaywalking or crossing at a designated crosswalk, etc.? I find it almost impossible to believe this crash was caused by faulty car software and am pretty sure we will see some sort of human error at play, as always.

Either way, autonomous car testing should not be stopped for this.

2 ( +4 / -2 )

I am very interested to see what the cause of this crash was. 

My first thought was: what happened, and did the autonomous car make this accident more likely? I have my doubts, since there was a backup driver (assuming, of course, that the driver was doing his job).

The NTSB doesn't normally get involved with a common car accident, but they will in this case, to see if there is an inherent issue that can be fixed to improve safety.

My guess, the pedestrian would have died regardless of whether there was a driver or not.

0 ( +1 / -1 )

I just see a transition period with lots of negatives. There’s no way they can account for the millions of different situations a car would see. It will literally be about patching after each death.

Then you will have some group that will be able to exploit something no one thought of, like criminals or terrorists or some scammers who lock your car until you pay a ransom. Or something as simple as kids thinking they are playing a fun trick by modifying a stop sign so cars can’t see it.

That doesn’t mean it shouldn’t be done. I’m just saying it just seems to be happening a bit fast. A lot of it will have to be trial and error in the real world.

-2 ( +1 / -3 )

A report says that the woman was pushing a bicycle across the street, mid-street at night, and the car was travelling at 40 mph without accelerating.

Seems like an avoidable accident for the woman, the car and the backup driver.

2 ( +3 / -1 )

There have been tens or hundreds of thousands of road-hours of driving of self-driving cars. And one pedestrian fatality. Human drivers have caused many, many more in that time.

Only in selected areas, not in real-world situations. It’s still in the teething process, and much too early to draw any conclusions.

-2 ( +2 / -4 )

Ok, I'll agree with you on that. I don't think they should just suddenly be let out full-force either. An integrated approach over time is the best solution.

1 ( +1 / -0 )

Self-driving works well when the weather is nice. The sensors don't work very well in rain, snow, ice, fog or at night with any of those conditions.

Things that are obvious to a human driver are not obvious to computers. Eventually, the programmers will figure all this stuff out and make self-driving vehicles much safer, but in the meantime, there will be bicyclists in the road.

4 ( +4 / -0 )

The sensors don't work very well in rain, snow, ice, fog or at night with any of those conditions.

Interesting point - though, theoretically, they should if equipped with infrared sensors that can "see" through weather.

As I understand it, Waymo's trucks are to be used only on highways, with users setting up depots at highway exits. Trucks will self-drive the majority of the route, with a driver hopping in to take manual control over the "last mile."

0 ( +0 / -0 )

My biggest concern is the possibility of these self-driving cars being hacked and turned into killing machines...

2 ( +2 / -0 )

My biggest concern is the possibility of these self driving cars being hacked and turned into killing machines

That can happen with human-driven cars.

3 ( +4 / -1 )

Strangerland Today 12:38 pm JST

My biggest concern is the possibility of these self driving cars being hacked and turned into killing machines

That can happen with human-driven cars.

Do you mean like at Charlottesville? That wasn't a hack, that was the car being radicalized.

1 ( +1 / -0 )

Don’t break the law and you won’t get hit by a self-driving car. Pretty simple. The person was jaywalking. Uber won’t be held responsible.

-2 ( +0 / -2 )

The only way I’d absolve the car of wrongdoing is if the victim could not possibly have been seen beforehand, for example, jumping out from between parked trucks. Self-driving algorithms need to be able to deal with pedestrians and a lack of crosswalks at least as well as normal drivers, as most undeveloped residential or country roads don't even have them.

1 ( +1 / -0 )

BieberHole69 # Don’t break the law and you won’t get hit by a self-driving car. Pretty simple. The person was jaywalking. Uber won’t be held responsible.

By law, you have to take some degree of responsibility for the death of a pedestrian if you're behind the wheel and your car hits them, regardless of whether that person was jaywalking or not. The driver has to pay attention at all times on the road. The driver must brake or slow down upon seeing a pedestrian crossing the street. If you don't, then it's your fault.

California law: thus, for instance, in a case in which a pedestrian was crossing in the middle of a street, outside of a crosswalk, when hit by a vehicle driver who was not paying sufficient attention, there may be a finding that the plaintiff was 25% negligent and the driver was 75% negligent.
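For what it's worth, California's pure comparative negligence rule just scales the plaintiff's recovery by the driver's share of fault. A minimal sketch (the function name and dollar figure are illustrative, not from any statute):

```python
# Pure comparative negligence: the plaintiff's recoverable damages are
# reduced by their own percentage of fault.
def recoverable_damages(total_damages: float, plaintiff_fault_pct: float) -> float:
    """Return what the plaintiff can recover after deducting their fault share."""
    return total_damages * (1 - plaintiff_fault_pct / 100)

# Example: $100,000 in damages, pedestrian 25% negligent, driver 75%.
print(recoverable_damages(100_000, 25))  # 75000.0
```

So even a jaywalking pedestrian (or their estate) can still recover the driver's share.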

0 ( +0 / -0 )

There will need to be liability laws to protect the corporations from being bankrupted when the inevitable accidents occur. An accident that causes property damage, injury, or death can only be blamed on the manufacturer, not on a careless or malicious driver.

I am for this technology, as I think it will eventually lead to fewer accidents and fatalities when combined with road sensors. But when the inevitable happens, how can we justify holding the corporation harmless? As with any complex machine, someone will make a mistake that leads to tragedy. Commercial airlines are very safe in general, but disasters do occur - and they have highly trained pilots. I would imagine that self-driving cars would cause many times more deaths than planes, though likely far fewer than non-autonomous vehicles.

0 ( +0 / -0 )

Apparently, this accident was entirely the fault of the person who was killed, as they stepped out into traffic with no warning and no opportunity for the car to react. The car had cameras all over it, so there is video footage that can be reviewed.

2 ( +2 / -0 )

Apparently, this accident was entirely the fault of the person who was killed, as they stepped out into traffic with no warning and no opportunity for the car to react. 

Is that according to the police, or to Uber? Uber drivers are either employees covered under labor laws or independent contractors each owning his or her single-car business, depending on the state and country involved, so it isn't likely Uber is going to go bankrupt over these cases.

0 ( +0 / -0 )

I saw the video. Don't think any driver would have missed her.

She was in the second lane, crossing a clear street at night. No rain. No fog. The pedestrian should have easily seen the vehicle's lights - EASILY. She wasn't crossing at a crosswalk, and appears to have picked a dark part of the street to cross, wearing dark clothes. Twenty feet away was a well-lit area.

To me, it looked like the pedestrian didn't care if she was hit or not. She was crossing a large street like it was a college campus with 25,000 students walking around and only two vehicles on the road. That was not the situation.

In theory, an automated system should see better at night - night vision or IR cameras wouldn't miss someone crossing the street. Guess Uber doesn't use those.

I think I would have hit the woman too.

0 ( +0 / -0 )
