Uber self-driving car kills Arizona pedestrian
By Glenn CHAPMAN, SAN FRANCISCO
© 2018 AFP

The requested article has expired, and is no longer available. Any related articles and user comments are shown below.
Lizz
They shouldn’t be allowed on the road until there is evidence that they are as good as a human (and better than the 'back up' driver in this car).
The ultimate goal as far as safety is concerned is that they will be much better than humans and that they will significantly reduce accidents and fatalities.
Strangerland
That evidence already exists. Self-driving cars have a much lower accident rate than human drivers.
They already do. But everyone is freaking out because of this one fatality, ignoring the thousands and thousands of other fatalities that have been a result of human drivers during the time that self-driving cars had no fatalities.
Lizz
There isn't enough data yet on a broad variety of driving situations and types of roads. Most of what there is comes from Western states, highway driving and in good weather. It isn't possible to assess thoroughly until self driving cars can cover as many miles in a year and in as many circumstances as human drivers presently do.
Strangerland
Nice catch-22 you've created here. They shouldn't be allowed on the roads until they've been shown to be safe, and they can't assess it as safe until they have been allowed on the roads.
CrazyJoe
I've long maintained that the chief problem with these cars is going to be liability. Yes, people get killed by cars on a regular basis, but a driver is pretty much always responsible and typically is held accountable for incidents.
But what happens when it's automated? There is no driver, only a faceless company. Is the whole company liable? The vehicle manufacturer? The manufacturer of whatever sensor should have prevented it? The people who wrote the code? Seems like incidents will just lead to a whole lot of finger pointing.
Lizz
Nice catch-22 you've created here.
Before even talking about various levels of certification, there first needs to be a universal understanding of what safe driving operations and parameters actually are. At present, every company working on the software gets to set its own subjective guidelines. Without a more rigorous mathematical framework for determining fault, liability really is going to be arbitrarily determined by the legal department, or state regulators, or the engineers themselves. Uber should definitely put their trials on hold, because something unexpectedly entering the vehicle’s path is pretty much the first emergency event that self-driving car engineers look at. The entire car has essentially been designed around preventing exactly this situation from occurring.
clamenza
Except that's not what Lizz said. This is what Lizz said:
In other words, your claim that self-driving cars are safer is a proven fact is woefully erroneous. There hasn't been nearly enough testing done, as everyone knows.
Strangerland
You must not have read her comment previous to that:
So yes, it is what she said. She first claimed they should not be allowed on the road without evidence, then she declared that they need to be on the road to get that evidence.
Strangerland
There have been tens or hundreds of thousands of road-hours of driving of self-driving cars. And one pedestrian fatality. Human drivers have caused many, many more in that time.
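The per-mile comparison behind this argument can be sketched with some back-of-the-envelope numbers. Every figure below is an illustrative assumption, not a measured statistic: roughly 1.2 fatalities per 100 million vehicle-miles is the commonly cited US ballpark for human drivers, and the autonomous fleet mileage is a placeholder.

```python
# Rough fatality-rate comparison; every number here is an
# illustrative assumption, not a measured statistic.
HUMAN_FATALITIES_PER_100M_MILES = 1.2   # commonly cited US ballpark
AV_MILES_DRIVEN = 10_000_000            # placeholder fleet mileage
AV_FATALITIES = 1                       # the single fatality discussed above

human_rate = HUMAN_FATALITIES_PER_100M_MILES / 100_000_000
av_rate = AV_FATALITIES / AV_MILES_DRIVEN

print(f"human: {human_rate:.2e} fatalities/mile")
print(f"AV:    {av_rate:.2e} fatalities/mile")
```

Note that with these placeholder numbers the autonomous rate actually comes out higher, which is exactly why the sample-size objection in this thread matters: one fatality over a small number of miles can't settle the comparison either way.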
Lizz
There have been tens or hundreds of thousands of road-hours of driving of self-driving cars. And one pedestrian fatality.
This car also presumably passed basic safety tests before the accident. And self-driving cars do not have to be thoroughly tested for there to be evidence of safety. They do need to be engineered to respond safely to software malfunctions, near crashes, loss of traction and other risks of the road and of technology. Self-driving vehicles need to be able to do more than just avoid causing accidents. They also need to be programmed to take the kind of common-sense steps human drivers would take to prevent accidents, even minor fender benders, and even when the fault technically lies with another driver.
dcog9065
I am very interested to see what the cause of this crash was. Was the pedestrian jaywalking or crossing at a designated crosswalk, etc.? I find it almost impossible to believe this crash was caused by faulty car software and am pretty sure we will see some sort of human error at play, as always.
Either way, autonomous car testing should not be stopped for this
viking68
My first thought was what happened, and did the autonomous car make this accident more likely? I have my doubts, since there was a backup driver (assuming, of course, that the driver was doing his job).
The NTSB doesn't normally get involved with a common car accident, but they will in this case to see if there is an inherent issue that can be fixed to improve the safety.
My guess, the pedestrian would have died regardless of whether there was a driver or not.
SuperLib
I just see a transition period with lots of negatives. There’s no way they can account for the millions of different situations a car would see. It will literally be about patching after each death.
Then you will have some group that will be able to exploit something no one thought of, like criminals or terrorists or some scammers who lock your car until you pay a ransom. Or something as simple as kids thinking they are playing a fun trick by modifying a stop sign so cars can’t see it.
That doesn’t mean it shouldn’t be done. I’m just saying it just seems to be happening a bit fast. A lot of it will have to be trial and error in the real world.
viking68
A report says that the woman was crossing the street pushing a bicycle mid-street at night, and the car was travelling at 40mph without accelerating.
Seems like an avoidable accident for the woman, the car and the backup driver.
clamenza
Not in real-world situations in selected areas. It’s still in the teething process and much too early to draw any conclusions.
Strangerland
Ok, I'll agree with you on that. I don't think they should just suddenly be let out full-force either. An integrated approach over time is the best solution.
theFu
Self-driving works well when the weather is nice. The sensors don't work very well in rain, snow, ice, fog or at night with any of those conditions.
Things that are obvious to a human driver are not obvious to computers. Eventually, the programmers will figure all this stuff out and make self-driving vehicles much safer, but in the meantime, there will be bicyclists in the road.
Laguna
Interesting point - though, theoretically, they should if equipped with infrared sensors that can "see" through weather.
As I understand it, Waymo is to be used only on highways, with users setting up depots at highway exits. Trucks will self-drive the majority of the route, with a driver hopping in to take manual control over the "last mile."
bones
My biggest concern is the possibility of these self-driving cars being hacked and turned into killing machines...
Strangerland
That can happen with human-driven cars.
katsu78
Do you mean like at Charlottesville? That wasn't a hack, that was the car being radicalized.
BieberHole69
Don’t break the law and you won’t get hit by a self-driving car. Pretty simple. The person was jaywalking. Uber won’t be held responsible.
Lizz
The only way I’d absolve the car of wrongdoing is if the victim was completely impossible to see beforehand, for example by jumping out from between parked trucks. Self-driving algorithms need to deal with pedestrians and a lack of crosswalks at least as well as normal drivers do, since most undeveloped residential and country roads don't even have crosswalks.
Chop Chop
By law, you have to take some degree of responsibility for the death of a pedestrian if you're behind the wheel and your car hits that person, regardless of whether they were jaywalking or not. The driver has to pay attention to the road at all times when behind the wheel, and must brake or slow down upon seeing a pedestrian crossing the street. If you don't, then it's your fault.
California law: thus, for instance, in a case in which a pedestrian was crossing in the middle of a street outside of a crosswalk when hit by a vehicle driver who was not paying sufficient attention, there may be a finding that the plaintiff was 25% negligent and the driver was 75% negligent.
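Under California's pure comparative negligence rule, percentages like those translate directly into a damages split: the plaintiff's recovery is reduced by their own share of fault. A minimal sketch (the $100,000 damages figure is hypothetical, chosen only for illustration):

```python
def comparative_negligence_award(total_damages, plaintiff_fault):
    """Reduce the plaintiff's recovery by their own share of fault
    (pure comparative negligence, as in California)."""
    if not 0.0 <= plaintiff_fault <= 1.0:
        raise ValueError("fault share must be between 0 and 1")
    return total_damages * (1.0 - plaintiff_fault)

# Pedestrian found 25% negligent, driver 75% negligent,
# with hypothetical total damages of $100,000:
award = comparative_negligence_award(100_000, 0.25)
print(award)  # 75000.0
```

So in the example above, the 25%-negligent pedestrian would still recover 75% of the damages from the driver.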
Wolfpack
There will need to be liability laws to protect the corporations from being bankrupted when the inevitable accidents occur. With no driver, an accident that causes property damage, injury, or death can only be blamed on the manufacturer, not on a careless or malicious driver.
I am for this technology as I think it will eventually lead to fewer accidents and fatalities when combined with road sensors. But when the inevitable happens how can we justify holding the corporation unharmed? As with any complex machine someone will make a mistake that leads to tragedy. Commercial airlines are very safe in general but disasters do occur - and they have highly trained pilots. I would imagine that self driving cars would cause many times more deaths than planes though likely much less than with non-autonomous vehicles.
Strangerland
Apparently, this accident was entirely the fault of the person who was killed, as they stepped out into traffic with no warning and no opportunity for the car to react. The car had cameras all over it, so there is video footage available to review.
Lizz
Apparently, this accident was entirely the fault of the person who was killed, as they stepped out into traffic with no warning and no opportunity for the car to react.
Is that according to the police, or Uber? Uber drivers are either employees covered under labor laws or independent contractors each running his or her own single-car business, depending on the state and country involved, so it isn't likely Uber is going to go bankrupt over these cases.
theFu
I saw the video. Don't think any driver would have missed her.
She was in a 2nd lane crossing a clear street at night. No rain. No fog. The pedestrian should have easily seen the vehicle's lights - EASILY. She was not crossing at a crosswalk, and appears to have picked a dark part of the street to cross while wearing dark clothes. 20 ft away was a well-lit area.
To me, it looked like the pedestrian didn't care if she was hit or not. She was crossing a large street like it was a college campus with 25K students all walking around with 2 vehicles on the road. That was not the situation.
In theory, an automated system should see better at night - night vision or IR cameras wouldn't miss someone crossing the street. Guess Uber doesn't use those.
I think I would have hit the woman too.