# "Self-driving" Uber kills Arizona woman



## phrelin (Jan 18, 2007)

From Uber self-driving car kills Arizona woman, realizing worst fears of the new tech or another of the many stories online at this moment you can learn that (a) the car had "a safety driver at the wheel" and (b) the woman "was crossing outside the designated crosswalk at about 10 p.m."

I'm not sure what the "safety driver" was doing other than _not_ his/her job.

I thought about posting this in the thread First Ground Up Driverless Vehicle To Be On Road in 2015 but decided it was just too long and too old.


----------



## James Long (Apr 17, 2003)

Unfortunate and expected. Hopefully the car collected enough data to show why the pedestrian was not avoided.


----------



## phrelin (Jan 18, 2007)

And we have another new case Tesla's Autopilot Was Involved in Another Deadly Car Crash in which we learn:

"Tesla now has another fatality to hang on its semi-autonomous driving system. The company just revealed that its Autopilot feature was turned on when a Model X SUV slammed into a concrete highway lane divider and burst into flames on the morning of Friday, March 23. The driver, Wei Huang, died shortly afterwards at the hospital.

"Based on data pulled from the wrecked car, Tesla says Huang should have had about five seconds, and 150 meters of unobstructed view of the concrete barrier, before the crash. Huang's hands were not detected on the wheel for six seconds prior to the impact. Earlier in the drive, he had been given multiple visual warnings and one audible warning to put his hands back on the wheel.

"Drivers need to be ready to grab the wheel if the lane markings disappear, or lanes split, which may have been a contributing factor in this crash. Systems like Autopilot have known weaknesses. The manual also warns that it may not see stationary objects, a shortcoming highlighted when a Tesla slammed into a stopped firetruck near Los Angeles in January. The systems are designed to discard radar data about things that aren't moving, to prevent false alarms for every overhead gantry or street-side trash can."

I guess we have to understand that auto manufacturers can't protect drivers from their own sad errors.
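For scale, the figures quoted above (about 150 meters of unobstructed view over about five seconds) imply the Model X was moving at roughly highway speed. A quick back-of-the-envelope check (the variable names are mine, not from the article):

```python
# Sanity check on the quoted Tesla figures: 150 m of unobstructed
# view covered in about 5 s works out to roughly highway speed.
distance_m = 150
time_s = 5
speed_mps = distance_m / time_s            # 30 m/s
speed_mph = speed_mps * 3600 / 1609.344    # about 67 MPH
```

In other words, the driver had roughly five seconds at about 67 MPH to react, which is consistent with the article's claim of multiple earlier warnings.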


----------



## jimmie57 (Jun 26, 2010)

phrelin said:


> And we have another new case Tesla's Autopilot Was Involved in Another Deadly Car Crash in which we learn:
> 
> "Tesla now has another fatality to hang on its semi-autonomous driving system. The company just revealed that its Autopilot feature was turned on when a Model X SUV slammed into a concrete highway lane divider and burst into flames on the morning of Friday, March 23. The driver, Wei Huang, died shortly afterwards at the hospital.
> 
> ...


I wonder if they could cause the car to slow to a stop when the driver does not heed a number of warnings to take the wheel?


----------



## James Long (Apr 17, 2003)

Meanwhile: Uber reaches settlement with family of woman killed by self-driving car

Dashcam video of deadly self-driving Uber crash released
(I'd like to see the computer readout from the detection systems to see if there was any hint of the impending crash.)


----------



## scooper (Apr 22, 2002)

"Self driving" cars scare me almost as much as cell phones being used (especially for non-talking activities) while driving. Doubly so since I ride a motorcycle as well.


----------



## James Long (Apr 17, 2003)

scooper said:


> "Self driving" cars scare me almost as much as cell phones being used (especially for non-talking activities) while driving. Doubly so since I ride a motorcycle as well.


I agree. I have been tailgated (close to a .5 second following distance) by people who spend more time looking down than looking ahead. I see that in the mirror and it is distracting. One should not need to drive the car behind them ... one should only need to worry about the road ahead and maintaining a safe following distance and awareness of when to slow or stop their own car. But with a distracted tailgater I am looking for escape routes (If I need to stop should I leave my lane so the distracted tailgater has a place to go without hitting me?) Tailgater or not, when I stop in a line of traffic I normally leave a car length in front of me in case I get hit from behind (and only close up that space when several cars are stopped behind me).

The lane drifters are not much better but they usually can be avoided. A driver going the opposite direction drifting into opposing traffic can be a problem but one does not have to deal with it for more than a few seconds. Following a distracted driver who does not have lane control can be scary. I usually give them a little extra room so if they hit someone or something I can avoid running into their accident. When I was younger at least the bad drivers were paying attention to the road and not some handheld device. I would not mind seeing those bad drivers have their licenses taken away. Perhaps the car could drive better than them (even at the current accident rates).

There is a lot of promise to self-driving. A distant future where the cars talk to each other would lead to more efficient driving. But the aggressive drivers probably would not like it if their car passively let another car "take their spot" to create better traffic flow for the majority of vehicles. But on my daily commute I am constantly wondering if the commute would be better or worse if the Internet of Things was in control. There are several four way stops on my route that could be run through at full speed if your vehicle knew that no other vehicles were present (and the law was modified to allow such action). There are traffic lights that could be timed better if the light knew a vehicle needing to make a left turn was approaching 60 or 90 seconds before it arrived instead of entering the turn lane 10 seconds out (or a sensor at the stop line).

Going back to the four way "stop" ... if two cars approached at the posted speed limit on a collision course an algorithm could be created to slow one car slightly enough to avoid a collision instead of forcing both cars to stop at the intersection. The more the system knew about where each car was going the more traffic could be sorted. For example, if I know that after turning at a four way stop I will be slowing to turn at a driveway 100ft down the road I will often slow my approach to the four way stop to encourage a conflicting vehicle to go through ahead of me (typically the first car to stop is the first car to go so I slow my approach to make the other car stop first and feel comfortable about going before I am stopped). The alternative would be me reaching the intersection first and turning in front of a vehicle with much further to go than 100ft - slowing their travel. Under full communication that conflicting vehicle may not stop at all and pass through the four way intersection at full speed, allowing me to turn and follow them and then make the next turn.
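The slow-one-car idea above can be sketched as a toy calculation. Everything here (the function name, the numbers, and the simplification that both cars travel at the same speed) is illustrative, not taken from any real traffic-coordination system:

```python
def adjusted_speed(dist_a_ft, dist_b_ft, speed_fps,
                   car_len_ft=15.0, margin_s=1.0):
    """Two cars approach a four-way stop at the same speed on a
    collision course.  Car A has priority; return the speed car B
    should hold so it arrives after A has cleared, without stopping."""
    t_a_clear = (dist_a_ft + car_len_ft) / speed_fps  # A fully through
    t_b_arrive = dist_b_ft / speed_fps
    if t_b_arrive > t_a_clear + margin_s:
        return speed_fps  # no conflict: B keeps full speed
    # Slow B just enough to cover its distance in (t_a_clear + margin_s)
    return dist_b_ft / (t_a_clear + margin_s)
```

With car A 300 ft out and car B 600 ft out at 44 ft/s (about 30 MPH), B never needs to slow; reverse the distances and B eases down to roughly 20 ft/s instead of coming to a full stop, which is exactly the kind of sorting described above.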

There is a lot of promise to self-driving. But we are still a long way away from the benefits that I seek. The programmers are having enough trouble with the challenges of getting each single vehicle from point A to point B. Challenges with unmarked roads and roads with faded markings. Challenges with the vast majority of other vehicles and all pedestrians being under human control and humans being so unpredictable. Driving a car is not a simple task and the consequences of an error are very high. Failure is not an option.


----------



## NR4P (Jan 16, 2007)

Humans make far more errors than self-driving cars. These recent crashes are very significant, and I'm not downplaying them a bit. But in the near future these issues will be solved. After all, after many decades people still drink and drive, do drugs and drive, drive while too sleepy, take their eyes off the road, text, read newspapers (yes, I have seen that) and read maps, both paper and electronic.

I think we will all benefit from self driving vehicles soon. Especially elderly folks will no longer be home bound and their quality of life will improve.

Human error is the number one cause of traffic deaths.


----------



## James Long (Apr 17, 2003)

I would not bet on "soon" if that definition is a matter of years and not more than a decade.


----------



## NR4P (Jan 16, 2007)

James Long said:


> I would not bet on "soon" if that definition is a matter of years and not more than a decade.


I'll take that bet. Three to five years. It will be somewhat common and legal in at least a few states. Demand is too high. After all, we went from a basic rocket to men on the moon in less than a decade. And think about the tech difference now vs. then.

Demand is too great. https://jalopnik.com/california-retirement-community-is-the-latest-stomping-1819177216


----------



## scooper (Apr 22, 2002)

Too many issues for them to solve that quickly. The programmers / developers HAVE to make these work with HUMAN-controlled vehicles (heck, even just humans) with zero errors causing deaths. And don't go forcing the non-autonomous vehicle operators to have to make mods to their vehicles.

As I've stated elsewhere, they have to make these work in all weather conditions with ALL other vehicles, some of which may not be operated with the same degree of precision as the autonomous vehicles. Or you're going to need to designate what roads they can be used on.


----------



## boukengreen (Sep 22, 2009)

NR4P said:


> I'll take that bet. Three to Five Years. It will be somewhat common and legal in at least a few States. Demand is too high. After all we went from a basic rocket to men on the moon in less than a decade. And think about the tech difference now vs. then.
> 
> Demand is too great. https://jalopnik.com/california-retirement-community-is-the-latest-stomping-1819177216


What are you going to do during a snowstorm when snow blocks the camera and forces the car to stop, or the car will not move due to low-lying trees, or you're on an unpaved country road and the car that just passed you throws sand over the camera?


----------



## NR4P (Jan 16, 2007)

boukengreen said:


> What are you going to do during a snowstorm when snow blocks the camera and forces the car to stop, or the car will not move due to low-lying trees, or you're on an unpaved country road and the car that just passed you throws sand over the camera?


Cameras? There are many more sensors than that. Planes fly in blinding rain and snowstorms too. 

Cute article here 10 Astonishing Technologies that Power Google's Self Driving Cars

But there are thousands of companies working in this tech area. Lots of investment opportunities too?


----------



## James Long (Apr 17, 2003)

Elaine Herzberg would disagree with you on how well the sensors work.
(Unless you think that improvement is needed to detect and avoid more obstacles.)


----------



## SamC (Jan 20, 2003)

The purpose of science is to discover what is true. It is not to invent anything you can imagine. Many things simply are not possible. 

The quasi-religious faith that this "will" be worked out eventually is really baseless. There is no particular reason to believe that a potential invention "will" happen. It may, but these incidents (and per mile driven the death rate is many thousands of times higher than with people driving, even though the vehicles are kept in 100% maintained condition and are only used on cherry-picked roads that avoid the conditions where the system will not work at all) seem to be pointing us in the direction that this is just a pipe dream.


----------



## James Long (Apr 17, 2003)

The death rate was hard to calculate before the three people died (two drivers, one pedestrian). The fault rate remains at "0%" ... which seems to be the most human that the AI can achieve: "it was not my fault".

Perhaps the next Tesla incident will broadside a school bus and give the death numbers a boost. Tesla seems to continue to be the most "bold" of the manufacturers pushing "auto pilot". It is only a matter of time before their vehicles are involved in a non-driver fatality. Hoping that it will not happen will not prevent a fatal incident. I do not want one to happen ... but I am a realist.


----------



## yosoyellobo (Nov 1, 2006)

James Long said:


> The death rate was hard to calculate before the three people died (two drivers, one pedestrian). The fault rate remains at "0%" ... which seems to be the most human that the AI can achieve: "it was not my fault".
> 
> Perhaps the next Tesla incident will broadside a school bus and give the death numbers a boost. Tesla seems to continue to be the most "bold" of the manufacturers pushing "auto pilot". It is only a matter of time before their vehicles are involved in a non-driver fatality. Hoping that it will not happen will not prevent a fatal incident. I do not want one to happen ... but I am a realist.


Is Tesla providing liability insurance for its self driving car?


----------



## James Long (Apr 17, 2003)

yosoyellobo said:


> Is Tesla providing liability insurance for its self driving car?


At additional cost: InsureMyTesla


----------



## NR4P (Jan 16, 2007)

Interesting how many folks don't see this as a likely future. Imagine if we stopped the Apollo missions after Apollo 1 burned up on the launchpad. Or Challenger blew up in the sky. And yes, early airplanes with people on them crashed. And so do more current ones. But we fly into space or fly across the continent to visit grandma, or the globe.


----------



## James Long (Apr 17, 2003)

Apollo and Challenger killed volunteers who knew they were putting their lives on the line. We still do not have routine space travel for regular people. The closest we have come is rich people who meet stringent health standards being taken into space by the Russians as tourists. Only a select few have ever flown into space. It has been nearly 57 years since the first man went into space (April 12th). One source lists the 519 people who have ever traveled to space. When was the last time someone left Sunny Florida for space? 2011? Still hoping for the next flight? When?

It took 10 years to go from first flight (Wright Brothers) to first passenger flights (Saint Petersburg to Kiev in Russia and St. Petersburg to Tampa in Florida). There were many failures along the way but the industry was built on people willing to risk their own lives (most people killed in plane crashes are on the planes).

Vehicle accidents kill non-drivers. Pedestrians, bicyclists and motorcyclists are killed due to the carelessness of drivers of larger vehicles (cars and trucks). Self-driving cars are not putting just their occupants at risk - they are putting the lives of every person they encounter at risk. We are trusting that the system will accurately tell the difference between an overhead sign and a semi turned across the road, that the system will see a faded lane divider or a pedestrian on a darkened street, regardless of whether they should be there or not.

Perhaps you are one of those Tesla drivers who sticks their feet out of the window or reads a book while the car drives. I hope not. The system isn't ready for that level of use (and I could perform the same stupidity in any vehicle with cruise control - so no credit goes to the "self drive" for enabling stupid tricks). I don't do such stupid things because I'm not ready to risk my life or fellow road users' lives.

I believe a lot of people dream of a future with autonomous pods delivering people and packages safely to their destinations. Anyone who has trouble finding a good parking slot at work would probably love to have a car that would drop them off at the door, go park somewhere and come back when they were ready to leave. Fiction is full of examples of futuristic travel options. The trouble is with seeing this in our immediate future. We will not be seeing mass deployment outside of test environments in three to five years.

It is great to have a dream ... but one should also have a grasp on reality.


----------



## SamC (Jan 20, 2003)

NR4P said:


> Imagine if we stopped the Apollo missions after Apollo 1 burned up on the launchpad. Or Challenger blew up in the sky. And yes, early airplanes with people on them crashed.


Well in a way, we did. You list 3 inventions that worked. Do some basic research at a good college library about any of those eras, starting with publications like Popular Science or Popular Mechanics, and then go deeper into more academic publications, or lighter into things like Motor Trend ("Is 1969 the year for electric cars?" is a famous 1967 cover story) or even Time (the moon landing issue has a great artist's conception of the moon base that was certain to exist by 1976) and you will see 10,000 other ideas that simply did not work.

"If they can put a man on the moon, then..." is a truism, not a logical statement.

Not everything you can imagine can be invented.



James Long said:


> Apollo and Challenger killed volunteers who knew they were putting their lives on the line.
> 
> Vehicle accidents kill non-drivers.
> 
> ...


Excellent point. The idea that this stuff is being "tested" on the public at large should upset more people.


----------



## yosoyellobo (Nov 1, 2006)

SamC said:


> Well in a way, we did. You list 3 inventions that worked. Do some basic research at a good college library about any of those eras, starting with publications like Popular Science or Popular Mechanics, and then go deeper into more academic publications, or lighter into things like Motor Trend ("Is 1969 the year for electric cars?" is a famous 1967 cover story) or even Time (the moon landing issue has a great artist's conception of the moon base that was certain to exist by 1976) and you will see 10,000 other ideas that simply did not work.
> 
> "If they can put a man on the moon, then..." is a truism, not a logical statement.
> 
> ...


Maybe they could test it on the moon, with no people.


----------



## billsharpe (Jan 25, 2007)

I'm not ready to travel in a self-driving car yet. Lots more testing is needed.


----------



## James Long (Apr 17, 2003)

Uber disabled emergency braking in self-driving car: U.S. agency

Uber had disabled an emergency braking system in a self-driving vehicle that struck and killed a woman in Arizona in March even though the car had identified the need to apply the brakes, the National Transportation Safety Board said in a preliminary report released on Thursday.

The report into the first fatal crash caused by a self-driving vehicle also disclosed that the modified 2017 Volvo XC90's radar systems observed the pedestrian six seconds before impact but "the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path."

At 1.3 seconds before impact, the self-driving system determined emergency braking was needed. But Uber said, according to the NTSB, that automatic emergency braking maneuvers in the Volvo XC90 were disabled while the car was under computer control in order to "reduce the potential for erratic vehicle behavior."


----------



## Nick (Apr 23, 2002)

James Long said:


> Uber disabled emergency braking in self-driving car: U.S. agency
> 
> ...But Uber said, according to the NTSB, that *automatic emergency braking maneuvers in the Volvo XC90 were disabled while the car was under computer control* in order to "reduce the potential for erratic vehicle behavior."


Left-hand, right-hand?

Sheesh!

That makes as much sense as my car disabling the horn when I play the radio. No sense whatsoever!


----------



## James Long (Apr 17, 2003)

A quick calculation ... on dry pavement a car going 39 MPH would need about 72 ft to stop. A car going 39 MPH would travel about 85 ft in 1.5 seconds (the point at which emergency braking should have engaged). That is 13 ft too far, but braking may have made the accident survivable. Add in a swerve and the pedestrian could have been avoided.
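The arithmetic above can be reproduced directly, assuming a braking deceleration of about 0.7 g on dry pavement (the friction figure is my assumption, not something from the NTSB report):

```python
# Reproduce the post's numbers: 39 MPH, dry pavement, braking
# assumed at ~0.7 g (assumed friction, not from the NTSB report).
MPH_TO_FPS = 5280 / 3600                 # 1 mph = ~1.467 ft/s
G = 32.2                                 # gravity, ft/s^2
v = 39 * MPH_TO_FPS                      # ~57.2 ft/s
stopping_dist = v ** 2 / (2 * 0.7 * G)   # ~72.6 ft to brake to a stop
travel_1_5s = v * 1.5                    # ~85.8 ft covered in 1.5 s
shortfall = travel_1_5s - stopping_dist  # ~13 ft short of a full stop
```

Even without a complete stop, shedding most of that 57 ft/s over the final 85 ft would have cut the impact speed dramatically, which is the survivability point made above.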


----------

