# First Ground Up Driverless Vehicle To Be On Road in 2015



## Drucifer

*Google Self-Driving Car Project*

> The company expects the vehicle to be hitting the streets of California some time in 2015.
> 
> GOOGLE


SOURCE


----------



## yosoyellobo

I would sign up tomorrow if Google assumed most of the responsibility in the form of insurance.


----------



## satcrazy

This makes everyone a backseat driver?


----------



## Drucifer

satcrazy said:


> This makes everyone a backseat driver?


I'm fine with that as I enjoy looking out the windows.


----------



## yosoyellobo

I have never been able to read in a moving vehicle, so I hope I could watch movies; otherwise it would be sleeping and listening to music.


----------



## Laxguy

Maybe I'm too much of a control freak, but I don't see myself using it even when/if/as it becomes widespread. 

Funny- I initially read "ground-up" as in going through one of those huge metal shredding machines.


----------



## SeaBeagle

Why would this vehicle be ground up? Well, I guess there is one reason: the design is so ugly that the only thing to do is grind this vehicle up.


Sent from my iPad 4 128GB using DBSTalk mobile app


----------



## 4HiMarks

Put one of those out on I-95 at rush hour and it will be sure to be "ground up" on the road by a semi in no time.


----------



## yosoyellobo

4HiMarks said:


> Put one of those out on I-95 at rush hour and it will be sure to be "ground up" on the road by a semi in no time.


It would be interesting to see how they are planning to deal with road rage and unruly drivers.


----------



## Stewart Vernon

yosoyellobo said:


> It would be interesting to see how they are planning to deal with road rage and unruly drivers.


They could program a road-rage mode where the computer will respond with tailgating maneuvers and other antagonistic behaviors to make the human drivers feel comfortable. They could also build outward-facing displays into the windows so images of the computer flipping people off can be displayed as well.


----------



## yosoyellobo

yosoyellobo said:


> I would sign up tomorrow if Google assumed most of the responsibility in the form of insurance.


Sign me up. http://www.bbc.com/news/technology-34475031


----------



## phrelin

yosoyellobo said:


> Sign me up. http://www.bbc.com/news/technology-34475031





> In a speech in Washington DC on Thursday, the president of Volvo Cars, Hakan Samuelsson, said that the US is currently "the most progressive country in the world in autonomous driving".
> 
> However, he believes it "risks losing its leading position" because of the lack of Federal guidelines for the "testing and certification" of autonomous vehicles.


I'm not comfortable with the concept of an autonomous car, which as near as I can tell means "subject to its own laws" and "not subject to control from outside", you know, sort of "functioning as an independent organism." But then again, I have been accused of being paranoid about some technology.


----------



## dpeters11

phrelin said:


> I'm not comfortable with the concept of an autonomous car, which as near as I can tell means "subject to its own laws" and "not subject to control from outside", you know, sort of "functioning as an independent organism." But then again, I have been accused of being paranoid about some technology.


Honestly, I'm more comfortable with the car driving itself than distracted drivers and such.


----------



## Tom Robertson

Google's statistics have been phenomenal. Several million miles driven and something like 20 accidents--all the fault of the other driver. (In most of them, the Google car was rear-ended or hit from the side.)

I'm ready--sign me up too. 

Peace,
Tom


----------



## SeaBeagle

Drucifer said:


> *Google Self-Driving Car Project*
> 
> SOURCE


Why is this vehicle going to be ground up?

Sent from my iPad 4 128GB using DBSTalk mobile application.


----------



## Drucifer

For most of my driving I would like the vehicle to do the driving.


----------



## Drucifer

I read somewhere, that there is a tractor-trailer now being tested on real roads.


----------



## James Long

SeaBeagle said:


> Why is this vehicle going to be ground up?


Previous driverless vehicles have been conversions ... take an existing car and equip it with special controls. This car (and others) is designed to be driverless from the day it is first assembled. Not a conversion.


----------



## inkahauts

You know, they've had self-driving cars for many years in movies and such, and they have often needed tracks or marks in the roads to make them work in sci-fi movies. But in reality they won't need anything at all to be done to our existing roads. Kinda crazy when you think about it, really.


----------



## phrelin

dpeters11 said:


> Honestly, I'm more comfortable with the car driving itself than distracted drivers and such.


I'm ok with a car "driving itself" in the sense that means we get auto braking, accelerating and decelerating based on conditions, self-parking, and steering a pre-established track. But when tech folks start talking about an "autonomous" car I think of...

[video]

... which offers some amusement and ...

[video]

... which gives me a moment's pause.
The discussion here turned to who is going to regulate those "autonomous" vehicles which branched out into insurance based on an article about what a European manufacturer wants from American government.

What I see is the auto industry aligning itself with the tech industry to lobby Congress to take "on-the-road" regulations away from the states. Among other things, *owners* now obtain liability insurance for their cars and are ultimately liable. If a self-driving algorithm fails, perhaps the federal safety standards will lead to a recall, but I would hate to depend on getting money out of an auto manufacturer's insurer as regulated by Congress.

As I have said many times, I may be paranoid but that doesn't mean they aren't out to get me. In this case we have an executive from Volvo, a European company and maker of Mack Trucks, on behalf of the auto industry floating the idea that America is at risk of losing its lead position because "car makers face inconsistent rules from state to state, which makes it harder to roll out their technology."

What I hope is that the use of the word "autonomous" was just a language problem, but forgive me if I am suspicious of multi-national corporations with $$$$ in their eyes fantasizing about their technology.

And I won't even get into what information the tracking systems will feed directly to government agencies or your friendly neighborhood car dealer. And then, of course, there is what hackers will do followed by the "we've fixed the problem" a day or two late. We've already seen the "oops" situation in current American car computer technology.


----------



## dpeters11

I think more of KITT; that's the car I want. I actually went into a Tesla showroom; when I got in, I made a comment that it looked like Darth Vader's bathroom. The sales guy didn't pick up on the reference.

I want KITT, maybe without the weapons, but to your point, I definitely don't want KARR.

The problem seen in some of these cars is a general lack of thinking about security and firewalls. Fairly basic stuff. I think Tesla is doing it better than most. And the fact that they can update the software wirelessly is huge. I am assuming that they have the checks and balances to prevent rogue updates.

Sent from my Z30 using Tapatalk


----------



## inkahauts

phrelin said:


> I'm ok with a car "driving itself" in the sense that means we get auto braking, accelerating and decelerating based on conditions, self-parking, and steering a pre-established track. But when tech folks start talking about an "autonomous" car I think of...
> 
> [SNIP]
> 
> As I have said many times, I may be paranoid but that doesn't mean they aren't out to get me. In this case we have an executive from Volvo, a European company and maker of Mack Trucks, on behalf of the auto industry floating the idea that America is at risk of losing its lead position because "car makers face inconsistent rules from state to state, which makes it harder to roll out their technology."


I hadn't heard about the push for the same laws everywhere. What I want to know is what laws he's talking about. I need examples! Because I don't think they are really all that different state to state for the major things. But even small towns have to have different laws because of unique things in their area.

For example, all peacocks have the right of way in Arcadia, California. If you're in a car you must yield to them. I seriously doubt this is something that's even on anybody else's radar. And there's a reason for that.

If you have a link showing examples of what this guy was talking about, I would love to see it. Thanks!


----------



## phrelin

inkahauts said:


> I hadn't heard about the push for the same laws everywhere. What I want to know is what laws he's talking about. I need examples! Because I don't think they are really all that different state to state for the major things. But even small towns have to have different laws because of unique things in their area.
> 
> For example, all peacocks have the right of way in Arcadia, California. If you're in a car you must yield to them. I seriously doubt this is something that's even on anybody else's radar. And there's a reason for that.
> 
> If you have a link showing examples of what this guy was talking about, I would love to see it. Thanks!


I wish I did. His news release on the speech quoted in the article is short on details and long on polemics. I did another search, and this subject came up right under news article results for Volkswagen....


----------



## dpeters11

Some states allow turning left on red, sometimes it has to be from a one way to a one way, others allow it from a two way to a one way. Some places it's banned altogether. That might be one example, but honestly, it (at least in Ohio) comes up so infrequently that you probably can just put in the logic to never turn left on red and be fine.
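In code terms, that conservative "never turn left on red" fallback is easy to sketch. The rule table and names below are invented purely for illustration, not a real legal database:

```python
# Toy sketch of the "when in doubt, don't" policy for left turns on red.
# The per-state rules here are illustrative placeholders only.
LEFT_ON_RED = {
    "OH": "one_way_to_one_way",   # only from a one-way onto a one-way
    "WA": "two_way_to_one_way",   # also allowed from a two-way onto a one-way
    "NY": "banned",               # banned outright
}

def may_turn_left_on_red(state, from_one_way, to_one_way):
    rule = LEFT_ON_RED.get(state, "banned")   # unknown state: assume banned
    if rule == "one_way_to_one_way":
        return from_one_way and to_one_way
    if rule == "two_way_to_one_way":
        return to_one_way
    return False                              # "banned" or anything unrecognized

# The simplest safe policy is to skip the table and always return False --
# legal everywhere, at the cost of an occasional unnecessary wait.
```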


----------



## Tom Robertson

The news release that Phrelin posted mentions the certification process. Imagine 50 different requirements to be certified. 

The other area that comes to mind is liability. Right now each state sets the minimum liability requirements.

Peace,
Tom


----------



## Drucifer

phrelin said:


> I'm ok with a car "driving itself" in the sense that means we get auto braking, accelerating and decelerating based on conditions, self-parking, and steering a pre-established track. But when tech folks start talking about an "autonomous" car I think of...
> 
> ... which offers some amusement and ...
> 
> ... which gives me a moment's pause.
> 
> [SNIP]


Where's Knight?


----------



## James Long

dpeters11 said:


> Some states allow turning left on red, sometimes it has to be from a one way to a one way, others allow it from a two way to a one way. Some places it's banned altogether. That might be one example, but honestly, it (at least in Ohio) comes up so infrequently that you probably can just put in the logic to never turn left on red and be fine.


Somehow my GPS knows the speed limits on most roads and has been programmed with intersections where turns are permitted (or not). While I have seen worse GPSs, I suspect the car would have a better chance than I would of knowing the laws of all 50 states.

There are a lot of variables to program - but I am more concerned that the car would stop for a pedestrian in a crosswalk when and only when needed (are they in the crosswalk, waiting to cross behind the car or waiting for a bus?).

Autopilot cars remind me of my favorite Star Trek technology, one I have not seen mastered: doors. Yes, we have automatic doors that open when one steps on a mat or (more common today) detects motion, but they are far from perfect. The Star Trek doors recognized who was there and seemed to know where people were going and which doors needed to be opened. Consider the algorithm needed to open a door when a person wants to pass through it, but not when a person wants to walk by or stand outside of the door. The automation must know intent.

Imagine a situation where a person comes to visit, they walk up to the door and it does not open (it is polite to knock or ring the bell - or perhaps the automation detects that the visitor wants to alert the person in the room and rings the bell). The person in the room walks up to the door ... what happens? Does the door open so the person in the room can greet the guest? Or does it remain shut until the system determines (somehow) that the person in the room wants to open the door. Does the bell announce who the visitor is? "Sam is at the door." Does the system know the relationships and take that into account when deciding to open the door?

Scenario: Sam comes to visit Sally. They are friends but the system determines not to open the door for Sam. It alerts Sally that Sam is at the door. Sally steps out of the shower and walks over to the door to let Sam know that it will be a few minutes. Does the system open the door since Sally approached it? How does the system know Sally's intent? A ton of data and a good guess? And while Sally may be embarrassed if the door technology misfires and she is revealed naked in front of Sam too early in the relationship, if we can't get doors right can we get cars right?
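The intent problem can be caricatured as a scoring heuristic: every observation is only a weak signal, and the system still has to pick a threshold. Everything below (signals, weights, threshold) is invented for illustration:

```python
# Toy intent estimator for an automatic door: combine weak signals into a
# score and open only above a threshold. All weights are made up.
def door_should_open(heading_toward_door, distance_m, stopped, known_resident):
    score = 0.0
    if heading_toward_door:
        score += 0.5       # walking at the door suggests intent to pass through
    if distance_m < 1.0:
        score += 0.3       # close enough to be about to enter
    if stopped:
        score -= 0.4       # standing still may mean waiting, not entering
    if known_resident:
        score += 0.2       # residents usually want in
    return score >= 0.6    # the guess the whole scheme stands or falls on

# A passer-by walking along the wall scores low: the door stays shut.
# Sally walking up to greet Sam scores high: the door opens, whether or
# not that is what she wanted. The threshold cannot know her intent.
```

The crosswalk question is the same problem with higher stakes: "in the crosswalk", "waiting for a bus", and "waiting to cross" are all just weak signals feeding a guess.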


----------



## phrelin

James Long said:


> if we can't get doors right can we get cars right?


 :righton:


----------



## Stewart Vernon

When I get into a taxi, or a bus, if that vehicle gets into an accident I'm not liable. I'm paying/hiring the ride but the driver (and his employer) are liable for any accidents. Before I would even set foot inside an "autonomous" car that same standard would have to be met. If I'm not in control of the vehicle operation, then I'm not going to be liable.

After that... I find it ironic that the safety is touted because of taking away "human error" except humans are designing and programming it... and making it OR at least making the things that make the car... so what are the plans to eliminate "human error" there? At least human error in the vehicle operation is a dynamic thing where you have a chance to correct on the fly... there will be no on-the-fly correction for a manufacturing or programming error.

I'm not afraid of technology... but it isn't flawless either.


----------



## yosoyellobo

This reminds me of an argument I used to have with my boss about the use of a life jacket on a boat. I used to say that on a small boat I would prefer to wear one all the time, and he would not wear one at all, because he knew of a case where someone died because the life jacket he was wearing got tangled up with something and he went down with the ship. If a life jacket could save my life, I would use it. Same with a self-driving car. Either way, stuff happens.


----------



## Tom Robertson

Stewart Vernon said:


> When I get into a taxi, or a bus, if that vehicle gets into an accident I'm not liable. I'm paying/hiring the ride but the driver (and his employer) are liable for any accidents. Before I would even set foot inside an "autonomous" car that same standard would have to be met. If I'm not in control of the vehicle operation, then I'm not going to be liable.
> 
> After that... I find it ironic that the safety is touted because of taking away "human error" except humans are designing and programming it... and making it OR at least making the things that make the car... so what are the plans to eliminate "human error" there? At least human error in the vehicle operation is a dynamic thing where you have a chance to correct on the fly... there will be no on-the-fly correction for a manufacturing or programming error.
> 
> I'm not afraid of technology... but it isn't flawless either.


Perhaps the difference is the timing and frequency of the human error. When a human is in control, he/she is constantly subjected to choices, situations, inputs, etc. The computer systems have already been tested and simulated against all those situations and inputs, and have already settled on the most correct choices. The human error in the computer systems relates to the testing and simulation. After a couple million miles, most of those human errors have been driven out.

Remember--the cars we drive today are designed by humans. There are human errors in them. Yet most of us drive them anyway. 
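In software terms, that amounts to regression testing: every situation the fleet has ever encountered can become a permanent test case that each new version of the driving software must pass. A toy sketch (scenario format and planner are invented):

```python
# Sketch: every logged incident becomes a regression test for the fleet.
SCENARIOS = [
    {"name": "pedestrian_steps_out", "expected": "brake"},
    {"name": "car_runs_red_light",   "expected": "yield"},
]

def planner(scenario_name):
    # Stand-in for the driving policy under test.
    responses = {"pedestrian_steps_out": "brake", "car_runs_red_light": "yield"}
    return responses.get(scenario_name, "stop")  # unknown situation: stop

def run_regressions():
    # Return the names of every scenario the current policy gets wrong.
    return [s["name"] for s in SCENARIOS if planner(s["name"]) != s["expected"]]

# Unlike a human driver, a fix here is re-checked against every past
# scenario at once, for every car, before it ships.
```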

Peace,
Tom


----------



## James Long

There will still be humans on the road making errors that the computer will have to predict and avoid. Every time I have seen a "foolproof" computer, humanity seems to come up with a more robust fool. I expect a lot of timid driving from the driverless cars.

One thing no one wants to see from their car:

[image]

----------



## Drucifer

Stewart Vernon said:


> [SNIP]
> 
> I'm not afraid of technology... but it isn't flawless either.


While it might not be flawless, its current driving ability is how much better than a human's?

The Cali vehicle got into about 10 accidents, all of them the fault of the other vehicle, driven by a human.

So while it ain't flawless, I'll take a road full of computer driven vehicles over the human alternative anytime.

Why? Because I will feel a million times safer.


----------



## Drucifer

Tom Robertson said:


> Perhaps the difference is the timing and frequency of the human error. When a human is in control, he/she is constantly subjected to choices, situations, inputs, etc. The computer systems have already been tested and simulated against all those situations and inputs, and have already settled on the most correct choices. The human error in the computer systems relates to the testing and simulation. After a couple million miles, most of those human errors have been driven out.
> 
> Remember--*the cars we drive today are designed by humans. There are human errors in them. Yet most of us drive them anyway.*
> 
> Peace,
> Tom


There was a recent study I saw stating that the safer the roads are constructed, the more recklessly we seem to drive on them.


----------



## Christopher Gould

What happens when the computer is given two choices to avoid an accident: one possibly kills you, the other possibly kills someone else. What does it do?


----------



## yosoyellobo

Christopher Gould said:


> What happens when the computer is given two choices to avoid an accident: one possibly kills you, the other possibly kills someone else. What does it do?


The army is working on that one.


----------



## Tom Robertson

Christopher Gould said:


> What happens when the computer is given two choices to avoid an accident: one possibly kills you, the other possibly kills someone else. What does it do?


Doesn't sound like choices to "avoid an accident."

From the videos I've seen by Google, they anticipate behaviors and events earlier than humans can even see them, and they track more items, with greater accuracy.

Peace,
Tom


----------



## James Long

Sophie's choice.


----------



## phrelin

All computers and all cars are designed by humans, at least in the context we're discussing them. What I'm concerned about are the simple things - what happens when a skunk sprays the sensor surface just after the car avoids hitting it? I don't drive day in and day out on freeways - I drive on country and town roads, some of them gravel and dirt. I simply don't believe they are anywhere near close enough to turn an "autonomous" car loose on our local roads and I'm not sure they will ever be.


----------



## Stewart Vernon

Tom Robertson said:


> Perhaps the difference is the timing and frequency of the human error. When a human is in control, he/she is constantly subjected to choices, situations, inputs, etc. The computer systems have already been tested and simulated against all those situations and inputs, and have already settled on the most correct choices. The human error in the computer systems relates to the testing and simulation. After a couple million miles, most of those human errors have been driven out.


Perhaps... but if there is a flaw, the computer will not be able to correct for that flaw when it comes up and I would be helpless as well. Meanwhile, when I as a human driver in control of the car encounter something new, I have a chance to improvise. And perhaps as important, if I'm going to be held responsible I want control of the vehicle. IF the company making the car takes all responsibility, then we can talk about the other stuff.



Drucifer said:


> While it might not be flawless, its current driving ability is how much better than a human's?
> 
> The Cali vehicle got into about 10 accidents, all of them the fault of the other vehicle, driven by a human.
> 
> So while it ain't flawless, I'll take a road full of computer driven vehicles over the human alternative anytime.
> 
> Why? Because I will feel a million times safer.


I dunno. I've been driving for nearly 30 years now. In all that time I've been part of 4 accidents. Three of those I was rear-ended at a stop light. Literally no way for me to avoid those! The other accident was a no-fault accident where I was crossing an intersection and someone suddenly decided to pass around the car that had stopped to wave me across. So while I could have avoided that one by just waiting longer to cross, the accident itself was an unexpected change in the situation once I had committed to crossing the road.

So, worst case, 1 accident in 30 years was partially my fault... and only 3 others... That seems way better to me than those autonomous cars, so I'll take my driving experience over theirs!


----------



## Drucifer

Stewart Vernon said:


> I dunno. I've been driving for nearly 30 years now. In all that time I've been part of 4 accidents. Three of those I was rear-ended at a stop light. Literally no way for me to avoid those! The other accident was a no-fault accident where I was crossing an intersection and someone suddenly decided to pass around the car that had stopped to wave me across. So while I could have avoided that one by just waiting longer to cross, the accident itself was an unexpected change in the situation once I had committed to crossing the road.
> 
> So, worst case, 1 accident in 30 years was partially my fault... and only 3 others... That seems way better to me than those autonomous cars, so I'll take my driving experience over theirs!


Well, I've been driving 50+ years. Had one accident, when I had a driver's permit.

I still prefer all the vehicles on the road to be computer driven. Because computers don't make forgetful mistakes.


----------



## James Long

There are a lot of cars on the road I'd prefer to be computer driven. Most of them are barely being driven at all ... the driver seems to be doing something else. There have been many times where I've nearly been hit by someone not paying attention - and one time where "nearly" did not apply (I stopped for a line of traffic, he did not - no injuries other than his pride, but over $1000 in damages). For people who are already not driving, let their car do a better job.

As for me ... I'd rather drive, thank you very much. There are times where I would not mind turning over the wheel to the car - if the law allows that. Will there be laws against texting or cellphones or laptop use or television or sleeping when the car is driving? As long as I can't do anything else I might as well drive.

Driverless cars are perfect for people who don't drive. Perhaps they will become the next Uber. A car you do not own takes you from where you are to where you want to go and then moves on to the next person.


----------



## scooper

The wife and I were discussing this - we're more of the mind that there are times it would be nice to put the car into "autopilot mode," especially on a long trip, but for day-to-day commuting - no way, no how. Things change rapidly out there, and I'm not all that trusting of automation for life/mission-critical things that may require judgment.


----------



## Tom Robertson

Stewart Vernon said:


> Perhaps... but if there is a flaw, the computer will not be able to correct for that flaw when it comes up and I would be helpless as well. Meanwhile, when I as a human driver in control of the car encounter something new, I have a chance to improvise. And perhaps as important, if I'm going to be held responsible I want control of the vehicle. IF the company making the car takes all responsibility, then we can talk about the other stuff.
> 
> I dunno. I've been driving for nearly 30 years now. In all that time I've been part of 4 accidents. Three of those I was rear-ended at a stop light. Literally no way for me to avoid those! The other accident was a no-fault accident where I was crossing an intersection and someone suddenly decided to pass around the car that had stopped to wave me across. So while I could have avoided that one by just waiting longer to cross, the accident itself was an unexpected change in the situation once I had committed to crossing the road.
> 
> So, worst case, 1 accident in 30 years was partially my fault... and only 3 others... That seems way better to me than those autonomous cars, so I'll take my driving experience over theirs!


What "improvisation" could you possibly handle better than a system that has driven thousands of miles for every mile you've driven? You can only learn from your experiences, computer driven cars learn from each other. They would see the situation you would have to improvise before you saw it--they wouldn't need to improvise. 

They would track all the objects that are meaningful and none of the ones that aren't (like things in the car--phones, people, movies, radio, etc.)

Yes, they limited to preventing read-enders. But the computers wouldn't rear end in the first place. 

You clearly are an exceptional driver. What will you do when your insurance company charges you more because you choose to drive rather than let the computer? Eventually--a lot more.

Peace,
Tom


----------



## phrelin

scooper said:


> The wife and I were discussing this - we're more of the mind that there are times it would be nice to put the car into "autopilot mode," especially on a long trip, but for day-to-day commuting - no way, no how. Things change rapidly out there, and I'm not all that trusting of automation for life/mission-critical things that may require judgment.


I might buy a car that could be set to an autopilot mode or not, whichever the owner chooses, rather than the "autonomous" car the Swedish Volvo executive (whose first language is not English) seemed to be talking about.


----------



## James Long

Tom Robertson said:


> You can only learn from your experiences, computer driven cars learn from each other.


I can learn from the experience of others. That started decades ago in driver's ed, watching horrible crash videos (along with videos of people doing it right). The in-car training was done with three students in the car with each trainer ... the other two students could learn from the driver's mistakes when it was not their turn to drive. Not to mention the time between driver's ed and getting a regular license, when I had to drive with a licensed adult. Not to mention every trip in a car since I became aware that cars were driven ... with the opportunity to watch my parents and other drivers. Not to mention YouTube and other social media sharing of videos and descriptions of people driving. (A lot of unmentionables.)

If you are not learning from the experience of others then you are simply choosing to be ignorant.

What a computer driven car learns is filtered by a human being. The scenarios that the computer has been trained to react to are based on the decisions of the programmer. If the programmer is wrong the car is wrong.

This isn't chess, where all the legal moves can be programmed and the computer can then run through every possible game and pattern match a victory based on all the winning end games reachable from the current position of the pieces on the board. Driving is a game where people cheat. They run red lights, they turn right on red when not permitted or without stopping. They pass on the right using right-turn-only lanes. They speed. They fail to yield. They make unsafe lane changes. They fail to signal. They drive on the wrong side of the road. Etc etc etc ... the list goes on.

It is not good enough to say "Ah, the human cheated. The human was at fault. Don't blame the computer car." It doesn't matter who is at fault when you're dead or injured. You're still dead or injured.

For success the car needs to be sentient. It needs to have a sixth sense about the billions of possibilities and avoid every problem that it can. We're not at that level of technology. We don't even have computer doors that work properly 100% of the time. Siri and other helpers on our phones can't even transcribe audio correctly (when I say "thank you" as clearly as possible and see "f*** you" appear on the screen, it is time to turn off the automation).

Yes, I am setting a high bar. But I am talking about human lives. Humans should not be killed by their machines.


----------



## Tom Robertson

James Long said:


> I can learn from the experience of others. That started decades ago in driver's ed watching horrible crash videos (along with videos of people doing it right). The in car training was done with three students in the car with each trainer ... the other two students could learn from the driver's mistakes when it was not their turn to drive. Not to mention the time between driver's ed and getting a regular license where I had to drive with a licensed adult. Not to mention every trip in a car since I was aware that cars were driven ... with the opportunity to watch my parents and other drivers. Not to mention YouTube and other social media sharing of videos and descriptions of people driving. (A lot of unmentionables.)
> 
> If you are not learning from the experience of others then you are simply choosing to be ignorant.


My point is you are still learning via your experiences even if they are via means other than driving. Not everyone will see the same videos you did. Or have the same training experiences your group had. It's still your experiences.

All google cars can learn from all other google cars.



James Long said:


> What a computer driven car learns is filtered by a human being. The scenarios that the computer has been trained to react to are based on the decisions of the programmer. If the programmer is wrong the car is wrong.
> 
> This isn't chess where all the legal moves can be programmed then let the computer run through every possible game and then pattern match a victory based on all the winning end games from the current location of pieces on the board. Driving is a game where people cheat. They run red lights, they turn right on red when not permitted or without stopping. They pass on the right using right turn only lanes. They speed. They fail to yield. They make unsafe lane changes. They fail to signal. They drive on the wrong side of the road. Etc etc etc ... the list goes on.
> 
> It is not good enough to say "Ah, the human cheated. The human was at fault. Don't blame the computer car." It doesn't matter who is at fault when you're dead or injured. You're still dead or injured.


Actually the filters are not human. They are self-learning systems that do their own filtering. The overall meta-level guidance is human--"keep the car from crashing." 

And the options for each set of circumstances are not totally limitless. Physics does still rule the movements of objects. 

So the systems are able to recognize objects, know how they can behave, how they likely will behave, and how to tell if they are "cheating" as you put it. Cheating is merely another set of possibilities--which the computer will recognize faster than a human would.
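To make the fleet-learning point concrete, here is a toy sketch (the names and numbers are invented for illustration; real systems are far more sophisticated) of how experience pooled from many cars could guide a car that has never seen a situation itself:

```python
# A toy version of "all google cars can learn from all other google cars":
# every car logs (situation features, safe action) pairs to a shared pool,
# and a new car picks the action from the closest logged situation.
import math

shared_experience = []  # (features, action) pairs pooled across the fleet

def log_experience(features, action):
    shared_experience.append((features, action))

def nearest_action(features):
    """1-nearest-neighbor lookup over the fleet's pooled experience."""
    best = min(shared_experience, key=lambda pair: math.dist(features, pair[0]))
    return best[1]

# Car A once met a red-light runner; car B once met a slow cyclist:
log_experience((1.0, 0.0), "brake_hard")     # cross traffic present
log_experience((0.0, 1.0), "slow_and_pass")  # cyclist ahead
# Car C, which has seen neither, still benefits from both logs:
print(nearest_action((0.9, 0.1)))  # brake_hard
```

The point of the sketch is only that the "filter" is the pooled data itself, not a programmer hand-writing each scenario.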



James Long said:


> For success the car needs to be sentient. It needs to have a sixth sense about the billions of possibilities and avoid every problem that it can. We're not to that level of technology. We don't even have computer doors that work properly 100% of the time. Siri and other helpers on our phones can't even translate audio correctly (when I say "thank you" as clear as possible then see "f*** you" appear on the screen it is time to turn off the automation).
> 
> Yes, I am setting a high bar. But I am talking about human lives. Humans should not be killed by their machines.


Doors are easy--if you program your destination into the system. Since cars will have the knowledge of your destination, the correct doors can open.

I'm not really ignoring your Star Trek door example. Yet I am saying the door example requires the same input as driving a car--namely, a destination. Though doors could do a lot with your direction of travel. If you aren't headed directly at the door, you likely aren't trying to cross that threshold.  (The question of who can enter without knocking is a simple access control list, by the way.)
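The access-control-list idea can be sketched in a few lines (a toy illustration only; the class and names are invented for the example, not from any real product):

```python
# Toy access control list for an automatic door: each door keeps a set of
# IDs that may enter without "knocking" (requesting permission first).
class Door:
    def __init__(self, allowed_ids):
        self.allowed_ids = set(allowed_ids)  # the ACL

    def may_enter(self, person_id):
        # Anyone on the list gets the door opened immediately;
        # everyone else has to knock and wait for approval.
        return person_id in self.allowed_ids

front_door = Door(allowed_ids={"alice", "bob"})
print(front_door.may_enter("alice"))  # True
print(front_door.may_enter("eve"))    # False
```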

Peace,
Tom


----------



## James Long

I'm sorry. I cannot take you seriously when your posts are peppered with smilies.

If you want to put your life in the hands of an autonomous machine have at it ... just stay away from me.


----------



## yosoyellobo

phrelin said:


> I might buy a car that could be set to an autopilot mode or not, whichever the owner chooses, rather than the "autonomous" car the Swedish guy from Volvo seemed to be talking about (whose first language is not English).


I'd be perfectly happy to drive in autopilot mode and let the manufacturer handle the liability. Giving the driver the ability to turn off autopilot opens up a can of worms for the manufacturers and frankly would not make sense for them.


----------



## Tom Robertson

James Long said:


> I'm sorry. I cannot take you seriously when your posts are peppered with smilies.
> 
> If you want to put your life in the hands of an autonomous machine have at it ... just stay away from me.


We put our lives in the "hands" of technology all the time. If you want, you can go back to stone knives and bearskins. Doesn't sound like fun to me. 

As for driving near you... you'll be driving near many autonomous cars fairly soon. They will be safer than the human driving cars--already are, actually. So you also get to choose if you wish to be on the roads or not.

Peace,
Tom

Oh crap. I used another smilie. I guess I don't want to be grumpy tonight.


----------



## Tom Robertson

yosoyellobo said:


> I'd be perfectly happy to drive in autopilot mode and let the manufacturer handle the liability. Giving the driver the ability to turn off autopilot opens up a can of worms for the manufacturers and frankly would not make sense for them.


Me too. 

Right now, the hardest problems are snow covered roads and some gravel roads, as I understand it. There will be times when manual drive will be required.

So the insurance industry will have an interesting time developing products that account for how often you drive in manual mode vs. letting the computer drive. And insurance companies will utilize more data from the computer systems to determine "fault" and "liability" in accidents. At some point, and I suspect it could come pretty soon, it will be relatively expensive to insure a manual-only car or a car that is frequently driven in manual mode.
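As a rough sketch of how a blended, usage-based premium might work (the function and per-mile rates here are invented for illustration, not taken from any insurer):

```python
def estimate_premium(total_miles, manual_fraction,
                     manual_rate=0.10, auto_rate=0.02):
    """Blend two per-mile rates by how often the human drives.

    manual_fraction: share of miles driven in manual mode (0.0 - 1.0).
    Rates are hypothetical dollars per mile.
    """
    manual_miles = total_miles * manual_fraction
    auto_miles = total_miles - manual_miles
    return manual_miles * manual_rate + auto_miles * auto_rate

# A 10,000-mile year, mostly on autopilot vs. mostly manual:
print(round(estimate_premium(10_000, 0.1), 2))  # 280.0
print(round(estimate_premium(10_000, 0.9), 2))  # 920.0
```

Under these made-up rates, the mostly-manual driver pays more than three times as much, which is the pricing pressure being described.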

Peace,
Tom


----------



## James Long

Tom Robertson said:


> We put our lives in the "hands" of technology all the time.


Human guided technology ... typically with the humans present and often with the humans sharing the danger.



Tom Robertson said:


> As for driving near you... you'll be driving near many autonomous cars fairly soon.


No, I won't. Only one state near me has made autonomous cars explicitly legal - and only for testing. (The argument that autonomous cars have not been ruled illegal therefore they are legal everywhere has not been tested.) There will not be enough of them on the road "fairly soon" to notice. My best chance of seeing one would be on a trip to Chicago. I'd like to see how one of those cars handles the imperfections of Chicago traffic ... especially when limited to zero violations of any traffic law. Autonomous cars jamming up traffic ... that will be the headline.

I am seeing more places where flex-fuel vehicle drivers can buy something more than normal unleaded gasoline. And I have seen electric car charging stations (other than on the Internet or TV). Adopting new technology takes time ... and money. "Many autonomous cars fairly soon"? No.

There is a better chance that AT&T will shut down uverse and try to force all of their subs to get DirecTV (or vice versa) than "many autonomous cars fairly soon" where I drive. Your experience may vary.

I do not agree with your fear, uncertainty, and doubt about insurance rates going up for non-autonomous vehicles (more than the expected increases the industry would foist on us in any case). My insurance company does offer discounts for features that lower claims and save them money: airbags, theft deterrents, anti-lock brakes, safe driving, choice of vehicle (popularity to be stolen or expense of repair), age of drivers, marital status, and other statistics all affect insurance rates. There is not enough data for autonomous vehicles to offer a discount.


----------



## dennisj00

You can buy a Tesla today (at least order one) that drives itself at cruise control speeds so I don't think the state by state legality issue is valid. It does take input from you - flick the turn signal - to change lanes. And it will park itself in your garage or a public parking space.

I'm lusting after one.

Some of these posts remind me of my grandfather's stories that people didn't trust automobiles and would rather keep their horses and buggy whips.

I'm an early adopter for many things and I think we won't worry about our cars driving themselves in 5 to 10 years.


----------



## Stewart Vernon

Drucifer said:


> I still prefer all the vehicles on the road to be computer driven. Because computers don't make forgetful mistakes.


Sure they do... We have a Web site here partially devoted to computer mistakes (Dish and DirecTV receivers)!

Computers are built and programmed by people. Those people have flaws. Airplanes can have autopilots because by and large there isn't much up there at any given time to run into... so the plane can pretty much go in a straight line or a pre-defined curve without worry of collision or "running of the road." Cars have a lot more obstacles... and while people aren't perfect, neither are the computers they create.

A computer might be able to ensure the mistakes are consistent ones... unless there are bugs in the programming. There are always bugs that produce unexpected results.



Tom Robertson said:


> What will you do when your insurance company charges you more because you choose to drive rather than let the computer? Eventually--a lot more.


That's part of the rub. IF I'm in a computer-driven car I shouldn't be required to have insurance in the first place! I don't need insurance to ride in a taxi or on a bus... the driver of that vehicle has insurance. IF I'm in an autonomous car that I can't drive, I better not be held liable and I better not be required to have insurance for it. So it shouldn't be a choice of paying more or less for insurance, it should be a choice of driving and having insurance OR riding and not having any.


----------



## Tom Robertson

Stewart Vernon said:


> ...
> 
> A computer might be able to ensure the mistakes are consistent ones... unless there are bugs in the programming. There are always bugs that produce unexpected results.
> 
> That's part of the rub. IF I'm in a computer-driven car I shouldn't be required to have insurance in the first place! I don't need insurance to ride in a taxi or on a bus... the driver of that vehicle has insurance. IF I'm in an autonomous car that I can't drive, I better not be held liable and I better not be required to have insurance for it. So it shouldn't be a choice of paying more or less for insurance, it should be a choice of driving and having insurance OR riding and not having any.


These are great questions. One model would be an autonomous taxi system, where you wouldn't insure because you wouldn't own a car at all.

If you do own one, you might not have to carry liability coverage, but you would still be required to have comprehensive and uninsured motorist coverage. At least if you borrow money against it, the bank will require those coverages.

Peace,
Tom


----------



## phrelin

Tom Robertson said:


> We put our lives in the "hands" of technology all the time. If you want, you can go back to stone knives and bearskins. Doesn't sound like fun to me.
> 
> As for driving near you... you'll be driving near many autonomous cars fairly soon. They will be safer than the human driving cars--already are, actually. So you also get to choose if you wish to be on the roads or not.
> 
> Peace,
> Tom
> 
> Oh crap. I used another smilie. I guess I don't want to be grumpy tonight.


[photo: muddy debris flows crossing a roadway]
If the car has enough sonar and radar to make decisions about these kinds of situations, then I'll be interested. As it is, apparently the muddy flow in the foreground was OK to drive through, but the flow in the background was problematic. Drivers make mistakes. But that "autonomous" vehicle would likely not drive through _any_ such flow, and if you couldn't shut off the autonomy, in my part of the world there would be many people who wouldn't get home many nights in the year. Add a waddling skunk to that picture and you're asking for trouble.


----------



## inkahauts

Except people have no business driving through a mud flow when they have no way of even knowing if the road is washed out. I don't get how that is even a question, really.

Personally I think these cars should all still have regular controls so you can take over if you want to at any time. Then you have the best of both worlds IMHO in the long run.


----------



## James Long

Good luck "autonomous car":


[winter1.png]

[winter2.png]

These photos were taken on a four lane divided highway. I suspect the autonomous car would shut down and say "I'm not driving in this weather," leaving it to the human to drive--the car company not wanting to be liable for the outcome.

Imagine being in a cab and having the driver say "you drive".

Will the car company accept responsibility in all weather? Or will manual override become so common that "autonomous" cars will be a joke?


----------



## inkahauts

Why? A car can tell speed differences between wheels and so forth better than humans can. What I don't know is how it determines its location in that kind of weather, not the safety of its speed or of continuing forward.


----------



## Tom Robertson

James Long said:


> Good luck "autonomous car":
> 
> 
> 
> 
> 
> 
> 
> 
> winter1.png
> 
> 
> 
> 
> 
> 
> 
> winter2.png
> 
> These photos were taken on a four lane divided highway. I suspect the autonomous car would shut down and say "I'm not driving in this weather" leaving it to the human to drive. The car company not wanting to be liable for the outcome.
> 
> Imagine being in a cab and having the driver say "you drive".
> 
> Will the car company accept responsibility in all weather? Or will manual override become so common that "autonomous" cars will be a joke?


From what I've read, snow-covered (and some gravel) roads are a problem, but only in identifying lanes and the edge of the road. Eventually they'll solve that one too. Probably better than humans will be able to tell...

As for the driving, inkahauts is right. With proper sensors the cars will be safer with computers than humans. That said--good sensors will help humans too. 

Peace,
Tom


----------



## James Long

The more I drive the more I wonder "how would a computer handle that?". Assuming the computer will always follow the rules and the company will not want to take liability for any "gray area" maneuvers or difficult driving situations, I expect the "override" feature will be used often. The car companies will not want the liability.

What is the point of an autonomous car when it still needs a driver to take control?

There are a lot of legal issues to work out.


----------



## billsharpe

Allowing computers to drive cars will give new meaning to the words "computer crash." :nono2:

I have read that most accidents from driverless cars are caused by the other car. If we really want to avoid accidents, all cars would have to be driverless.


----------



## James Long

billsharpe said:


> I have read that most accidents from driverless cars are caused by the other car.


I thought the claim was "all". By definition the driverless car is following the rules so it cannot be at fault. If the speed limit is 35 it will drive 35.0000 MPH or less regardless of what other vehicles are doing. And regardless of what would be safer. The car company would not want the responsibility of intentionally operating at 35.1 MPH (or higher). That would be a liability. Or driving one inch left of the center line of a narrow two lane road ... despite that being safer than having tires right on the edge of the road. And if there is an accident it is the road's fault or another driver's fault ... never the car's fault.


----------



## camo

I see way too many issues arising, like detours, road construction, "merge left," "slow down to 15 mph," etc. These cars will never replace the human ability to reason or comprehend situations.
I would rather see efforts put into love bots :heart: at least if failure arises it's no different from real relationships, nor unexpected.


----------



## yosoyellobo

James Long said:


> I thought the claim was "all". By definition the driverless car is following the rules so it cannot be at fault. If the speed limit is 35 it will drive 35.0000 MPH or less regardless of what other vehicles are doing. And regardless of what would be safer. The car company would not want the responsibility of intentionally operating at 35.1 MPH (or higher). That would be a liability. Or driving one inch left of the center line of a narrow two lane road ... despite that being safer than having tires right on the edge of the road. And if there is an accident it is the road's fault or another driver's fault ... never the car's fault.


The car will more or less have the same leeway as anybody else. If everyone is going 36 MPH in a 35 MPH zone, it will also go 36. If not, I could see it becoming an early example of a driverless car being destroyed by an angry crowd fueled by road rage.


----------



## Tom Robertson

James Long said:


> The more I drive the more I wonder "how would a computer handle that?". Assuming the computer will always follow the rules and the company will not want to take liability for any "gray area" maneuvers or difficult driving situations, I expect the "override" feature will be used often. The car companies will not want the liability.
> 
> What is the point of an autonomous car when it still needs a driver to take control?
> 
> There are a lot of legal issues to work out.


Sounds like a lot of "Black/White" or "All or Nothing" thinking going on.

The presumption that all rules must be followed equally isn't realistic. For instance, "do not cross the center line" is not an absolute, there are times when situations require one to carefully cross the center line: turns, lane closures and obstructions, kids jumping out into the street, for example. The driving system knows how to prioritize the rules for safety.
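The prioritization idea can be sketched as a toy cost model (entirely illustrative; the penalties and maneuver names are made up), in which safety rules outweigh traffic-code rules:

```python
# Toy cost model: each rule violation carries a penalty, with safety
# rules weighted far above traffic-code rules. Not how any real
# self-driving stack is organized -- just the prioritization concept.
PENALTY = {
    "hit_obstacle": float("inf"),  # never acceptable
    "cross_center_line": 10.0,     # legal gray area, sometimes necessary
    "hard_brake": 1.0,             # uncomfortable but safe
}

def choose_maneuver(options):
    """options: {maneuver_name: [rules it would violate]}.
    Return the maneuver with the smallest total penalty."""
    def cost(violations):
        return sum(PENALTY[v] for v in violations)
    return min(options, key=lambda m: cost(options[m]))

# A kid jumps into the street; staying in lane means a collision:
options = {
    "hold_lane": ["hit_obstacle"],
    "swerve_left": ["cross_center_line"],
    "brake_and_swerve": ["cross_center_line", "hard_brake"],
}
print(choose_maneuver(options))  # swerve_left
```

Because collision carries an effectively infinite penalty, "do not cross the center line" loses to "avoid the collision," exactly as in the examples above.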

I'm curious what other situations you've identified that the computer won't be taught to handle. Your first one, snow covering the lane and edge markers, is a problem they have acknowledged. I know they will have a solution, though I don't know what it will be. A mixture of optical, laser, and radar sensors? Changes to the road surfaces in snowy climates? Even better GPS mapping?

As for the legal issues, yes there will be some. Google, Tesla, GM, and the other car makers will lobby for changes. As will the insurance institute. They've pushed for many of the current safety regulations.

Peace,
Tom


----------



## Tom Robertson

camo said:


> I see way too many issues arising, like detours, road construction, "merge left," "slow down to 15 mph," etc. These cars will never replace the human ability to reason or comprehend situations.
> I would rather see efforts put into love bots :heart: at least if failure arises it's no different from real relationships, nor unexpected.


Navigation units already know how to handle road construction, detours, traffic slowdowns, etc. Those won't be a problem for autonomous cars.

How many driving situations are truly new, never ever seen before? You might not have seen them, yet after 1.2 million miles, google has likely seen something like it. 

And they are adding the equivalent of a year's worth of driving each week. Now, from what I understand, they aren't driving on highways yet, so they have a large range of situations left to experience. Though they may have had cars watching human drivers on highways already. I haven't seen anything regarding their highway simulations.

Peace,
Tom


----------



## phrelin

Sigh...



I didn't even get a chance to start a fight on the internet about "autonomous" but here's the definition again:


I would invite others to look at the Thesaurus.com listing of synonyms and antonyms for "autonomous." Perhaps some can explain to me why they would really want an "autonomous" car - a term synonymous with self-governing, sovereign, free, self-determining, self-ruling, uncontrolled, as opposed to subservient, dependent, subject.

Perhaps we should use some term like "unassisted" instead to reflect the state of the car's "operation", not its state of "being." Or maybe we actually mean to allow the cars on the road without even any supervision or control like other autonomous beings.


----------



## dennisj00

Y, it's probably the wrong word. Probably something like AutoDrive or AutoCruise or even AutoPilot would be better.


----------



## James Long

Tom Robertson said:


> Navigation units already know how to handle road construction, detours, traffic slowdowns, etc. Those won't be a problem for autonomous cars.


The key word is "avoid" ... and while one's navigation unit may say "you are on the quickest route," staying between the barrels and off of the bumpers of the cars around you is all up to the driver.

There will be some benefit to "self driving" cars ... but the pie-in-the-sky, all-in buy-in to the concept, as if it will solve all of the world's problems, is unrealistic. The reality is closer to "cruise control": a tool that will help in some situations ... not a be-all, end-all solution that will take over driving to the point where nobody drives any more.

Humans are too autonomous to leave ALL of the driving to machines. And too litigious to have car companies take all of the responsibility for the car's operation.

It is illegal to drive 36 MPH in a 35 MPH zone. If I as a programmer tell my car that it is OK to violate the speed limit and set rules for when and where that can be done I open myself and my company up to liability. Any accident, whether caused by a car I programmed or not, where my car was speeding or crossed the center line or passed on a double yellow line or did anything else illegal will be blamed on my car and my programming. It doesn't matter if Google cars have 1.2 million miles of experience or 120 million miles when it comes down to liability. The car will be programmed to appease the lawyers.

Two lane road ... no passing zone ... bicycle or pedestrian in the lane ahead. The only legal solution is to slow to the obstruction's speed until the end of the no passing zone (or until the obstruction has left the roadway). Will your lawyers allow you to teach your car to violate a no passing zone so you can pass the obstruction? I doubt my lawyers will let me take that risk. "I cannot advise you to violate any law."


----------



## yosoyellobo

James Long said:


> The key word is "avoid" ... and while one's navigation unit may say "you are on the quickest route," staying between the barrels and off of the bumpers of the cars around you is all up to the driver.
> 
> There will be some benefit to "self driving" cars ... but the pie-in-the-sky, all-in buy-in to the concept, as if it will solve all of the world's problems, is unrealistic. The reality is closer to "cruise control": a tool that will help in some situations ... not a be-all, end-all solution that will take over driving to the point where nobody drives any more.
> 
> Humans are too autonomous to leave ALL of the driving to machines. And too litigious to have car companies take all of the responsibility for the car's operation.
> 
> It is illegal to drive 36 MPH in a 35 MPH zone. If I as a programmer tell my car that it is OK to violate the speed limit and set rules for when and where that can be done I open myself and my company up to liability. Any accident, whether caused by a car I programmed or not, where my car was speeding or crossed the center line or passed on a double yellow line or did anything else illegal will be blamed on my car and my programming. It doesn't matter if Google cars have 1.2 million miles of experience or 120 million miles when it comes down to liability. The car will be programmed to appease the lawyers.
> 
> Two lane road ... no passing zone ... bicycle or pedestrian in the lane ahead. The only legal solution is to slow to the obstruction's speed until the end of the no passing zone (or until the obstruction has left the roadway). Will your lawyers allow you to teach your car to violate a no passing zone so you can pass the obstruction? I doubt my lawyers will let me take that risk. "I cannot advise you to violate any law."


So you are saying that because of lawyers we will never have self driving cars.


----------



## James Long

yosoyellobo said:


> So you are saying that because of lawyers we will never have self driving cars.


I am saying that the cars will be operated in the safest way possible to appease the lawyers.
If the safest way possible to operate is to pull over and park, so be it.


----------



## Tom Robertson

James Long said:


> I am saying that the cars will be operated in the safest way possible to appease the lawyers.
> If the safest way possible to operate is to pull over and park, so be it.


Fortunately innovators don't give up quite so easily. 

I'm pretty sure the cars won't be able to violate speed limits. There rarely is a sufficient reason to speed in the course of driving safely.

As for the no-passing zone, I think that only applies to not passing a motor vehicle. Passing a pedestrian or bicycle in a no passing zone is not illegal. And since the self-driving cars won't cause an accident--there isn't a problem with their passing when it's safe. This is hardly a unique, once in a lifetime, the programmers will never have thought about it type problem. Nor is driving in a barrel marked lane, for that matter.

I don't think anyone, other than you, is saying this "will solve all of the world's problems". We know there will be limitations that will be fewer and fewer all the time. And that there are plenty of other world problems to solve.

And we know there are people who will fight technology and not use these cars. Ok. Each generation has their people who stick to their version of "would rather keep their horses and buggy whips." (Thanks, dennisj00). 

Peace,
Tom


----------



## Beerstalker

I agree passing a pedestrian or bicyclist in a no passing zone is most likely legal; however, most states have laws requiring you to stay at least 3 feet away from them. So being 3 feet away from them without crossing the solid yellow line is going to be pretty much impossible. My guess is that the auto driving car will not be willing to cross that solid yellow line, and it will instead slow down and stay behind the obstruction until you are outside of the no passing zone, or until the obstruction moves over to the shoulder.
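A back-of-the-envelope check shows why the 3-foot rule usually forces a car over the line (all dimensions are rough, invented numbers in feet, purely for illustration):

```python
# Can a car give a cyclist 3 ft of clearance without its left side
# leaving the lane? The cyclist's envelope (body + handlebars) occupies
# the right edge of the lane; the car must fit in what's left, minus
# the required clearance.
def can_pass_in_lane(lane_width, car_width, cyclist_envelope,
                     required_clearance=3.0):
    room_for_car = lane_width - cyclist_envelope - required_clearance
    return room_for_car >= car_width

# An 11 ft rural lane, a 6 ft wide car, a cyclist occupying about 2.5 ft:
print(can_pass_in_lane(11.0, 6.0, 2.5))  # False -> must cross the line
# Only an unusually wide 14 ft lane leaves room:
print(can_pass_in_lane(14.0, 6.0, 2.5))  # True
```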

This brings up another one that I deal with all the time and wonder how these cars will handle. I pass farm implements on the road all the time; many times I have to drive off the road into the ditch in order to get past. I'm not just talking about passing vehicles going in the same direction, but vehicles coming from the opposite direction too. What are these vehicles going to do in these cases? My guess is they'll pull over and stop until the farm equipment goes by if travelling in opposite directions, or refuse to pass altogether if going in the same direction.


----------



## phrelin

Beerstalker said:


> I agree passing a pedestrian or bicyclist in a no passing zone is most likely legal, however, most states have laws like you have to be at least 3 feet away from them, etc etc. So being 3 feet away from them without crossing the solid yellow line is going to be pretty much impossible. My guess is that the auto driving car will not be willing to cross that solid yellow line, and it will instead slow down and stay behind the obstruction until you are outside of the no passing zone, or the obstruction decides to move over to the shoulder, etc.
> 
> This brings up another one that I deal with all the time that I wonder how these cars will deal with. I pass farm implements on the road all the time, many times I have to drive off the road into the ditch etc in order to get past. I'm not just talking about passing vehicles going in the same direction, but vehicles coming from the opposite direction too. What are these vehicles going to do in these cases. My guess is pull over and stop until the farm equipment goes by if travelling in opposite directions, or refuse to pass them altogether if going in the same direction.


I hate to say this, but it sounds more and more like the discussion is about a literally "autonomous" mode of transportation. Might I suggest a horse? When I had a horse, it made all kinds of decisions for me, not always what I wanted it to do. That's why I prefer a car.


----------



## dennisj00

As an avid bicyclist (~4000 miles / year) I find there are two groups of drivers on rural roads. The first will follow you forever if there's a double yellow line, the second will pass no matter what.

Actually, there's a third group, the ******** that will honk, yell and throw stuff as they pass. GoPros help capture their license plate!

There are also two classes of auto-drive cars. The first has no controls--think taxis for urban areas that take you from point A to B without the creepy Uber driver. That one is further in the future.

The second, like Tesla's auto-drive with conventional controls that beeps if it needs your input. That one is pretty much available today.


----------



## James Long

Tom Robertson said:


> I think that only applies to not passing a motor vehicle. Passing a pedestrian or bicycle in a no passing zone is not illegal.


You think or you know? What would the car company's lawyer say? How much liability are they willing to accept?



Tom Robertson said:


> And since the self-driving cars won't cause an accident--


That is one of those overconfident pie in the sky statements that I am talking about. You and people who agree with you are so confident that computers are better drivers than people. And so dismissive of any issue that would stand in the way of self driving vehicles.



Tom Robertson said:


> And we know there are people who will fight technology and not use these cars. Ok. Each generation has their people who stick to their version of "would rather keep their horses and buggy whips." (Thanks, dennisj00).


Insulting comments aside, this is not about keeping the horses and buggy whips (although I'd like to see what a self driving car would do while passing a buggy). It is about accepting the limitations of the technology and not glossing over SERIOUS issues that face self driving vehicles.

Speaking of buggies, I drive in an area of the country with a lot of horse-drawn vehicles. I have learned how to safely pass these vehicles traveling in either direction (passing or meeting opposing traffic). My preference for how to pass a buggy works. By your own standards my procedure must be good because I have never hit a buggy. No accident is proof of concept?

However one day earlier this year I watched a semi truck hit a buggy and kill the horse. The driver of the semi operated his vehicle 100% within the law ... refusing to cross the center line to give the buggy extra room. The horse was spooked and had no room for error. A self driving car would probably make the same MISTAKE of operating 100% within the law and causing the death of the horse (perhaps the car passengers too since car vs horse has caused car passenger fatalities).

It is only a matter of time until a self driving car is the cause of a death. Perhaps not legally at fault, as in the incident described above, but still a death that DID NOT NEED TO OCCUR. You can happily blame the horse ... the witnesses can spend the rest of their lives trying to forget the sight of a dying horse. Driving is a serious task, not a joke.


----------



## James Long

dennisj00 said:


> As an avid bicyclist (~4000 miles / year) I find there are two groups of drivers on rural roads. The first will follow you forever if there's a double yellow line, the second will pass no matter what.


I prefer a happy medium. Wait a safe distance back for an opportunity. Pass only when I know that it is safe and give the entire lane to the bicycle (or pedestrian or buggy). Not necessarily waiting for the end of the no-passing zones.



dennisj00 said:


> The second, like Tesla's auto-drive with conventional controls that beeps if it needs your input. That one is pretty much available today.


That is what I expect for the near future (next decade or more). "Self drive" will be advanced cruise control with the car's human driver responsible (and liable) for the safe operation of the vehicle.


----------



## phrelin

I'm reading here about the wonderfulness of the Tesla system. Actually we have this article, *Hands-off drivers post videos of swerving while testing out Tesla's autopilot*, which offers several videos and notes:



> Musk said at the press conference that a fully autonomous Model S could be out by 2018.
> 
> While the autopilot is clearly in beta mode, the company also got an unrelated blow from _Consumer Reports_ yesterday when they rated the Model S on reliability, finding that the car had "too many problems to recommend."
> 
> They'll figure it out eventually, but until then, keep your hands on the wheel.


_Road & Track_, in an earlier article with video, *This Tesla Autopilot Close Call Shows We Need To Be Responsible With Autonomous Tech*, noted:



> Another thing Musk said and Tesla's website explicitly states is that this version of Autopilot is a beta and really isn't meant for complete hands-off use. Now, this is presumably largely to relieve itself of any liability issues and not tick off regulators, but Tesla's knows it also can't oversell this current version of Autopilot to consumers. The only problem is that Tesla held a bunch of test drives with journalists this week (of which one of our editors attended) where drivers were allowed to fully remove their hands from the wheel for lengthy periods of time. You've surely seen one of the videos by now. They're everywhere.
> 
> This is an issue. It's an issue because it's a case of do as I say not as I do, and it encourages Model S owners to perhaps put a little too much faith in their car without really understanding how Autopilot works and when it doesn't. With this launch, the company-and to be fair, plenty of press-made a spectacle of a technology that is very, very serious. Make no mistake: There are many benefits to semiautonomous and autonomous cars, and they are coming. But while everyone is holding their hands up in the air and laughing and talking about how creepy Autopilot is, what gets lost is the gravity of the technology and the fact that lives are still ultimately at stake. It's not a parlor trick.


The technology is experimental, "not yet ready for prime time." It's promising. But we're a long ways from "releasing the reins and giving the horse his head."

:rant:
The reason I object to the use of "autonomous" is that it is a misuse of a word for "hype"; it is not being used by researchers, but by the marketing types. The Steve Jobs approach was fine because in the end who gives a crap if an iPad doesn't work. But now we're maybe out there with real "angry birds," whose tree, nest and chicks included, just got knocked down because the "autonomous" vehicle decided to drive off the road into it, killing the driver and his kids who were eating burgers and fries and watching TV. Musk and the others cannot use the Steve Jobs marketing system because most people still don't understand things like "beta".
:rant:


----------



## Stewart Vernon

We have automated manufacturing lines that can't consistently manufacture parts without error. I got a big bag of Tootsie Rolls the other day and several of them were incorrectly wrapped, with wrappers that were cut significantly off-center. I also bought a pack of Ramen noodles recently and the seasoning packages inside all of them were miscut... a couple were so bad that the perforation was in the middle of the package and the seasoning had already spilled out inside with the noodles!

I mention these because... this is stuff that has been around longer than I have been alive. Competent people would not make these manufacturing errors OR if they did accidentally miscut something, they could make sure those didn't end up in the shipping boxes to customers. The computers and automated manufacturing lines clearly can't manage this to perfection... so imperfections go through.

So, we have multiple levels of problems with the automated cars.

1. People are designing and programming them.
2. People are designing and constructing the machines that will produce the components used to assemble them.
3. People are testing them.
4. People will be maintaining them once they are in service.

Even if you assume the design and programming are flawless, there are so many other opportunities to introduce problems along the way... then there's wear: how will the same car perform after it has been on the road a while and its mechanicals are functioning differently than the design spec says?

And as we have noted... I do not want liability if I'm not the driver. The manufacturer isn't going to want liability if their car isn't in complete control, and even then they are going to look for ways to say "the driver should have seen a dangerous situation and taken steps to avoid it," and since we know there will be an accident at some point, that will be a fun day in court for those involved.

Trains have fixed paths (they must stay on the track) so there is less opportunity for error, and yet trains do sometimes jump tracks, and computer scheduling on shared tracks still allows for the potential for conflicts when all doesn't go to perfection. Cars are going to be a very tricky thing to make completely self-driving in a way that works, is safe, and becomes accepted by everyone. Also, what about all the people who love to drive for fun? If they aren't driving, where's their fun?


----------



## Tom Robertson

James Long said:


> You think or you know? What would the car company's lawyer say? How much liability are they willing to accept?





James Long said:


> I prefer a happy medium. Wait a safe distance back for an opportunity. Pass only when I know that it is safe and give the entire lane to the bicycle (or pedestrian or buggy). Not necessarily waiting for the end of the no-passing zones.


Sounds like an excellent approach. Yet why do you seem to feel such can't be implemented in a self-driving system? You've hit upon a solution that is safe, so why wouldn't the lawyers be satisfied with a safe solution? There isn't liability when there isn't an accident.



James Long said:


> That is one of those overconfident pie in the sky statements that I am talking about. You ... are so confident that computers are better drivers than people.


Yes. Yes, I am. Or rather that they can be. Google is doing it right. Very carefully, very studiously, baby steps.



James Long said:


> And so dismissive of any issue that would stand in the way of self driving vehicles.


In an adult conversation, isn't there the possibility for people to share knowledge that refutes issues that have already been dealt with? Or that shows how an issue isn't an issue at all? You have raised one acknowledged issue--snow (and gravel). Google has identified those as an issue. I've confirmed it, not dismissed it in the least.

(And I've asked for your list of other topics where you thought a computer couldn't handle the situation. Part of showing a genuine interest in adult conversation.)



James Long said:


> ...this is not about keeping the horses and buggy whips (although I'd like to see what a self driving car would do while passing a buggy). It is about accepting the limitations of the technology and not glossing over SERIOUS issues that face self driving vehicles.


Not everyone feels all issues are still SERIOUS, especially when there are simple solutions to the issues. Pointing those solutions out is not glossing over them. It is conversing about them.

By the way, I suspect dennisj00's grandfather's stories were about people who avoided cars because they too thought the issues were real and serious. Yet the issues were solved or never real.



James Long said:


> Speaking of buggies, I drive in an area of the country with a lot of horse-drawn vehicles. I have learned how to safely pass these vehicles traveling in either direction (passing or meeting opposing traffic). My preference for how to pass a buggy works. By your own standards my procedure must be good because I have never hit a buggy. No accident is proof of concept?
> 
> However one day earlier this year I watched a semi truck hit a buggy and kill the horse. The driver of the semi operated his vehicle 100% within the law ... refusing to cross the center line to give the buggy extra room. The horse was spooked and had no room for error. A self driving car would probably make the same MISTAKE of operating 100% within the law and cause the death of the horse (perhaps the car passengers too, since car vs horse collisions have caused car passenger fatalities).
> 
> It is only a matter of time until a self driving car is the cause of a death. Perhaps not legally at fault as in the incident described above, but still a death that DID NOT NEED TO OCCUR. You can happily blame the horse ... the witnesses can spend the rest of their lives trying to forget the sight of a dying horse. Driving is a serious task. Not a joke.


Let's modify your example--for the simple reason that you have an approach that is safe: giving the whole lane to the horse and buggy. What if the semi had done the same? And, as the semi passed the horse, fully in the other lane, the horse still spooked, went off the edge or into the truck or wherever? And thus died. (For simplicity, let's make it also a given that the section of road was a passing zone.) So the semi is 100% legal, the approach is nearly perfectly safe, yet the horse still spooked. Horse still died.

There will be deaths. People, horses, dogs, cats die on a consistent basis--100% will die. You can't prevent them all.

But what if we could take the 30,000 car deaths each year and reduce them to 100? Would all-or-nothing thinking stop self-driving cars because there might, possibly, be situations where a "self-driving car is the cause of a death. Perhaps not legally at fault...?" Or would sane, reasoned thinking say, "100 is a whole lot better than 30,000?"



James Long said:


> That is what I expect for the near future (next decade or more). "Self drive" will be advanced cruise control with the car's human driver responsible (and liable) for the safe operation of the vehicle.


"Advanced cruise control" is available today. Real self-driving cars are in real-world tests today; Google is in two known cities: Mountain View, California, and Austin, Texas. I haven't heard details about anyone else's tests beyond knowing they are ongoing.

Manufacturers are saying things like 2016 (as noted above), 2018, and "within 4 years." So, while I can see some of those dates slipping, I don't see them extending a full decade.

Peace,
Tom


----------



## dennisj00

Even cutting 30,000 auto deaths to 15,000 (just a selected number) would be an awesome reduction. Nothing to argue about. And it looks like it would be a higher reduction than that.


----------



## James Long

Tom Robertson said:


> Manufacturers are saying things like 2016 (as noted above), 2018, and "within 4 years." So, while I can see some of those dates slipping, I don't see them extending a full decade.


I see the dates slipping ... for all the reasons already stated in this thread.

But here is where the state of the art is now ... on the closed course:





And with a pilot in the car for safety:





http://www.edmunds.com/car-news/self-driving-cars-may-redefine-drivers-licenses.html

BTW: Don't forget that horseless carriages were required to have someone walk out in front of them until they were accepted by society and deemed safe. That is where we are with self driving cars. It will take time for them to be accepted.


----------



## Drucifer

I can't wait for the future when some drivers will demand they have the right to speed.


----------



## James Long

Drucifer said:


> I can't wait for the future when some drivers will demand they have the right to speed.


We have already had that thread (see the thread about cracking down on left lane drivers).
A nice debate over whether driving the speed limit in the left lane should be illegal.


----------



## Tom Robertson

There are a number of reasons I'm so excited by self-driving cars--aside from the fact that I don't like driving most of the time. Most of them revolve around the approach Google is taking and their TED talk: https://www.ted.com/talks/chris_urmson_how_a_driverless_car_sees_the_road?language=en showing how the car "senses" typical situations. Very impressive! They are already to the point of sensing things well before humans can--and thus intelligently driving defensively. The full TED talk is 15:29. The best situation analysis starts at 7:10-ish.

Here is another video showing situations they already have designed around: https://youtu.be/bDOnn0-4Nq8

And a statistic I heard a number of years ago--80% of motorcycle accidents were human error on the part of the motorcyclist. (Taking into account all the cases of driving in poor conditions for a motorcycle, driving while impaired, misjudging the surface, etc.)

Here is Google's general site for the project: https://www.google.com/selfdrivingcar/ There you can read the monthly status reports, all the accident reports, Google's approach, and many other goodies.

Peace,
Tom


----------



## Tom Robertson

James Long said:


> I see the dates slipping ... for all the reasons already stated in this thread.
> 
> But here is where the state of the art is now ... on the closed course:


This is the state of the art in a marketing situation where they won't put non-Google employees at risk, using cars that aren't street legal yet. Not the state of the art as it exists on real streets today. (Yes, real streets still require a manual mode, sometimes with safety escorts nearby--but the cars can intelligently handle many situations--more each week.)



James Long said:


> And with a pilot in the car for safety:
> 
> 
> 
> 
> 
> http://www.edmunds.com/car-news/self-driving-cars-may-redefine-drivers-licenses.html
> 
> BTW: Don't forget that horseless carriages were required to have someone walk out front of them until they were accepted by society and deemed safe. That is where we are with self driving cars. It will take time for them to be accepted.


The second video is one of the examples that has me excited. It is a short form of the TED Talk from March of this year--7 months ago! Look at how they already sense normal, everyday things. Notice the examples, right along the lines of the ones we've been discussing here. Safely handling all the situations.

In the Ted Talk they show how they pick up on things humans miss or before humans would see them. That is why I know these will be safer than human driven cars at some point.

Yes--in some states/cities there are still blue laws requiring horseless carriages to be led by a person. Laws passed by people who were frightened by technology rather than accepting of it. "Oh, my! They run on explosives!!" Just so, people frightened by self-driving cars will be just as out of touch with technology advances.

Peace,
Tom


----------



## Tom Robertson

dennisj00 said:


> Even cutting 30,000 auto deaths to 15,000 (just a selected number) would be an awesome reduction. Nothing to argue about. And it looks like it would be a higher reduction than that.


Bingo!

Peace,
Tom


----------



## yosoyellobo

Drucifer said:


> I can't wait for the future when some drivers will demand they have the right to speed.


Yesterday was Back To The Future Day and I am pissed off we don't have flying cars, but I'll take a self-driving one in a few years.


----------



## Tom Robertson

yosoyellobo said:


> Yesterday was Back To The Future Day and I am pissed off we don't have flying cars, but I'll take a self-driving one in a few years.


Speaking of flying cars, which actually has a tie-in to this discussion, flying cars "seem" so very close. One of the concerns about self-driving cars is the legislation--yet in the case of flying cars, the FAA already has a flying car pilot's license available. It is limited in altitude among other things, as I understand it. Yet not as difficult to obtain--when we finally get flying cars.

Before anyone mentions flying cars as an example of how self-driving cars will "seem" to be close for a long time, flying cars don't have Google's incredible resources and backing. Or Tesla's. Or Mercedes-Benz's.

Peace,
Tom


----------



## Drucifer

*Transcontinental US Record Aboard Tesla Model S P85D with Autopilot Claimed*



> Earlier this year, Carl Reese and Deena Mastracci made a 3,011-mile (4,846-km) drive from Los Angeles to New York in 58 hours and 51 minutes in a Tesla Model S P85D to set a record for the coast-to-coast journey for an electric vehicle. Now Reese and Mastracci, joined by Alex Roy, have beaten that record with the help of Tesla's new Autopilot software.
> 
> . . . .


----------



## phrelin

Tom Robertson said:


> Before anyone mentions flying cars as an example of how self-driving cars will "seem" to be close for a long time, flying cars don't have Google's incredible resources and backing. Or Tesla's. Or Mercedes-Benz's.
> 
> Peace,
> Tom


It is exactly those "resources" that should cause people to pause and reflect. Those resources are "money" which, yes, may buy good employees and equipment for design and research, but also fund marketing and lobbying and profits. Right now they are doing the marketing and lobbying because they've invested a lot of money in "unassisted" motor vehicle R&D, an investment from which they are being pressured to start deriving earnings.

Maybe you are comfortable with it, but the company run by that guy from Sweden - Volvo - also makes Mack trucks. Just imagine the possible earnings if you could launch a fleet of tractor-trailers that drive themselves. The only issue will be: what loss level is acceptable in the beginning? And by loss, I don't mean trucks that stop at the wrong warehouse. I mean "oops" deaths per million miles driven. Volvo will gladly take on the liability through an independent cooperative insurance company.

The curious issue is that our roads are falling apart but we avoid that issue and will gladly succumb to the same urge that leads us to buy the new iPhone. We will want to run out and start buying computer controlled vehicles driven with no assistance.

My wife and I have a preorder in for two shiny new Surface Pro 4's to replace our not-even-slightly-functionally-outdated Surface Pro 2's, partly because the 4 has a nifty cool pen thingy we want to try out. The autonomous car may well be a lifesaver for us. We'll be due for a new car in a few years if we're still alive. I'm sure we will have that impulse to buy an "autonomous" car because frankly we may or may not be the greatest drivers because of age. It might be a great option for folks like us.

But the long term profit is going to be in that fleet of autonomous tractor-trailers. So I'm on the side of waiting until "they" get back to work and quit the marketing stunts like this one:



Drucifer said:


> *Transcontinental US Record Aboard Tesla Model S P85D with Autopilot Claimed*


There are plenty of hair-raising videos out there right now of failures on the part of the Tesla Autopilot. I posted links to articles with them above.


----------



## Drucifer

Tom Robertson said:


> Speaking of flying cars, which actually has a tie-in to this discussion, flying cars "seem" so very close. One of the concerns about self-driving cars is the legislation--yet in the case of flying cars, the FAA already has a flying car pilot's license available. It is limited in altitude among other things, as I understand it. Yet not as difficult to obtain--when we finally get flying cars.
> 
> Before anyone mentions flying cars as an example of how self-driving cars will "seem" to be close for a long time, flying cars don't have Google's incredible resources and backing. Or Tesla's. Or Mercedes-Benz's.
> 
> Peace,
> Tom


For flying vehicles, pilotless is definitely my preference. Because right now I feel safe sleeping in my home. Put the nuts I see on the roads now in the air and I will be looking for a cave to live in.


----------



## Tom Robertson

phrelin said:


> It is exactly those "resources" that should cause people to pause and reflect. Those resources are "money" which, yes, may buy good employees and equipment for design and research, but also fund marketing and lobbying and profits. Right now they are doing the marketing and lobbying because they've invested a lot of money in "unassisted" motor vehicle r&d, an investment in which they are being pressured to start deriving earnings from.
> 
> Maybe you are comfortable with it, but the company run by that guy from Sweden - Volvo - also makes Mack trucks. Just imagine the possible earnings if you could launch a fleet of tractor-trailers that drive themselves. The only issue will be: what loss level is acceptable in the beginning? And by loss, I don't mean trucks that stop at the wrong warehouse. I mean "oops" deaths per million miles driven. Volvo will gladly take on the liability through an independent cooperative insurance company.
> 
> The curious issue is that our roads are falling apart but we avoid that issue and will gladly succumb to the same urge that leads us to buy the new iPhone. We will want to run out and start buying computer controlled vehicles driven with no assistance.
> 
> My wife and I have a preorder in for two shiny new Surface Pro 4's to replace our not-even-slightly-functionally-outdated Surface Pro 2's, partly because the 4 has a nifty cool pen thingy we want to try out. The autonomous car may well be a lifesaver for us. We'll be due for a new car in a few years if we're still alive. I'm sure we will have that impulse to buy an "autonomous" car because frankly we may or may not be the greatest drivers because of age. It might be a great option for folks like us.
> 
> But the long term profit is going to be in that fleet of autonomous tractor-trailers. So I'm on the side of waiting until "they" get back to work and quit the marketing stunts like this one:
> 
> There are plenty of hair-raising videos out there right now of failures on the part of the Tesla Autopilot. I posted links to articles with them above.


Basically it sounds like we need to evaluate each company's approach separately. Much as we evaluate individual company's products uniquely. 

Google is doing it about as "right" as I can imagine. Slow, methodical steps, leading up to today where they need the marketing stunts for the feedback to the designers. And to educate the people on the streets of Mountain View and Austin as to what they will see. Sounds perfectly reasonable.  One report has it that Google has learned from the failure of Glass, though I suspect there is more interest in cars than smart glasses.

Tesla seems to be marketing before it's ready. That is scary.

I'm not sure about where other companies are. Other manufacturers are being asked about cars by the press more so than marketing them to the press.

Peace,
Tom


----------



## mrdobolina

James Long said:


> Every time I have seen a "fool proof" computer humanity seems to come up with a more robust fool.


Perfectly stated!


----------



## yosoyellobo

I wonder what the auto insurance industry's position on driverless cars is going to be. If the manufacturers are successful, the Geico gecko is roadkill; it's what they do.


----------



## Tom Robertson

yosoyellobo said:


> I wonder what the auto insurance industry's position on driverless cars is going to be. If the manufacturers are successful, the Geico gecko is roadkill; it's what they do.


They won't be selling liability products (or perhaps will be selling reduced liability for ignoring maintenance requirements), yet there are still going to be uninsured motorist and comprehensive coverage requirements. Especially for cars that are financed.

Peace,
Tom


----------



## James Long

> The Alliance of Automobile Manufacturers, which represents a dozen carmakers, is instead looking to the federal government to become involved. The goal, automakers say, is to avoid a patchwork of laws around the country.
> 
> But Congress has been quiet on the issue, and the federal agency in charge of road safety, the National Highway Traffic Safety Administration, is simply trying to understand a technology that is racing ahead of the law.
> 
> "We have a lot of catching up to do," Mark R. Rosekind, the head of the safety agency, told reporters last month. "The first time somebody gets hurt or someone is fatally injured, we are the ones who are going to get the phone call."
> 
> source


A nice summary of current laws:
http://cyberlaw.stanford.edu/wiki/index.php/Automated_Driving:_Legislative_and_Regulatory_Action


----------



## Drucifer

*Self-Driving Car Guides Itself Through 2,400 km Journey Across Mexico*



> A self-driving vehicle developed by researchers in Germany has undertaken a monster 2,400 km (1,500 mi) journey from the US/Mexico border at Nogales to Mexico City without any guidance from a human hand. While the lengthy road trip took place mostly on highways, the AutoNOMOS car also had to contend with potholes and city streets before safely pulling into Mexico City to complete the longest trip ever completed by an autonomous vehicle in Latin America.
> 
> . . . .


----------



## Christopher Gould

http://www.popsci.com/who-will-driverless-cars-decide-to-kill?src=SOC&dom=fb


----------



## yosoyellobo

Christopher Gould said:


> http://www.popsci.com/who-will-driverless-cars-decide-to-kill?src=SOC&dom=fb


The answer is easy: you sacrifice the few for the possible 90% who might be saved by the use of driverless cars.


----------



## James Long

yosoyellobo said:


> The answer is easy: you sacrifice the few for the possible 90% who might be saved by the use of driverless cars.


Sacrifice the older people. They are closer to death anyways. 
Or sacrifice the non-autonomous car owners.

It is a good question ... and it has to be answered in a split second running through all the options. Good luck car (and everyone around it).
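"Running through all the options" can at least be sketched in code: one common framing is for a planner to score each candidate maneuver by its expected harm and pick the minimum. Here is a toy Python illustration of that idea; the maneuvers, probabilities, and severity weights are all invented for the example and do not come from any real vehicle's software:

```python
# Toy expected-harm minimizer for an emergency maneuver choice.
# All numbers below are made up purely for illustration.

def expected_harm(option):
    """Expected harm = sum over outcomes of P(outcome) * severity."""
    return sum(p * severity for p, severity in option["outcomes"])

def choose_maneuver(options):
    """Pick the candidate maneuver with the lowest expected harm."""
    return min(options, key=expected_harm)

options = [
    {"name": "brake hard",   "outcomes": [(0.30, 10), (0.70, 0)]},  # harm 3.0
    {"name": "swerve left",  "outcomes": [(0.10, 50), (0.90, 0)]},  # harm 5.0
    {"name": "swerve right", "outcomes": [(0.05, 80), (0.95, 0)]},  # harm 4.0
]

best = choose_maneuver(options)
print(best["name"])  # -> brake hard
```

The hard part, of course, is not the `min()` at the end; it is assigning those probabilities and severities honestly, in real time, for people and property the car cannot fully see.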


----------



## phrelin

Ah, well, apparently the State of California is taking a run at the problem of regulating these cars according to *Google, Tesla, others wait for DMV's self-driving rules*, but the rule-makers are not literally considering these "self-driving" or "autonomous." From the article:



> That leaves some in the industry worried about "The Handoff," that moment -- perhaps just a split-second -- when humans must take control back from the machine to avert disaster. Today, drivers of cars equipped with semi-autonomous tools such as automatic braking, adaptive cruise control and sensors that help keep the car in its lane are supposed to be monitoring and supervising whatever the car can do on its own. But as the cars get smarter and able to navigate themselves, the humans in the driver's seat will increasingly grow comfortable checking text messages, scanning a newspaper and opening up the makeup kit.
> 
> "Humans are really bad at evaluating low-probability events," said Steven Waslander, an engineering professor at the University of Waterloo in Canada. "You've been driving the same way to and from work every month and then there's one moment when suddenly you have to be paying attention."
> 
> This is not just a theoretical problem anymore, now that Tesla Motors is allowing drivers to switch on its "Autopilot Mode" that includes adaptive cruise control and letting the car change lanes by itself after the human driver turns on a signal.


In other words, if you own one of these and something bad happens you don't get to say "The car did it."


----------



## Drucifer

phrelin said:


> Ah, well, apparently the State of California is taking a run at the problem of regulating these cars according to *Google, Tesla, others wait for DMV's self-driving rules* but the rule-makers are not literally considering these "self-driving" nor "autonomous." From the article:
> 
> In other words, if you own one of these and something bad happens *you don't get to say "The car did it."*


All vehicles in the near future will have black boxes.

I would want vehicle computers to be able to override drivers when the computer spots speeding, tailgating, or frequent lane changes, and of course weaving all over the road. Maybe even notifying the police.


----------



## James Long

Drucifer said:


> All vehicles in the near future will have black boxes.


All new vehicles. It will take a while until old vehicles are retrofitted (although some vehicles have sensors that could be considered rudimentary black boxes - or at least tattletales).



Drucifer said:


> I would want vehicle computers to be able to override drivers when the computer spots speeding, tailgating, or frequent lane changes, and of course weaving all over the road. Maybe even notifying the police.


The car was drunk, officer. I filled up with e85 instead of e15. That is why it was weaving.


----------



## phrelin

Drucifer said:


> I would want vehicle computers to be able to override drivers when the computer spots speeding, tailgating, or frequent lane changes, and of course weaving all over the road. Maybe even notifying the police.


So the computer would prevent you from hurrying to the hospital, getting away from the bad guys shooting at you, or otherwise making a choice that would be in conflict with the law? But the bad guys' car would not prevent them from shooting their guns because it would conflict with the Second Amendment? Or does the car become an autonomous new deputy law enforcement officer in which laws made by your favorite all-wise and wonderful legislature will be absolute? Or does the car interpret laws that are somewhat vague or confusing?

Frankly, I don't buy this whole "things are going to become really safe" stuff. Sure we didn't have so many "distracted" drivers before texting and multitasking became buzz words. Maybe the new car will help with the new unnecessary but addictive distractions. But I see the cars as being able to prevent accidents in routine commute driving situations until the 1977 Malibu driven by a drunk driver plows into the line of barely-stopped-in-time self-driving vehicles.

Let's stop selling these things that don't really exist yet like Steve Jobs sold iPads. iPads are not inherently dangerous if some App fails. A self-driving Tesla with a flawed steering algorithm is. There is a much bigger picture here than the "device" - to make a pun the device possibilities shouldn't _drive_ the decision-making or habit formations.


----------



## James Long

phrelin said:


> Frankly, I don't buy this whole "things are going to become really safe" stuff.


That "pie in the sky" will not be baked until all vehicles are self driving. Until then proponents will continue to declare self driving cars safe and blame all problems on non-automated driving. As long as they can blame someone else, everything is awesome.


----------



## yosoyellobo

James Long said:


> Sacrifice the older people. They are closer to death anyways. Or sacrifice the non-autonomous car owners. It is a good question ... and it has to be answered in a split second running through all the options. Good luck car (and everyone around it).


A split second is an eternity for the computer running these cars. If they had to choose which person to save, I suspect they would flip a coin, which I believe is what God would do.


----------



## phrelin

yosoyellobo said:


> A split second is an eternity for the computer running these cars. If they had to choose which person to save, I suspect they would flip a coin, which I believe is what God would do.


Or the car will run various kinds of instant scans and select based on a list "they" prepared.


----------



## Tom Robertson

phrelin said:


> So the computer would prevent you from hurrying to the hospital, getting away from the bad guys shooting at you, or otherwise making a choice that would be in conflict with the law? But the bad guys' car would not prevent them from shooting their guns because it would conflict with the Second Amendment? Or does the car become an autonomous new deputy law enforcement officer in which laws made by your favorite all-wise and wonderful legislature will be absolute? Or does the car interpret laws that are somewhat vague or confusing?
> 
> Frankly, I don't buy this whole "things are going to become really safe" stuff. Sure we didn't have so many "distracted" drivers before texting and multitasking became buzz words. Maybe the new car will help with the new unnecessary but addictive distractions. But I see the cars as being able to prevent accidents in routine commute driving situations until the 1977 Malibu driven by a drunk driver plows into the line of barely-stopped-in-time self-driving vehicles.
> 
> Let's stop selling these things that don't really exist yet like Steve Jobs sold iPads. iPads are not inherently dangerous if some App fails. A self-driving Tesla with a flawed steering algorithm is. There is a much bigger picture here than the "device" - to make a pun the device possibilities shouldn't _drive_ the decision-making or habit formations.





James Long said:


> That "pie in the sky" will not be baked until all vehicles are self-driving. Until then proponents will continue to declare self-driving cars safe and blame all problems on non-automated driving. As long as they can blame someone else, everything is awesome.





phrelin said:


> Or the car will run various kinds of instant scans and select based on a list "they" prepared.


Guys, you really, really should see the TED talk by Chris Urmson, especially starting at this point: 




I highly recommend the whole talk, or at least starting at the 9-minute mark. But the very best part is the link above. It breaks down how a Google car "sees" a bicycle before humans can, anticipates its path, and safely negotiates the situation where the adjacent cars (which block a human's view) don't correctly anticipate the bike.

This is not "pie in the sky", this is real. This is not a list, so much as a continuous update in near-real-time of anticipation. Every self-driving car, built to this level of testing and development, will potentially save lives without requiring all the other cars being smart. Since it can avoid stupid drivers, it adds to the safety already.

No, it can't stop all accidents. There is no reason to set the bar that impossibly high; we don't meet it now.

Peace,
Tom


----------



## phrelin

Tom Robertson said:


> Guys, you really, really should see the TED talk by Chris Urmson, especially starting at this point:
> 
> 
> 
> 
> I highly recommend the whole talk, or at least starting at the 9-minute mark. But the very best part is the link above. It breaks down how a Google car "sees" a bicycle before humans can, anticipates its path, and safely negotiates the situation where the adjacent cars (which block a human's view) don't correctly anticipate the bike.
> 
> This is not "pie in the sky", this is real. This is not a list, so much as a continuous update in near-real-time of anticipation. Every self-driving car, built to this level of testing and development, will potentially save lives without requiring all the other cars being smart. Since it can avoid stupid drivers, it adds to the safety already.
> 
> No, it can't stop all accidents. There is no reason to set the bar that impossibly high; we don't meet it now.
> 
> Peace,
> Tom


I'm really serious when I said I can see the benefits of this technology for my wife and me - we're both over 70 and a car that can do what you are describing could help seniors like us who live in rural areas where there is no real transit. It would be a dream for people like us. With that said....

Google doesn't manufacture or sell cars. Sure, I guess they could tool up and start manufacturing, or buy Volkswagen or something. But then they would have to learn to operate a business outside the Silicon Valley bubble, off the web.

Tesla released a beta version of its software and near-disasters are being averted daily because of flaws. So far it appears every Tesla owner who downloads it knows what "beta" means and wasn't confused by the very misleading journalist "test drive, write glowing story" opportunity.

Volvo's head honcho wants to get government out of his way because he has a guy in r&d who knows BASIC.

I love the potential. But I have yet to see any company that can responsibly implement it for a mass market. That's my focus. I've followed Google's efforts every step of the way. If their researchers were in a university and were planning to release their stuff with a $10 per car license, I'd be as enthusiastic. I still haven't heard their mass market plans. But I'll check your link.


----------



## Tom Robertson

phrelin said:


> I'm really serious when I said I can see the benefits of this technology for my wife and me - we're both over 70 and a car that can do what you are describing could help seniors like us who live in rural areas where there is no real transit. It would be a dream for people like us. With that said....
> 
> Google doesn't manufacture or sell cars. Sure, I guess they could tool up and start manufacturing, or buy Volkswagen or something. But then they would have to learn to operate a business outside the Silicon Valley bubble, off the web.
> 
> Tesla released a beta version of its software and near-disasters are being averted daily because of flaws. So far it appears every Tesla owner who downloads it knows what "beta" means and wasn't confused by the very misleading journalist "test drive, write glowing story" opportunity.
> 
> Volvo's head honcho wants to get government out of his way because he has a guy in r&d who knows BASIC.
> 
> I love the potential. But I have yet to see any company that can responsibly implement it for a mass market. That's my focus. I've followed Google's efforts every step of the way. If their researchers were in a university and were planning to release their stuff with a $10 per car license, I'd be as enthusiastic. I still haven't heard their mass market plans. But I'll check your link.


Well said.

My hope and thoughts are Google will find a way to license the technology they are developing. The individuals are very invested in getting the tech out there and the company has been investing in this for a long time. They either have a plan for it or a plan to develop the plan. I think some of the test track "marketing event" with people from Mountain View and Austin was as much about learning what people want in a car as it was to let them know they will see them on the roads.

I suspect Google could do the same thing with this technology as they did with Android--manufacture their own brand of cars and license to other manufacturers. One of their videos shows how they learned from building their own test cars.

Peace,
Tom


----------



## James Long

Tom Robertson said:


> This is not "pie in the sky", this is real.


Tesla is real:


phrelin said:


> Tesla released a beta version of its software and near-disasters are being averted daily because of flaws.


"Pie in the sky" refers to the proponents who refuse to listen to any rebuke. "All problems will be solved." That is a prediction, not a reality.

If you were to unload a self driving car from a trailer in my driveway and tell it to get me to work how well would it do? Would the car be able to handle roads it has not seen? Are you assuming that some other car has already seen those roads? Would it be limited to roads other cars have seen? Would it require an operator to take over on unknown roads? Or is full mobility on EVERY road something that will come "in the future" and is not part of today's reality?


----------



## Tom Robertson

James Long said:


> Tesla is real:
> 
> "Pie in the sky" refers to the proponents who refuse to listen to any rebuke. "All problems will be solved." That is a prediction, not a reality.
> 
> If you were to unload a self driving car from a trailer in my driveway and tell it to get me to work how well would it do? Would the car be able to handle roads it has not seen? Are you assuming that some other car has already seen those roads? Would it be limited to roads other cars have seen? Would it require an operator to take over on unknown roads? Or is full mobility on EVERY road something that will come "in the future" and is not part of today's reality?


"Pie in the Sky" also refers to derogatory predictions that problems can't be solved. Or can't be solved for quite a long time.

Since many of the "problems" that have been listed here have already been solved, evidenced by the videos, it seems fair for me to show why I'm certain the problems have been solved.

And now new "problems" are queried that are also answered in the videos. 
1) Your home to your work? Not really sure. While I know the general area of the country, I don't know specifics; you could live in a fairly rural area next to the city. Generally speaking, if Google has mapped the roads along the path and they are paved, it will get you there. 
2) Handle roads it hasn't seen? Yes. It can handle situations and roads it hasn't seen. Albeit roads that Google has seen.
3) I'm assuming the roads have been seen by the mapping software google uses, namely their own. They don't need to have been seen by another google self-driving car.
4) Limited to roads other cars have seen? No. But does need to be mapped, so in a sense Google has seen them.
5) Driver take over on unknown roads? No. But it won't select such a road as it isn't in the map. Were it directed to such a road, it could drive on the road recognizing the key road properties like speed, hazards, lanes, etc. 
6) EVERY road, path, gully, gravelway, private drive, local shortcut? No. Don't be silly. Humans don't know about all of them either. 

Will there be roads that aren't on the maps? Heck yes. Maps are always behind the construction. But that is a navigation issue, not a driving issue. The human driver might have to direct the car much like a human might have to direct a taxi to a brand new location: describing the turns, letting the car drive as directed.
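The distinction between navigation (which needs the map) and driving (which needs only what the sensors see) can be sketched in a few lines of hypothetical Python. Everything here--the function names, the data shapes, the "directions" fallback--is illustrative only, not anything from Google's actual software:

```python
# Hypothetical sketch: navigation vs. driving as two separate layers.
# Illustrative names and structure only -- not Google's actual design.

def plan_route(destination, road_map):
    """Navigation layer: needs the map. Returns turn-by-turn directions,
    or None when the destination isn't on any mapped road."""
    if destination in road_map:
        return road_map[destination]  # precomputed directions for the demo
    return None  # unmapped: a human must supply directions

def drive(directions):
    """Driving layer: follows whatever directions it is given using live
    perception (lanes, signs, hazards) -- no map required."""
    return [f"following: {step}" for step in directions]

road_map = {"office": ["left on Main", "right on 3rd"]}

# Mapped destination: the car navigates itself.
route = plan_route("office", road_map)

# Unmapped destination: the occupant directs, the car still drives --
# like describing the turns to a taxi driver at a brand-new address.
if plan_route("new cabin", road_map) is None:
    route = ["straight 2 miles", "left at the gravel fork"]

print(drive(route))
```

The point of the sketch: a missing map entry only breaks `plan_route`, never `drive`--which is exactly the "navigation issue, not a driving issue" split described above.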

To my knowledge, there is one class of problems to be solved--gravel and snow-covered roads that obliterate the road markers. Yes, there are still many tests to run--to verify and to perhaps tweak the algorithms. Yet the present reality in Mountain View and Austin is amazing. The videos show how the cars already handle things that aren't in the driver's manual--like duck crossings. Or little (child) cars.  Or human drivers who turn illegally in front of the self-driving car.

Watch the TED talk. It is quite enlightening as to the real state of the art. And shows how the algorithms recognize and categorize things that aren't specifically "known".

Peace,
Tom


----------



## James Long

Tom Robertson said:


> 6) EVERY road, path, gully, gravelway, private drive, local shortcut? No. Don't be silly. Humans don't know about all of them either.


The non silly part is that humans can handle such things. We are built for adaptability. Auto driving cars are not there yet. Today's reality.


----------



## Tom Robertson

James Long said:


> The non silly part is that humans can handle such things. We are built for adaptability. Auto driving cars are not there yet. Today's reality.


Humans can't drive on roads they don't know about any more than self-driving cars can. Reality for a very long time.

The silly part is manufacturing a context that is not a problem of the self-driving part. Only the navigation part. Which the human will have to know for any travel method: driving him or herself, taxi, shuttle, rickshaw, or self-driving car. Once given the directions to the unknown, unmapped road, the self-driving car can adapt. By looking at the same things a human would--and more.

Peace,
Tom


----------



## James Long

Tom Robertson said:


> Humans can't drive on roads they don't know about any more than self-driving cars can.


I am trying to think of a nice way not to call that statement complete and utter garbage.

Every vacation that I take I end up on roads I have never driven before. Sometimes GPS is aware of the road, sometimes not. I have a tendency to ignore my GPS and follow what I see outside of my windshield. If your statement was true I would not be able to drive on many roads that I have, as a human being, traveled.

I have been there ... and back again. As a human driver it is well within my skillset.

(I mention vacation because my normal driving is on well worn paths that I have been on before. But there are also weekend trips where I end up on a road not yet traveled. I go where Google streetview has not gone before.)









Sometimes a man just has to take the wheel and drive.


----------



## Tom Robertson

James Long said:


> I am trying to think of a nice way not to call that statement complete and utter garbage.
> 
> Every vacation that I take I end up on roads I have never driven before. Sometimes GPS is aware of the road, sometimes not. I have a tendency to ignore my GPS and follow what I see outside of my windshield. If your statement was true I would not be able to drive on many roads that I have, as a human being, traveled.
> 
> I have been there ... and back again. As a human driver it is well within my skillset.
> 
> (I mention vacation because my normal driving is on well worn paths that I have been on before. But there are also weekend trips where I end up on a road not yet traveled. I go where Google streetview has not gone before.)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sometimes a man just has to take the wheel and drive.


If it is a road and you can see it, then you know about it. 

If you know about it and it is a road, then the self-driving car can drive it--possibly with two caveats--gravel and snow that obscure all lane and road markers. They may have solved the gravel problem or they might still be working on it.

I am not saying Google is to the point where every driver will be satisfied to let the car drive all the time. You describe exploring, potentially on paths that aren't roads. And that you still want to drive the car--at least some of the time. Cool! If you are a multiple-car family, you might purchase one self-driving car and keep one that is human-driven. Or purchase a dual-mode car to drive yourself when you explore.

I am saying that most of the problems being raised: adaptability; reactions to unseen roads, conditions, events; incredibly rare and manufactured ethics situations are either already solved or unreasonable in that humans don't run into those situations either. (And are no better prepared.)

Watch the video. See for yourself how the cars adapt today. See how they can identify things they haven't seen before as something similar to what they have seen before. And how they see earlier than humans do, so can anticipate long before a human could.

Peace,
Tom


----------



## Tom Robertson

There are still places horses can go that cars can't. People who go to those places generally still have a car, plus own a horse or rent one when needed. 

There are places for bicycles, rickshaws, helicopters, and airplanes.

The Google cars are already capable of handling a large portion of the populace. That is all it takes to be a potentially viable product that is much safer than what we have today. The rest of the product viability is packaging, pricing, and the legals. The packaging probably has the longest lead time in the Google cars, unless they have already arranged manufacturing. Pricing will be interesting when including all the sensors and the liability (presuming for the moment.)

Legals could be coming soon. The original California requirement was for regulation by January 1, 2015. I haven't found a new timeline, as the manufacturers and regulators are still trying to figure out what needs to be and how to certify.

Peace,
Tom


----------



## James Long

Tom Robertson said:


> If it is a road and you can see it, then you know about it.


Not the point ... pretty sure you know it wasn't the point. I guess you just want to fight.

The point is that the first time a human driver sees a particular road they can handle it better than the first time an auto-drive car sees that road. Chances are the auto-drive car won't consider it a road and will refuse to drive on it. Take another route or human take the wheel.

The state of the art in auto-drive technology is not a finished masterpiece. It is a work in progress.


----------



## Drucifer

phrelin said:


> So the computer would prevent you from hurrying to the hospital, getting away from the bad guys shooting at you, or otherwise making a choice that would be in conflict with the law? [SNIP]


Well, most injured people should not be moved.

Just where the hell do you live that people shoot at you while you drive?


----------



## Stewart Vernon

Wonder in this hypothetical imaginary future... what happens to motorcycles? I can't see any such thing as a self-driving motorcycle for the primary reason that the whole point of riding a motorcycle is to drive it. There'd be ZERO fun in riding a self-driving one... so either you'd have to outlaw motorcycles OR you will always have the "unpredictable" human driver being part of the road.


----------



## Tom Robertson

James Long said:


> Not the point ... pretty sure you know it wasn't the point. I guess you just want to fight.
> 
> The point is that the first time a human driver sees a particular road they can handle it better than the first time an auto-drive car sees that road. Chances are the auto-drive car won't consider it a road and will refuse to drive on it. Take another route or human take the wheel.
> 
> The state of the art in auto-drive technology is not a finished masterpiece. It is a work in progress.


You seemingly continue to presume a self-driving car can't read road signs, can't identify road surfaces, can't adapt to things not exactly previously seen, and can't be better than a human.

Why?

Watch the video.

My point is your questions seemed to focus on two things: navigation and adaptability. My answers were to forget navigation--if you can see it, so can the car. And that the cars are already adapting to situations.

My other point is yes there still are and will be places and times you'll want to take your horse and buggy, not a self-driving car. That doesn't mean you will exclusively use the horse.

Peace,
Tom


----------



## Tom Robertson

Stewart Vernon said:


> Wonder in this hypothetical imaginary future... what happens to motorcycles? I can't see any such thing as a self-driving motorcycle for the primary reason that the whole point of riding a motorcycle is to drive it. There'd be ZERO fun in riding a self-driving one... so either you'd have to outlaw motorcycles OR you will always have the "unpredictable" human driver being part of the road.


You'll always have the unpredictable. Not only the unpredictable human, all kinds of unpredictable. So motorcycles won't be anything terribly unique. (Though insurance could become relatively expensive as the pool shrinks.)

Peace,
Tom


----------



## Stewart Vernon

I was actually having a bit of this conversation when I went to pay my car insurance the other day... they seemed to be in the same boat as I am... wondering how the liability thing will go. They seemed to understand what I said about how if I can't drive it, why should I be liable for accidents... same as riding in a taxi or on a bus... beyond that, I still put my driving record up against these computer cars and I come out way ahead of them to date.


----------



## James Long

Tom Robertson said:


> My answers were to forget navigation--if you can see it, so can the car.


It seems the self-driving car proponents want us to forget every potential problem. Sunshine and roses; all problems should be blamed on non-self-driving cars or the road, or minimized.

How is that self-driving Tesla doing? Who is responsible when one of those crashes in auto-drive mode?

We have talked about a lot in this thread ... one thing seems sure. There are a lot of issues for the auto-drive industry to address. We are not there yet.

According to the US DOT the average age of cars on American roads is 11.4 years. It takes time to replace cars with new models. Perhaps in five years half of the cars on American roads will be replaced ... with a low number of self-driving cars being sold, penetration will take a lot longer than five years. We are years away from a noticeable presence let alone the market domination mentioned earlier in this thread.
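The turnover arithmetic can be made concrete with a back-of-the-envelope sketch. Every number below (fleet size, annual sales, the sales-share ramp) is an assumption for illustration, not data:

```python
# Back-of-the-envelope fleet-turnover sketch (all numbers assumed).
# Even a steadily growing self-driving share of NEW sales translates
# to road presence very slowly, because the existing fleet is huge.

fleet = 250_000_000          # assumed U.S. fleet size
annual_sales = 17_000_000    # assumed new-car sales per year

self_driving = 0.0
for year in range(1, 11):
    share = 0.01 * year                            # hypothetical: share of new sales grows 1%/year
    retired = self_driving / fleet * annual_sales  # scrapped in proportion to fleet presence
    self_driving += annual_sales * share - retired

# With these assumptions, roughly 3% of the fleet after a decade.
print(f"after 10 years: {self_driving / fleet:.1%} of fleet")
```

So even a decade of growing sales leaves self-driving cars a small minority on the road, which is the point about the 11.4-year average age.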

The pie in the sky is still being baked.


----------



## phrelin

Drucifer said:


> Well, most injured people should not be moved.
> 
> Just where the hell do you live that people shoot at you while you drive?


Almost anywhere in the United States. The most well-known recent incident is reflected in the headline "Hundreds say goodbye to 4-year-old killed in road rage shooting." That was in New Mexico, but if you do a Google news search for "road rage shooting" right now, you will come up with many from all over the country, including:

"Reward offered in fatal SW Detroit road rage shooting," which tells us: "After a brief argument, the driver of the Cadillac followed behind Gergel and shot into his vehicle, fatally wounding him at about 2 p.m." Another article noted, "The episode of highway rage transpired as quickly as two vehicles can cover two miles of highway."
"Man Arrested For NRH Road Rage Shooting," from the Dallas-Ft. Worth CBS station web site, which says: "According to police, Lambert explained that a dark-colored sedan pulled up close to the rear of his truck. The 30-year-old man tapped on his brakes to discourage the other driver from following so closely. But that driver then moved over to Lambert's left side. Lambert gave the other driver a hand signal, and the other driver returned his frustration with a gunshot into Lambert's truck. The sedan then left the scene."
As I understand it, there's no way in my law-abiding autonomous vehicle to get away from someone shooting at me. Or maybe they'll program that surprisingly frequent highway event into the vehicle as an _escape-from-danger-as-fast-as-the-car-can-go-laws-notwithstanding_ algorithm.

My somewhat silly point being that I doubt Google has programmed such an event, and many other surprisingly frequent events, into their car. If they are going to have a literally driverless vehicle, they must effectively have created AI that can compete favorably with a human brain.


----------



## James Long

Stewart Vernon said:


> They seemed to understand what I said about how if I can't drive it, why should I be liable for accidents... same as riding in a taxi or on a bus... beyond that, I still put my driving record up against these computer cars and I come out way ahead of them to date.


My insurance covers my vehicles and listed drivers. If I own a vehicle I pay for the insurance. If I own a vehicle I must register it and prove that I have a minimal level of insurance on that vehicle (public liability and property damage, not comprehensive). It does not matter if the owner drives the car or not ... the owner must insure the vehicle in case the vehicle hurts someone or damages property. (Your state may vary.)

I have not asked my insurance company if I can list my car as a driver on my policy. If I do, I'd still be paying for the insurance.

If you do not own the car you don't have to pay the car insurance. (I do not own a taxi, bus or Uber - despite the drunk people in Milwaukee who tried to get in my car on a street corner thinking I was their Uber.) If my rental car comes with a driver then they are responsible for their insurance (although I can buy supplemental insurance in case someone sues me if my driver causes an accident). I do not expect auto-driving car manufacturers to insure cars they do not own.


----------



## Tom Robertson

James Long said:


> It seems the self-driving car proponents want us to forget every potential problem. Sunshine and roses; all problems should be blamed on non-self-driving cars or the road, or minimized.
> 
> How is that self-driving Tesla doing? Who is responsible when one of those crashes in auto-drive mode?
> 
> We have talked about a lot in this thread ... one thing seems sure. There are a lot of issues for the auto-drive industry to address. We are not there yet.
> 
> According to the US DOT the average age of cars on American roads is 11.4 years. It takes time to replace cars with new models. Perhaps in five years half of the cars on American roads will be replaced ... with a low number of self-driving cars being sold, penetration will take a lot longer than five years. We are years away from a noticeable presence let alone the market domination mentioned earlier in this thread.
> 
> The pie in the sky is still being baked.


Nah, the proponents do not, for an instant, say there are no problems. Some proponents have regularly and consistently brought up a problem near and dear to my heart--snow. (I am shifting, very slightly, on gravel. They might have solved that, they might not. I haven't seen an update in several months.)

Yet the proponents tire of the horse-and-buggy lovers' repeated attempts to resurrect problems that were once real, yet have been solved, at least by Google. And their further attempts to manufacture false issues of ethics that happen so rarely in the real world that humans don't know what they would do--only what they would prefer to do--if they had even the slightest chance to recognize the issue in the first place.

Don't worry opponents to self-driving cars. No one is taking away your horses. You'll get to keep them and still be able to use them.

Lastly, the proponents aren't expecting yearly sales penetration of 50% in the rest of this decade. Various reports put that at 10 to 15 years out--mostly because of cost. Though the horse group can probably take some solace from Dr. Steven Shladover's estimate of 2075. 

Peace,
Tom


----------



## James Long

Tom Robertson said:


> Yet the proponents tire of the horse-and-buggy lovers' repeated attempts to resurrect problems that were once real, yet have been solved, at least by Google.


The derogatory comments toward "opponents" of self-driving cars do not help the argument.
The issue isn't self driving cars vs horse and buggies. So please, cut the "horse and buggy" crap.

The "opponents" are not asking self-driving cars to fly. They only want them to do what the proponents claim they can do. Read the early part of this thread and you'll see claims of "autonomous" cars that are 100% self driving. You will see claims that "autonomous" cars will take over the roadways in just a few years. Claims that I am happy to see walked back to reality. There are limits on the technology.


----------



## Tom Robertson

James Long said:


> The derogatory comments toward "opponents" of self-driving cars do not help the argument.
> The issue isn't self driving cars vs horse and buggies. So please, cut the "horse and buggy" crap.
> 
> The "opponents" are not asking self-driving cars to fly. They only want them to do what the proponents claim they can do. Read the early part of this thread and you'll see claims of "autonomous" cars that are 100% self driving. You will see claims that "autonomous" cars will take over the roadways in just a few years. Claims that I am happy to see walked back to reality. There are limits on the technology.


To clarify, derogatory comments toward anyone, proponent or opponent, do not help.

Ok, some of the "horse and buggy" comments were derogatory. I apologize.

And some were expository. There are people who do keep horses to enjoy where cars can't go. Yet they also own cars. Similarly there will be people who keep non-self-driving cars for off-road use. While they own one or more self-driving cars. Or dual mode cars. Well after self-driving cars are the norm.

Self driving cars do not have to be 100% self-driving, according to the impossible metrics of perfection you've set, and yet still can be very useful and far safer than human driven cars. Cars have never been 100% of anything: safe, flawless, all terrain. They only have to be better than the alternatives at the time. Google seems to be almost there. GM says it will be there next year--on one model. Volvo is saying they have a truck available now. Tesla is not saying they have one now.

As to your other point, who is saying they "will take over the roadways in just a few years"? Many of your comments have been directed at me or my comments, yet I've never said anything to suggest they will be the majority of cars in a few years. In fact, I've explicitly said what I've read about that--ranging from 10 to 60 years. I suspect 10 years is a bit early for them to be the norm on the roads, yet by then they might be the norm among cars being sold in 2025. Yet still not exclusively. Maybe in 2075 it is very difficult to buy a new non-self-driving car. Possible--with many permits or something. I don't know.

Peace,
Tom


----------



## yosoyellobo

James Long said:


> My insurance covers my vehicles and listed drivers. If I own a vehicle I pay for the insurance. If I own a vehicle I must register it and prove that I have a minimal level of insurance on that vehicle (public liability and property damage, not comprehensive). It does not matter if the owner drives the car or not ... the owner must insure the vehicle in case the vehicle hurts someone or damages property. (Your state may vary.)
> 
> I have not asked my insurance company if I can list my car as a driver on my policy. If I do, I'd still be paying for the insurance.
> 
> If you do not own the car you don't have to pay the car insurance. (I do not own a taxi, bus or Uber - despite the drunk people in Milwaukee who tried to get in my car on a street corner thinking I was their Uber.) If my rental car comes with a driver then they are responsible for their insurance (although I can buy supplemental insurance in case someone sues me if my driver causes an accident). I do not expect auto-driving car manufacturers to insure cars they do not own.


If I walk into a dealership and, after kicking the tires a while, the salesman asks me if I have any questions, my one question would be who would be paying for the insurance.


----------



## Tom Robertson

yosoyellobo said:


> If I walk into a dealership and, after kicking the tires a while, the salesman asks me if I have any questions, my one question would be who would be paying for the insurance.


And what exactly does it cover? Liability only when self-driving? Liability for any driving? What are the weasel clauses?

From what I've been reading, I expect we'll see all kinds of things possible as the insurance companies and manufacturers try to package this for sale. Bold manufacturers are already saying they will cover the liability (though I'd still want to see the details.)

Right now the insurance is mostly based on the value and performance of the car as well as the drivers. Will the insurance company add in metrics based on the manufacturer's testing of the self-driving?

Peace,
Tom


----------



## Drucifer

phrelin said:


> Almost anywhere in the United States. The most well-known recent incident is reflected in this headline Hundreds say goodbye to 4-year-old killed in road rage shooting.That was in New Mexico but if right now you do a Google news search for road rage shooting, for example, you would come up with many all over the country including:
> 
> Reward offered in fatal SW Detroit road rage shooting which tells us:"After a brief argument, the driver of the Cadillac followed behind Gergel and shot into his vehicle, fatally wounding him at about 2 p.m." Another article noted "The episode of highway rage transpired as quickly as two vehicles can cover two miles of highway."
> Man Arrested For NRH Road Rage Shooting from the Dallas Ft. Worth CBS station web site which says: "According to police, Lambert explained that a dark-colored sedan pulled up close to the rear of his truck. The 30-year-old man tapped on his brakes to discourage the other driver from following so closely. But that driver then moved over to Lambert's left side. Lambert gave the other driver a hand signal, and the other driver returned his frustration with a gunshot into Lambert's truck. The sedan then left from the scene."
> As I understand it, there's no way in my law abiding autonomous vehicle to get away from someone shooting at me. Or maybe they'll program that surprisingly frequent highway event into the vehicle as an _escape-from-danger-as-fast-as-the-car-can-go-laws-notwithstanding_ algorithm.
> 
> My somewhat silly point being that I doubt Google has programmed such an event and many other surprisingly frequent events into their car. If they are going to literally have a driverless vehicle they must have effectively created AI that can favorably compete with a human brain.


I doubt your computer driven vehicle would cause road rage in another driver.


----------



## yosoyellobo

Drucifer said:


> I doubt your computer driven vehicle would cause road rage in another driver.


Why not? If anything, it could cause even worse road rage. People subject to road rage don't seem to need an excuse.


----------



## yosoyellobo

Nissan might be entering the game.
http://nycity.today/content/286899-nissan-launches-self-driving-nissan-leaf-prototype-japan


----------



## Tom Robertson

yosoyellobo said:


> Nissan might be entering the game.
> http://nycity.today/content/286899-nissan-launches-self-driving-nissan-leaf-prototype-japan


Thanks, yosoyellobo.

According to their timeline, single lane highway end of 2016 (no lane shifts), multi-lane highway by 2018, and full self-drive by 2020.

Peace,
Tom


----------



## Tom Robertson

Google's latest report is posted: No accidents, more miles (duh), and a nice description of why they decided to skip semi-autonomous features straight to full self-drive. Basically it takes too long for humans to regain conscious control.

As always, they also have a nice list of articles and resources to look at.

Personally, I couldn't see anything but full self-drive or very specific driving assists. Cruise control is very nice and safety braking is helpful--yet both still require the human driver. Once you get beyond requiring an awake driver, the car really needs to go all in for most things.

Self-Valet in a parking lot? Ok.
Parallel parking assist? Ok.
Highway single lane driving? Only if it can be done with a sleeping human.
Highway multi-lane driving? Same thing. Humans will do stuff if the car is in control. They will be on the phone, they will reach in back for the laptop/phone charger (really happened), they will do other things. Heck--they already do other things as the active driver.

While there are ways to verify the human is "watching", I can't see spending the money for the car to then require me to still be the safety driver. I want to let go.

Peace,
Tom


----------



## inkahauts

Yes, but I'd like to still be able to override if I wanted to.


----------



## Tom Robertson

inkahauts said:


> Yes, but I'd like to still be able to override if I wanted to.


That's still optional, at the discretion of the manufacturer. But the concept of a semi-self-driving car, where the driver must remain aware isn't going to work--drivers get too reliant and stop watching.

Yet dual mode, where either the car or the human can be the driver shouldn't be a problem. In fact, many people probably would prefer a dual mode at first. 

And until the weather-related problems are solved, there may be no choice but dual mode. Though "the handoff" is always going to be a problem. Who would the car hand off to if the only passenger is blind?

Peace,
Tom


----------



## Stewart Vernon

Bottom line for me and insurance... to beat that horse fully into submission... I have liability insurance today that covers other authorized drivers of my car... but if someone steals my car, and I report it stolen, I'm not liable for the accident that the thief might get into. Other drivers of my car are only covered by my liability IF they are of legal driving age, have a legal license to drive, and have permission from me to drive the car.

IF an automated car does not allow me to drive... then I'm never the one who decides who drives... and in that case, I'm not going to pay for liability insurance since I would in no way consider myself liable for decisions completely taken out of my hands. I would simply not own such a car. IF they tried to outlaw human drivers of any car, to force the issue, then I would just stop owning cars at that point.

Any company or directive that claims computer-driven cars are safer than human-driven cars because humans make mistakes when given control that computers do not will by default be arguing that humans cannot be trusted to drive over the computer, which also by default seems to indicate the human can't be held liable. How could you legally hold the human responsible for an accident that you argued the computer should be in control of to prevent human mistakes? Legally, it pretty much writes itself that the manufacturer would have to be the one liable for any accidents of a computer-driven car.


----------



## James Long

Stewart Vernon said:


> Other drivers of my car are only covered by my liability IF they are of legal driving age, have a legal license to drive, and have permission from me to drive the car.


One could argue that by owning an "auto driving only" car you are granting the car permission to drive. Similar to software licenses that you agree to by using the software, or credit card terms that change, or service contracts such as for satellite TV and cell phones. If you buy a car without a "manual" switch you assume the risk.



Stewart Vernon said:


> I would in no way consider myself liable for decisions completely taken out of my hands. I would simply not own such a car.


You are not alone. I expect that "fully automatic" self driving cars will be mainly used for cabs. Perhaps town cars or livery, although the owners still have to pay the driver to open the door for the high dollar passengers - might as well let them drive as needed.



Stewart Vernon said:


> Any company or directive that claims computer-driven cars are safer than human-driven cars because humans make mistakes when given control that computers do not will by default be arguing that humans cannot be trusted to drive over the computer, which also by default seems to indicate the human can't be held liable. How could you legally hold the human responsible for an accident that you argued the computer should be in control of to prevent human mistakes? Legally, it pretty much writes itself that the manufacturer would have to be the one liable for any accidents of a computer-driven car.


Have different people make each argument. Have the sales staff argue that the cars are super safe and the legal staff argue that the owner remains responsible.

If the car company assumes liability for the driving they will likely limit their liability to only when the car is driving. I still expect anyone who is injured to sue everyone involved: car company, passengers, owners. It is the American way.


----------



## inkahauts

I have a feeling this will work out one way and one way only... The maker will be liable if it's a defect, just as they are now for defects... and those defects will include the car not doing what it is designed to do.

However, they won't be liable for things done that are outside its realm. An example would be if it hits a car because it couldn't stop fast enough while driving on a snow-covered road and you didn't put the tire chains on.

Or an accident caused by bald tires that you didn't maintain and replace.

In theory, accidents should never be your fault if you keep the vehicle in proper running condition. I get that you want that in writing in some way though. It sounds like California's laws will say if you own the vehicle, you are responsible for it, period. Which I think is to say that they are going to hold you responsible for making sure it's kept in proper working order.


----------



## Tom Robertson

Stewart Vernon said:


> Bottom line for me and insurance... to beat that horse fully into submission... I have liability insurance today that covers other authorized drivers of my car... but if someone steals my car, and I report it stolen, I'm not liable for the accident that the thief might get into. Other drivers of my car are only covered by my liability IF they are of legal driving age, have a legal license to drive, and have permission from me to drive the car.
> 
> IF an automated car does not allow me to drive... then I'm never the one who decides who drives... and in that case, I'm not going to pay for liability insurance since I would in no way consider myself liable for decisions completely taken out of my hands. I would simply not own such a car. IF they tried to outlaw human drivers of any car, to force the issue, then I would just stop owning cars at that point.
> 
> Any company or directive that claims computer-driven cars are safer than human-driven cars because humans make mistakes when given control that computers do not will by default be arguing that humans cannot be trusted to drive over the computer, which also by default seems to indicate the human can't be held liable. How could you legally hold the human responsible for an accident that you argued the computer should be in control of to prevent human mistakes? Legally, it pretty much writes itself that the manufacturer would have to be the one liable for any accidents of a computer-driven car.


So we have several parts to the liability equation: 
1) Who drives
2) Who maintains (negligence liability)
3) How the other vehicle is insured.

*Who Drives*
1a) If you never drive and the manufacturer pays the liability, is there a problem?
1b) If you never drive and the insurance is much less than you pay now, is there a problem?
1c) If you occasionally drive, and the manufacturer still pays all the liability (unlikely, but roll with it), is there a problem?
1d) If you occasionally drive, and you pay liability based on how much you drive, is there a problem?
1e) if you drive all the time, and you pay insurance in the same range as now, is there a problem?

*Who maintains*
2a) If you properly maintain the vehicle, and the manufacturer covers the negligence portion of the liability, is there a problem?
2b) If you properly maintain the vehicle, and you pay a (presumed) small line item for liability for negligent maintenance, is there a problem? 
2c) If the manufacturer requires maintenance or the car won't drive, is there a problem?

*Other vehicle insurance*
3) I'm presuming that under-insured motorist will still be required and still be part of our bills. (and relatively small.) Is this a problem?

This whole insurance question sounds like Fear, Uncertainty, and Doubt. Not so much about the insurance, but about the technology itself, masquerading as insurance concerns. Let's say insurance costs you the same amount it does today (nobody is predicting it will, by the way). Why would that be a problem? You pay it now, you'd pay it then, get a better ride, a safer ride, and not be stuck doing the hard part. You'd give up on that just because you aren't in control of the insurance?

Or let's say insurance is not picked up by the manufacturer but the liability portion is only 20% of what you pay today. That's not enough of a savings?
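That scenario is just line-item arithmetic. A minimal sketch, assuming a made-up $1,200/year policy split into liability, comprehensive, and under-insured portions (none of these figures come from a real insurer):

```python
# Hypothetical back-of-the-envelope premium comparison.
# All dollar figures and percentages below are invented for
# illustration; they are not quotes from any insurance company.

def annual_premium(liability, comprehensive, underinsured, liability_factor=1.0):
    """Total yearly premium, scaling only the liability portion."""
    return liability * liability_factor + comprehensive + underinsured

# Assume a $1,200/year policy today: $700 liability,
# $400 comprehensive, $100 under-insured motorist.
today = annual_premium(700, 400, 100)

# Scenario: self-driving cuts the liability portion to 20% of today's.
self_drive = annual_premium(700, 400, 100, liability_factor=0.2)

print(f"today: ${today:.0f}, self-drive: ${self_drive:.0f}")
# today: $1200, self-drive: $640
```

Swap in your own premium breakdown to see how sensitive the total is to the liability line alone.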

I don't get insurance as a solid issue. Yes, I would demand the liability portion to be lower, since the car is safer. But I don't see cutting the nose off to throw out the bath water. 

Peace,
Tom


----------



## Tom Robertson

James Long said:


> One could argue that by owning an "auto driving only" car you are granting the car permission to drive. Similar to software licenses that you agree to by using the software, or credit card terms that change, or service contracts such as for satellite TV and cell phones. If you buy a car without a "manual" switch you assume the risk.
> 
> You are not alone. I expect that "fully automatic" self driving cars will be mainly used for cabs. Perhaps town cars or livery, although the owners still have to pay the driver to open the door for the high dollar passengers - might as well let them drive as needed.
> 
> Have different people make each argument. Have the sales staff argue that the cars are super safe and the legal staff argue that the owner remains responsible.
> 
> If the car company assumes liability for the driving they will likely limit their liability to only when the car is driving. I still expect anyone who is injured to sue everyone involved: car company, passengers, owners. It is the American way.


There are reports that Google is thinking along the lines of fleets of self-drive taxis. But I'm not certain those reports are fully accurate, as they just hired a car executive as CEO.

I suspect there will be several cases: people who live in big cities will use the taxi model, and people who live in less populated areas will still want their own cars (at least until the taxi model gets close enough to them and is shown to be less expensive than owning). I expect, like any new technology, the people who still own cars will ride in a taxi or a friend's car, realize it's so cool, and want to switch to self-drive. "You mean I can legally shave, read, text, eat, put on makeup whilst driving? Sign me up! (And save money!)"

One of the points made by a couple sources is what will the insurance industry do with the drop in liability income? Can they adjust to making money on smaller liability, comprehensive, and under-insured packages? Then again, industries have had to change with technology before.

Peace,
Tom


----------



## inkahauts

I think the point is: why do I have to pay insurance if I am not driving? I don't pay insurance to ride the train, or the bus, or a taxi...

And if the computer makes a mistake that I have no control over that causes the accident, why should I be held liable for it? I think those are valid questions.


----------



## Tom Robertson

inkahauts said:


> I have a feeling this will work out one way and one way only... The maker will be liable if it's a defect, just as they are now for defects... and those defects will include the car not doing what it is designed to do.
> 
> However, they won't be liable for things done that are outside its realm. An example would be if it hits a car because it couldn't stop fast enough while driving on a snow-covered road and you didn't put the tire chains on.
> 
> Or an accident caused by bald tires that you didn't maintain and replace.
> 
> In theory, accidents should never be your fault if you keep the vehicle in proper running condition. I get that you want that in writing in some way though. It sounds like California's laws will say if you own the vehicle, you are responsible for it, period. Which I think is to say that they are going to hold you responsible for making sure it's kept in proper working order.


California is still looking into all the regs about self-driving cars, including liability.

You've nailed several of the key components to what the insurance bill of the future might have. The current California law absolves the original car manufacturer of liability if a 3rd party installs self-driving technology. (No word on if the installer/self-drive manufacturer is liable yet.)

Though I expect the final self-drive package will be able to drive safer in sudden icing conditions than a human could. Partly by better anticipation of road conditions and partly by better sensing the traction instant to instant.

Peace,
Tom


----------



## Tom Robertson

inkahauts said:


> I think the point is: why do I have to pay insurance if I am not driving? I don't pay insurance to ride the train, or the bus, or a taxi...
> 
> And if the computer makes a mistake that I have no control over that causes the accident, why should I be held liable for it? I think those are valid questions.


If you think about insurance as the price of owning any car, and the price is lower with self-drive than without, would anyone really quibble enough to not use the technology? Or is the fear it won't go down? Or that if the car does have a rare accident, that like current insurance, the rates will go up?

I get that we want the insurance to go down. It should and by a bunch. But what is the real fear? Sounds more like fear of losing control, not fear of the insurance.

Peace,
Tom


----------



## inkahauts

Well, if it dropped insurance I'd be fine with it. I'm not fine with it staying the same, but that's mainly because I'm not fine with my insurance rates anyway. They are way too high, and I'm paying for others who have accidents or don't have any insurance at all; there's no question about that.


----------



## James Long

Tom Robertson said:


> 1b) If you never drive and the insurance is much less than you pay now, is there a problem?





Tom Robertson said:


> 1c) If you occasionally drive, and the manufacturer still pays all the liability (unlikely, but roll with it), is there a problem?





Tom Robertson said:


> This whole insurance question sounds like Fear, Uncertainty, and Doubt.





Tom Robertson said:


> Or let's say insurance is not picked up by the manufacturer but the liability portion is only 20% of what you pay today.


You seem to be working from the assumption that liability insurance will be much less ... up to 80% less than the current liability portion of insurance payments. I will believe you ONLY when I see it.

Tiny discounts for safety features (air bags, anti-lock brakes, etc) - I see that. I get those discounts on my car and they are small. Massive discounts for being an automatic drive car? I would not count on it.

I would count on more expensive vehicles being more expensive to insure. That I have seen in real life and not just theorized on the Internet. Perhaps the alleged "safer car discounts" will balance out the "more expensive car" fees. But 80% less? I don't think so. That is way too optimistic.

If the car manufacturers offer liability insurance as part of their sales/maintenance price, the owner is still paying for the insurance. Nothing is free.


----------



## James Long

Tom Robertson said:


> "You mean I can legally shave, read, text, eat, put on makeup whilst driving? Sign me up! (And save money!)"


Not yet. States need to recognize the car as the driver and allow the driver to be considered a passenger before distracted driving changes legal status.



Tom Robertson said:


> One of the points made by a couple sources is what will the insurance industry do with the drop in liability income? Can they adjust to making money on smaller liability, comprehensive, and under-insured packages? Then again, industries have had to change with technology before.


Easy enough ... don't give up the revenue. Keep charging people regardless of how "safe" the new cars allegedly are. If the autonomous car industry proves that their cars are safer, with a history of a few million cars on the road and a billion miles of travel, perhaps some company will lower its rates and others may follow. The insurance industry works on actuality, not promises.


----------



## Tom Robertson

James Long said:


> Not yet. States need to recognize the car as the driver and allow the driver to be considered a passenger before distracted driving changes legal status.


Since your original premise was about a time in the future when states recognize self-driving cars, this point seems moot; my reply was that when that does occur, people will be able to actually ride in the cars the states have approved...



James Long said:


> Easy enough ... don't give up the revenue. Keep charging people regardless of how "safe" the new cars allegedly are. If the autonomous car industry proves that their cars are safer with a history of a few million cars on the road and a billion miles of travel perhaps some company will lower their rates and others may follow. The insurance industry works on actuality not promises.


Silly insurance companies... What are they worried about if they can just keep the rates high in light of reduced costs? 

Peace,
Tom


----------



## Tom Robertson

James Long said:


> You seem to be working from the assumption that liability insurance will be much less ... up to 80% less than the current liability portion of insurance payments. I will believe you ONLY when I see it.
> 
> Tiny discounts for safety features (air bags, anti-lock brakes, etc) - I see that. I get those discounts on my car and they are small. Massive discounts for being an automatic drive car? I would not count on it.
> 
> I would count on more expensive vehicles being more expensive to insure. That I have seen in real life and not just theorized on the Internet. Perhaps the alleged "safer car discounts" will balance out the "more expensive car" fees. But 80% less. I don't think so. That is way too optimistic.
> 
> If the car manufactures offer liability insurance as part of their sales/maintenance price the owner is still paying for the insurance. Nothing is free.


Yes, I, industry analysts, and insurance groups are all working from the assumption that safety will go up, thus insurance payouts will go down, thereby causing insurance rates to go down. Massive reductions? I picked some targets, some from the air, some from various reports. What will happen probably will be foreshadowed...

As for nothing is free? Who said it was? But leverage can create opportunities for lower total operating costs to the consumer. Which manufacturers will happily use as a selling point. 

So it seems thems who are resisting the obvious _might _be spreading Fear, Uncertainty, and Doubt without merit. Not that all the issues are fixed today. That is not the claim. Yet the writing is on the green screen--color motion pictures will be the future of movies... 

Peace,
Tom


----------



## inkahauts

Like most things, I expect any reductions won't be seen on the consumer end, because they will simply be absorbed by that year's price hikes and such, so to speak...


----------



## James Long

Tom Robertson said:


> Since your original premise was about a time in the future when states recognized self-driving cars, ...


Not my premise. And the assumption that everything will work out glowingly for the car industry gets us back to the "pie in the sky" territory.



Tom Robertson said:


> So it seems thems who are resisting the obvious _might _be spreading Fear, Uncertainty, and Doubt without merit. Not that all the issues are fixed today. That is not the claim. Yet the writing is on the green screen--color motion pictures will be the future of movies...


I wish there were a derogatory term I could use for the overly optimistic view of the pushers of self-driving vehicles. The opposite of your "FUD" label.

There is no fear ... I am not invested in the "autonomous" industry nor do I recommend nor not recommend any investment in the industry. The uncertainty is clear ... even reading your posts you don't KNOW - you just paint a rosy picture. I'll own the doubt. I have faith in what I have seen - and I do not see insurance companies offering discounts for autonomous vehicles.

If you see this in real life and not just a crystal ball view of life please let us know.


----------



## Stewart Vernon

Tom Robertson said:


> This whole insurance question sounds like Fear, Uncertainty, and Doubt. Not so much about the insurance, but about the technology itself, masquerading as insurance concerns. Let's say insurance costs you the same amount it does today (nobody is predicting it will, by the way). Why would that be a problem? You pay it now, you'd pay it then, get a better ride, a safer ride, and not be stuck doing the hard part. You'd give up on that just because you aren't in control of the insurance?


For me, it's both. I am afraid that the computer really will not be as safe as I am today as a driver. As I've already said, I'll put my driving record up against the best computers they have and I'm miles ahead of them based on what I've read. But, let's say for the sake of argument that the technology one day exists that would drive safer than I would. Ok. IF the whole argument is that the computer would drive safer than I would... then why would I be held liable at all?

I pay for liability now because *I* drive the car OR I at least decide who to give permission to drive if I'm not driving... You can't fairly compare a computer driver to me and say, since I'm paying today, why not continue to pay... I pay today because I drive the car. IF I never wanted to drive a car, then I wouldn't have to carry liability insurance to ride in a cab, bus, or limousine service, or to have a friend drive me somewhere.

I would expect that a fully "autonomous" computer-driven car would not require me to be liable for its decisions. When the whole argument for the computer is that it makes better decisions than humans, the "money where your mouth is" expectation should be that the manufacturer would stand by their product and not require me to carry liability insurance for accidents their computer might make... especially if they are saying they don't expect it to ever make one!



inkahauts said:


> I think the point is: why do I have to pay insurance if I am not driving? I don't pay insurance to ride the train, or the bus, or a taxi...
> 
> And if the computer makes a mistake that I have no control over that causes the accident, why should I be held liable for it? I think those are valid questions.


Exactly!


----------



## Tom Robertson

Today's lesson in liability: http://kutv.com/news/get-gephardt/that-pizza-you-ordered-just-came-with-a-huge-insurance-liability. (Driving for work, in any fashion, might be excluded from your insurance.)

Liability is more than just "I caused an accident." It is for "my car participated in an accident," with minimal concern as to who was driving. It covers times when the car breaks and gets into an accident. It covers when you choose to drive when conditions are not suitable to be out. It covers a grease spot on the pavement that caused a slide. And it covers when you make mistakes.

There are costs for liability that aren't driver based, and those will be covered by us in some way. James is absolutely right, there is no free lunch. He's also right when he says some safety devices have reduced liability payouts, and sometimes those savings are passed back to us. (Or rates don't rise as fast.) Cars are much safer now--because of the insurance industry's work on safety. Death payouts are big; crunched car payouts are comparatively small.

So I see liability as many parts, one of which will continue to go down--but not be totally eliminated. I suffer no "rose-colored glasses" in that. 

Will manufacturers pay for the "driver" portion of liability? Perhaps, certainly we're seeing the sales side working on that part. Will the insurance companies work out new packages? I'm sure they will--they are already thinking about it as they rework the California regs. I'll try to find that link again.

The non-driver parts of liability--your choice to be out, the grease spot, the car mechanical failures, et al, will be covered by us. Perhaps in a new rider replacing the old "liability", perhaps as a safe-self-driving car discount as James suggested, or something else. States, insurance companies, and finance companies, together, will have to determine what works from both legal and marketing standpoints.

One other aspect of "there ain't no such thing as a free lunch," is you still pay liability in a taxi, a bus, a train, etc. It is built into the cost structure of the fare. Just as James reminded us it would be built into the cost structure of the car if the manufacturers pay for it. That said, leverage and economies of scale come into play. A taxi company probably pays less liability per mile driven than we do. A manufacturer would probably also pay less per mile than we do. So that all could help, even though it won't eliminate the cost. Nothing will do that.
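The economies-of-scale point can be illustrated the same way. A hedged sketch with entirely invented numbers, assuming a fleet operator negotiates a bulk liability rate and spreads it over far more miles than a private owner drives:

```python
# Hypothetical per-mile liability comparison. Every number below is
# made up for illustration and not based on real insurance data.

def liability_per_mile(annual_liability_cost, annual_miles):
    """Liability cost spread over the miles actually driven in a year."""
    return annual_liability_cost / annual_miles

individual = liability_per_mile(700, 12_000)     # one car, one driver
fleet_taxi = liability_per_mile(5_000, 100_000)  # one heavily used fleet car

print(f"individual: ${individual:.3f}/mile, fleet: ${fleet_taxi:.3f}/mile")
# individual: $0.058/mile, fleet: $0.050/mile
```

Even with a much larger absolute premium, the heavily used fleet car comes out cheaper per mile, which is the leverage argument in miniature.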

Peace,
Tom


----------



## Tom Robertson

Stewart Vernon said:


> For me, it's both. I am afraid that the computer really will not be as safe as I am today as a driver. As I've already said, I'll put my driving record up against the best computers they have and I'm miles ahead of them based on what I've read. But, let's say for the sake of argument that the technology one day exists that would drive safer than I would. Ok. IF the whole argument is that the computer would drive safer than I would... then why would I be held liable at all?


I don't understand how you are ahead of Google's 1.2M miles with no at-fault accidents. Yeah, I could definitely see being in the same general ballpark, but perfect is still pretty good for both of you.

Peace,
Tom


----------



## Tom Robertson

There are people who like to drive, there are people who don't. Some who drive to relax, some who think of it as a waste of time or don't relax from driving. (Where and when you are forced to drive probably plays into it, as well as expectations.) 

There are people who want control, others who get enough control by giving it up at times. 

Nothing derogatory implied by any of the above people. We're all different.

From time to time our society tends to consider in derogatory fashion people who overly fear or overly embrace a future, be it technology, money, government, what have you. Because in our past we've learned the overly fearful and the overly enthusiastic are both wrong--to a degree. They are both right--to a degree.

Nuclear power didn't solve our energy problems. It has promise even still, yet some problems haven't been worked out. Other technologies have worked well, even though there were some who said they would never work.

"Peoples is peoples." It's all good.

Am I excited by self-driving cars? You betcha.

I no longer find driving anything but a waste of time. Long (for me) commutes on busy highways dispelled that; half-day driving trips across the midwest started it. I'd much rather be doing something that engages my mind than driving, which doesn't. Or escaping into a book, movie, etc. But driving--a waste for me.

So I've long wanted to have a chauffeur. Mrs. Tibber used to drive me to the BART station when we had one car and my office was served by a BART stop in the building. (Very nice commute!)

But, it really wasn't until I saw the Google TED talk I've linked where I really, really got excited by self-driving cars. Previously I had thought, "yeah, sure someday." I know enough about computer systems to know many of their foibles.

What I see in the video is a computer system that is very well designed to learn and abstract similar things and fit those into categories of stuff. I see a computer system that has undergone years of testing already. And a few more years are planned--a shift from their original thinking, when they realized drivers won't pay attention to the road if the car does any of the driving beyond cruise control.

In that video I see a system of sensors that already "see" better than humans. And with seeing comes anticipation--which they also already do--without distraction.

In that video I see a remarkably cautious approach to solving the problems. And that many have already been solved. No, not all problems. No one is saying that--or we'd see them on the market today. 

Yet, I'm blown away by how close they are to cars that could be used in clear weather streets today. I had no idea they were this close.

Issues that remain (as far as I know):
1) State regulations (actively being worked by California, lots of thought these will become a framework for most states--that's how regs typically work.)
2) Insurance packaging (also actively being worked on by insurance organizations, manufacturers, and state regs.)
3) Obscured lane markers (snow, gravel, etc. I know these are being worked on, though I have no clue how far along they are.)
4) Cost. Sensors that "see" better than humans ain't cheap. Yet we already have examples in the market today. 
5) Verification/Testing. I break this out separately from regs as it is a big question. There have been inputs to California's DMV as to how to verify and test, but I'm not sure a solution has been truly embraced yet.

One problem that comes up is navigation, but that really has nothing to do with "driving." No matter the mode of travel--bike, bus, taxi, human-driven car, or self-driving car--the human needs to know and/or adapt to where he/she is going. The self-driving cars are expected to know how to get most places, but it has always been the responsibility of the passenger to know how to tell the driver how to get there. That doesn't change when the passenger switches from _being_ the driver to the car as driver. (Ok, a human chauffeur might be held to a higher standard than all--he's the paid professional.)

So, yes I am tremendously excited by self-driving technology. I know I get distracted as a driver--and I know a well built computer would be much safer than I. I've watched many drivers out there that would be safer if not driving... 

Perhaps some timeline expectation clarification will help:
When will self-driving cars be "required on most city streets and highways"? Maybe never, maybe not until 2075, maybe I don't know and don't really care. I don't fear it, and I know it will take MANY years before the fleet of human-driven cars is reasonably replaced.
When will human-driven cars cease to exist? Never. Just as there are people who own animals for recreation, there will be some form of human-driven vehicle for other forms of recreation.
When will fully self-driving cars be available? GM says next year, others are saying 2017, and Google is targeting 2019 or sooner. California was to have the regs by last January. Sounds like they have some drafts, yet need more info from manufacturers.

Do I think I'll own a self-driving car in 2016? No. The Chevy Volt is not on my radar as a car to own. Nor are Tesla's sedans. Not sure what the other manufacturers are planning in model styles. I see 2017 as unlikely, though there's still a chance, with 2018 or 2019 more likely. I'd be happy to be surprised with earlier rather than later.

Peace,
Tom


----------



## phrelin

Tom Robertson said:


> I don't understand how you are ahead of Google's 1.2M miles with no at-fault accidents. Yeah, I could definitely see being in the same general ballpark, but perfect is still pretty good for both of you.
> 
> Peace,
> Tom


"No fault" is a curious thing. I've been driving for 55 years and experienced two accidents.

One, at the beginning of my driving career, was not my fault but might have been written up differently by the Highway Patrolman had it not been witnessed by an insurance accident investigator who just happened to be at the intersection. There was literally nothing I nor a Google car could have done to avoid being hit from behind in that accident.

The other, about 45 years ago, was my fault - I knew black ice was a possibility but didn't take due care, unknowingly drove onto a 40' stretch of black ice, and couldn't avoid hitting the rear of a car stopped at a stop sign even though I was only going about 15 mph. I'm not sure what the Google car would have done, as I did everything I was taught about being on ice.

I will keep saying that Google is not an auto manufacturer selling millions of cars in a mass market. In that context IMHO they are doing pure or basic research "aimed to improve scientific theories for improved understanding or prediction of natural or other phenomena."

Tesla, on the other hand (and I'm being generous) is engaged in applied research, "a form of systematic inquiry involving the practical application of science" which "accesses and uses some part of the research communities' (the academia's) accumulated theories, knowledge, methods, and techniques, for a specific... business ... purpose."

IMHO Google may very well make significant contributions to efforts to develop Artificial Intelligence. Tesla may very well make significant contributions to death and destruction on our highways.


----------



## Tom Robertson

phrelin said:


> "No fault" is a curious thing. I've been driving for 55 years and experienced two accidents.
> 
> One, at the beginning of my driving career, was not my fault but might have been written up differently by the Highway Patrolman had it not been witnessed by an insurance accident investigator who just happened to be at the intersection. There was literally nothing I nor a Google car could have done to avoid being hit from behind in that accident.
> 
> The other, about 45 years ago, was my fault - I knew black ice was a possibility but didn't take due care, unknowingly drove onto a 40' stretch of black ice, and couldn't avoid hitting the rear of a car stopped at a stop sign even though I was only going about 15 mph. I'm not sure what the Google car would have done, as I did everything I was taught about being on ice.


As I recall, in autonomous mode, the Google cars have been rear-ended 15 times, so those accidents are considered not the fault of the Google car. The other, if I am remembering correctly, was being hit by a car from the side, similarly in a situation where no driver could easily have avoided it. (I might have to double-check that last one--I could be confusing it with one of the times the car was being human-driven.)

You raise an excellent point about black ice--from a different angle than I've been considering so far. I'm not sure how much testing Google has done in black-ice conditions. Perhaps a lot--they started with highway tests--yet perhaps they've not done much on ice. Right now, in Mountain View and Austin, they aren't likely to see much ice.



phrelin said:


> I will keep saying that Google is not an auto manufacturer selling millions of cars in a mass market. In that context IMHO they are doing pure or basic research "aimed to improve scientific theories for improved understanding or prediction of natural or other phenomena."
> 
> Tesla, on the other hand (and I'm being generous) is engaged in applied research, "a form of systematic inquiry involving the practical application of science" which "accesses and uses some part of the research communities' (the academia's) accumulated theories, knowledge, methods, and techniques, for a specific... business ... purpose."
> 
> IMHO Google may very well make significant contributions to efforts to develop Artificial Intelligence. Tesla may very well make significant contributions to death and destruction on our highways.


While I generally agree with your thoughts on pure and applied research, I suggest that Google is signalling a transition from only being pure research to being both pure and applied. The first signal might have been in designing their first car. They learned a lot about building cars at that juncture. Yet there was still a lot of pure research there too.

The second, perhaps clearer, signal is the hiring of a CEO from the car industry. They are moving this from "work on it in a lab" to a corporate structure rather than a team lead, and to real car-industry experience rather than a visionary technologist at the top. So I see what looks like a transition point.

Google could become like ARM, a designer of smart-car technology rather than a manufacturer of smart cars; they could be both (like the Nexus and Android phones); or they could transition to a car maker. (I suspect the Nexus concept is most likely, but we'll see.)

Tesla... Obviously they did some pure research--and now applied it--and I don't know if they were surprised by the results. I have little interest in a car that still requires me to be aware, awake, and driving, even if I'm not driving. So I'm not sure why anyone is surprised. 

Peace,
Tom


----------



## James Long

Tom Robertson said:


> I don't understand how you are ahead of Google's 1.2M miles with no at-fault accidents. Yeah, I could definitely see being in the same general ballpark, but perfect is still pretty good for both of you.


Stewart (and I) are part of a group of drivers who have logged billions of miles. Google's group of drivers (whether one counts cars or aggregates all the miles as "one driver") has logged far fewer miles than humanity.

I am unapologetically pro-human. You can have the anti-human label if you wish.


----------



## Tom Robertson

James Long said:


> Stewart (and I) are part of a group of drivers who have logged billions of miles. Google's group of drivers (whether one counts cars or aggregates all the miles as "one driver") has logged far fewer miles than humanity.
> 
> I am unapologetically pro-human. You can have the anti-human label if you wish.


Thus you earn the "Cut the Crap" award for the day: You haven't driven billions of miles, nor does your learning improve with other people driving billions of miles. "Cut the Crap."

You may have driven a million or two miles. You may have had zero at-fault accidents. But that doesn't put you way ahead of Google. Both are awesome records, by the way.

I'm neither for nor against humans. I'm all for humans where humans excel, computers where humans don't excel, and options that allow us to reasonably choose.

And there comes a time when "pro-human," taken too far, seems indistinguishable from "anti-tech" and therefore seems reminiscent of other times people tried to deny the future was coming...

Peace,
Tom


----------



## James Long

Tom Robertson said:


> There are people who like to drive, there are people who don't.
> ...
> I no longer find driving anything but a waste of time.


I drive 15,000 to 20,000 miles per year. I used to drive more. About 5,000 of that is getting to work. It can be tedious ... but I believe road improvements would help more than tuning out and letting the car drive. I live a mile from public transit and (per Google) 24 driving minutes from work. The sad part is that half of that time is spent in a traffic jam half a mile from work.

I suppose that an autonomous vehicle would work for that drive ... and the concept of having "valet" service that would drop me at the door, then go find a parking slot (up to a quarter mile away in my case) and return when I want to leave, would be "cool". Make it happen. Then make it affordable. I normally buy cars that are over five years old, then run them into the ground. (My last three trade-ins I was the last owner. I put 200k on my only new car. Cars come to me to die.) When will there be a used autonomous car in my price range?


----------



## James Long

Tom Robertson said:


> And there comes a time when "pro-human," taken too far, seems indistinguishable from "anti-tech" and therefore seems reminiscent of other times people tried to deny the future was coming...


I said you could have the anti-human label ... not that you could call others anti-tech. :lol:

But that is the problem here ... just like pro-life vs pro-choice ... everyone wants their position to be positive. I am not against the autonomous car --- I just do not see it in as positive of light as it is being pushed.

The insurance companies are more likely to see it my way ... aggregate number of miles driven in that mode. Not unfairly comparing aggregate miles for autonomous vs individual miles for a human. But a track record based on more than a number of miles expressed in millions. The insurance companies will know more when autonomous reach the billion mile mark.


----------



## Tom Robertson

James Long said:


> I drive 15,000 to 20,000 miles per year. I used to drive more. About 5,000 of that is getting to work. It can be tedious ... but I believe road improvements would help more than tuning out and letting the car drive. I live a mile from public transit and (per Google) 24 driving minutes from work. The sad part is that half of that time is spent in a traffic jam half a mile from work.
> 
> I suppose that an autonomous vehicle would work for that drive ... and the concept of having "valet" service that would drop me at the door, then go find a parking slot (up to a quarter mile away in my case) and return when I want to leave, would be "cool". Make it happen. Then make it affordable. I normally buy cars that are over five years old, then run them into the ground. (My last three trade-ins I was the last owner. I put 200k on my only new car. Cars come to me to die.) When will there be a used autonomous car in my price range?


I'll take the challenge and guess when someone who buys cars five years old and older might first find a self-driving car in his/her price range:

Assuming the first self-driving cars won't be models or styles you'll be interested in. (I'm not terribly interested in sedans, so I don't think I'll like the first self-driving models either.)
Assuming the first self-driving cars will be available in 2017, but still expensive.
So add a few years for model availability: 2020
Add a few years for cost reduction: 2023 
Add 5 years, minimum: 2028

Add "forever" as a holdout to a "simpler time": Not in this lifetime. 
Subtract "reality sets in as one gets too old to actually drive anymore": Not sure. How old are you?  By the way, may you live healthy, happy, and strongly for as long as you want.

There we have it. 
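The guesswork above, as a back-of-envelope script (every number is a guess from this thread, not data):

```python
# All figures are the assumptions stated above, not measurements.
first_available = 2017     # assumed first retail availability
model_variety_lag = 3      # "a few years" for model availability -> 2020
cost_reduction_lag = 3     # "a few years" for cost reduction -> 2023
used_market_lag = 5        # minimum age of the used cars James buys

estimate = first_available + model_variety_lag + cost_reduction_lag + used_market_lag
print(estimate)  # 2028
```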

Peace,
Tom


----------



## Tom Robertson

James Long said:


> I said you could have the anti-human label ... not that you could call others anti-tech. :lol:
> 
> But that is the problem here ... just like pro-life vs pro-choice ... everyone wants their position to be positive. I am not against the autonomous car --- I just do not see it in as positive of light as it is being pushed.
> 
> The insurance companies are more likely to see it my way ... aggregate number of miles driven in that mode. Not unfairly comparing aggregate miles for autonomous vs individual miles for a human. But a track record based on more than a number of miles expressed in millions. The insurance companies will know more when autonomous reach the billion mile mark.


I'll take the argument another way: when is one a "fan-boy"? Typically, as the accusations have been tossed about, it's by the subject's repeated insistence that there are "no problems." (I've not only acknowledged some problems, I've continued to list them as on the radar.)

So how does one potentially earn an anti-tech label? Left as an exercise to the reader.  (Repetition does seem to be a part.)

Insurance companies have the data today to aggregate based on millions of miles per model. (And car color, driver age, driver county, probably driver eye color, bank balance, number of kids, etc.) So if Google comes in and shows them data on millions of miles across two models, the insurance companies will have enough to set initial pricing. And, understandably, they will aim a bit high to be safe (and profitable.)

From there prices will adjust as more data arrives--as it has with the introduction of each safety measure. Especially the passive ones.
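One rough way to picture that adjustment is the standard actuarial idea of credibility weighting. This is my own sketch--real insurer models are far more complex, and every number below is made up:

```python
# Blend a cautious baseline rate with observed experience, trusting the
# observed data more as miles accumulate. (Illustrative only.)
def blended_rate(baseline, observed, miles, full_credibility_miles=1_000_000_000):
    """Weight observed experience by sqrt(miles / full-credibility miles), capped at 1."""
    z = min(1.0, (miles / full_credibility_miles) ** 0.5)
    return z * observed + (1 - z) * baseline

# With ~1.2M autonomous miles against a billion-mile standard, the price
# barely moves off the cautious baseline, however good the record looks.
print(round(blended_rate(baseline=1.00, observed=0.50, miles=1_200_000), 3))  # 0.983
```

Which also happens to illustrate James's point: insurers will know a lot more once autonomous miles reach the billions.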

Peace,
Tom


----------



## James Long

The current "per model" estimates are based on models that have grown out of the history of vehicles.

Google may be able to prove that their cars have enough of a safety record to avoid a surcharge for being a new type of model with a limited history.


----------



## Tom Robertson

James Long said:


> The current "per model" estimates are based on models that have grown out of the history of vehicles.
> 
> Google may be able to prove that their cars have enough of a safety record to avoid a surcharge for being a new type of model with a limited history.


All around, the first self-driving cars will not be at the lower end of the market. There is no doubt they are going to be in the upper price ranges--which fits the general pattern of all new technology.

So the base car cost is likely to be a larger determining factor in the insurance pricing than the lack of data on the model itself. By the time the insurance guys review the data, they will also have knowledge of the heritage of the "car technology" as well as the "driver technology"--by that I mean whether Google designs a new car from the ground up vs. hiring an existing manufacturer/model.

For the driver technology, there will be a "surcharge" in that they won't lower the price as much as their data might suggest. 

Thinking one more step beyond the direct question: who will pay for the "driver" liability? If the car makers do, at least initially, their economy of scale could easily change the insurance rates much more drastically than even the car value itself. (But car value will still set the comprehensive and underinsured-motorist rates, and thus be toward the higher end.)

And early adopters--won't care. 

Peace,
Tom


----------



## yosoyellobo

If the insurance companies don't want to play and the car companies are sure enough of their technology, they might just self-insure. I would bring in the caveman as their spokesperson.


----------



## Tom Robertson

yosoyellobo said:


> If the insurance companies don't want to play and the car companies are sure enough of their technology, they might just self-insure. I would bring in the caveman as their spokesperson.


Yeah, I wondered about this myself. I'm not sure how car companies insure for the liability they already carry.

Another thing I wonder about: after the "driver liability" is covered, how much "other liability" would the owner still be responsible for? Not so much how much coverage one needs, but the cost of the driver liability vs. the cost of the rest of the liability factors. I'm sure the insurance companies have complex models that know such--I'm curious.

Peace,
Tom


----------



## Tom Robertson

Yup--"assisting" tech has some problems--like people: http://gizmodo.com/elon-musk-tesla-may-limit-autopilot-because-people-are-1740527417



> On an earnings [call] yesterday, Tesla CEO Elon Musk finally responded to "some fairly crazy" Model S Autopilot videos that show reckless idiots pushing the feature beyond where it's supposed to go. He isn't pleased.
> According to Musk, "This is not good."


Though he does say the tech has prevented some accidents and has not caused any. (Yeah, I'll add it--yet.)

The feature does have a price tag now--$3,800 when it moves beyond the beta testing stage.

Peace,
Tom


----------



## James Long

The liability issue should not be hard to answer in today's terms.

My car insurance policy lists vehicles and drivers ... with a primary driver for each vehicle. The insurance is based on the age and type of vehicle, the age and type of driver and how that car is driven (miles to work, miles for work, miles per year, accident and ticket history). All calculated from actuarial tables that estimate how much risk they are taking with that car and driver.

I assume that you also have insurance with named drivers and cars. What happens when you drive my car? Are you covered by my insurance or yours? When I drive my employer's vehicles I am covered by their policy. I am a named driver on the company insurance. You are not a named person on my personal car insurance ... am I covered when you drive? Yes. I can occasionally let someone else drive my car. If you drove regularly I would have to name you on my policy and pay for a person of your description. Your insurance may also cover you when driving someone else's car. (I do not have the 8pt font of my policy handy but I believe I am also covered by my insurance when I drive rental cars. I don't drive rentals enough to care.)

That is how I approach the insurance/liability issue. If I occasionally "let the car drive" that is fine and my insurance covers me. If I regularly "let the car drive" I will need to name the car as a driver and be at the mercy of the insurance company's actuarial tables. Tables that have a hell of a lot more data for "married male" in my age bracket than they do for "autonomous car".

If the car manufacturers make a deal with the insurance companies and offer to pay for liability when the car is driving, they can do so ... in my case I would not list the car as a driver and my insurance would not cover autonomous driving. I would, in effect, have two insurance policies - one for when the car is driving and one for when I (or another named driver) am driving. Details that will be worked out - along with licensing cars as drivers and legal recognition that a car can drive itself.
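The "name the car as a driver" idea can be sketched as a toy data model (purely illustrative--no insurer's actual schema, and the multipliers are invented):

```python
# A policy lists drivers; each driver entry carries a risk multiplier from
# actuarial tables. A multiplier stays high while data on that "driver"
# (here, an autonomous mode with limited history) is thin.
from dataclasses import dataclass

@dataclass
class Driver:
    name: str
    risk_multiplier: float  # hypothetical actuarial factor

@dataclass
class Policy:
    base_premium: float
    drivers: list

    def premium(self):
        # Simplification: price to the riskiest listed driver.
        return self.base_premium * max(d.risk_multiplier for d in self.drivers)

p = Policy(base_premium=600.0,
           drivers=[Driver("married male, listed human driver", 1.0),
                    Driver("autonomous mode (limited history)", 1.5)])
print(p.premium())  # 900.0
```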


----------



## Tom Robertson

James Long said:


> The liability issue should not be hard to answer in today's terms.


Kinda what I've been thinking. 



James Long said:


> My car insurance policy lists vehicles and drivers ... with a primary driver for each vehicle. The insurance is based on the age and type of vehicle, the age and type of driver and how that car is driven (miles to work, miles for work, miles per year, accident and ticket history). All calculated from actuarial tables that estimate how much risk they are taking with that car and driver.


As you can guess, there are specific legal requirements for insuring guest drivers. Though most insurance is not required to cover my driving a rental car unless the car I'm listed as a driver of is idle at the time I'm driving the rental. If both are in use, only one car is insured. (And my insurance company will gladly sell me another package to cover when both the primary car and a rental car are in use.)

While listing the autonomous unit as a driver is an interesting concept, just knowing the make/model will suffice. If the manufacturer covers the autonomous mode it will be calculated in the pricing for that model, I should think. 

I have seen some discussion of how much more data intelligent cars can record and what privacy can be ensured or waived, and under what circumstances. Could you be charged for insurance by the mile of human driving mode? That might be ok, but could they also obtain driving habit information they shouldn't have access to? And in an accident, what information is germane and what is off limits?

Lots of discussions going on. The insurance trade groups, law scholars, business people, state administrators, etc. It's been an interesting read. 

Peace,
Tom


----------



## James Long

Tom Robertson said:


> As you can guess, there are specific legal requirements for insuring guest drivers.


Certainly ... I live in one of the states that requires minimum coverage (PLPD) on every car driven. An insurance company could refuse to insure a specific driver (back in my wild days Geico refused to insure my parents unless I was removed from their policy and forbidden from driving their car). As long as it is disclosed, perhaps they could refuse to cover guest drivers ... but I would not buy such insurance. (Is my mechanic a guest driver? A valet would be a guest driver.) One would need to check state statutes to figure out whether a policy barring guest drivers would be legal. (As noted in my previous post, calling a regular driver a guest would be forbidden on my policy. Anyone who drives on a regular basis MUST be listed.)



Tom Robertson said:


> Though most insurance is not required to cover my driving a rental car unless the car I'm listed as a driver is idle at the time I'm driving the rental.


Wow. I hope my policy is not that restrictive. There are times where one may need to drive a rental at the same time their car is being driven (obviously by someone else). I suppose the key is not to wreck both cars at the same time.



Tom Robertson said:


> While listing the autonomous unit as a driver is an interesting concept, just knowing the make/model will suffice. If the manufacturer covers the autonomous mode it will be calculated in the pricing for that model, I should think.


Calculated as "we do not cover the autonomous driver" ... just like Geico told my parents. Your insurance is invalid if your son is driving. Perhaps reduced to the minimum allowed by law. Listed driver, full coverage - autonomous driver, PLPD.


----------



## Drucifer

James Long said:


> Not yet. *States need to recognize the car as the driver* and allow the driver to be considered a passenger before distracted driving changes legal status.


Just caught the news bit and not the actual story, but apparently riding an electric bike in NYC is illegal. When and why that law got on the books is a mystery to me; I find it strange that it is even there.

But it just goes to show that it is going to be nearly impossible to get every locale to agree on what is legal.


----------



## Tom Robertson

An update about Nissan: https://www.yahoo.com/autos/on-nissans-road-to-self-driving-cars-a-speedy-223355115.html

The article says they are ahead of their schedule for a 2020 launch of a car that will require human oversight.

They also talk about Nissan's suggestion of LED status indicators so others can tell which mode the car is in.

Peace,
Tom


----------



## Stewart Vernon

Also... there's liability and "liability."

I grant you that taxi and bus fares presumably account for their liability insurance costs, so in that sense I am "paying" but at least I'm only paying per-ride and not on a monthly basis whether I ride or not. BUT, more important to me than just the cost of insurance is ACTUAL liability.

IF I'm riding in a taxi and the taxi driver rear-ends someone... I have no legal liability in that scenario. Same goes for if I'm riding in a car with a friend who wrecks someone... I can't be sued as a passenger for the role of the driver in the accident. But what happens with the automatic car? I own it... but I wasn't driving it... maybe I wasn't even able to drive it because the car doesn't allow human drivers... so could I be sued if my car gets into an accident and severely injures someone? IF the answer is "yes" then I'd rather be driving so that at least I really was liable and not just "liable."

As I alluded to earlier too... my insurance covers other legal drivers IF I give them permission to drive my car... but not if my car is reported stolen... so what happens if someone steals my automatic computer-driving car? Am I still liable for that? I never drive it ever, even when I'm in it... Am I liable if I give permission for someone else to ride in it without me?

It's going to be a legal quagmire I think unless the manufacturers step up and declare themselves legally liable, and I seriously doubt they would ever sign up for that.


----------



## James Long

Stewart Vernon said:


> IF I'm riding in a taxi and the taxi driver rear-ends someone... I have no legal liability in that scenario. Same goes for if I'm riding in a car with a friend who wrecks someone... I can't be sued as a passenger for the role of the driver in the accident.


Rule of thumb: If you have money, you can be sued. Some will sue even if you do not have money. Whether or not the suit is successful is a separate issue. But if you have any connection to the accident someone could sue you.

Umbrella coverage helps in those situations. Insurance to cover what other insurance does not cover (or when claims exceed coverage).



Stewart Vernon said:


> As I alluded to earlier too... my insurance covers other legal drivers IF I give them permission to drive my car... but not if my car is reported stolen... so what happens if someone steals my automatic computer-driving car? Am I still liable for that?


I would not count on not being liable for a stolen car. As noted above, if you have any connection to an accident someone could try to hold you responsible. Your insurance may not cover it ... but not being covered by insurance does not mean you are not liable.

Can an autonomous car be stolen? Especially one without a manual mode? What is the thief going to do, hold a gun to the car's processor and shout "drive"? (Or threaten the car with a large magnet or EMP?)



Stewart Vernon said:


> I never drive it ever, even when I'm in it... Am I liable if I give permission for someone else to ride in it without me?


The answer would depend on how coverage works out for you in the "I never drive" mode. If the car company provides liability insurance in autonomous mode they should (magic word) accept liability regardless of who the passengers are. If you own the vehicle then you have liability regardless of who the driver is. (If your insurance refuses to cover a driver you better get coverage from another insurance.) As the car owner you are likely to be held liable for anything that the car does, regardless of driver.


----------



## James Long

Tom Robertson said:


> An update about Nissan: https://www.yahoo.com/autos/on-nissans-road-to-self-driving-cars-a-speedy-223355115.html


"It's the non-verbal communications among drivers and pedestrians that's proving the toughest challenge for engineers. If you come to an odd three-way stop with a blinking red light, you can likely figure out quickly who's supposed to do what based on other vehicles, a trick that self-driving cars haven't mastered yet. The sensors in the Leaf that track moving objects rely on movement; a pedestrian who's standing still at a crossing may get overlooked by software."

Just one of the many driving situations faced daily. Program the car to be timid and it doesn't move (unless overridden). Imagine your car stopped at a stop sign because of a tree too close to the road waving in the wind. Program the car to be aggressive and you hit someone. Blame it on the other person, but you still hit someone.
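The quoted limitation is easy to see in a toy frame-differencing filter. This is purely illustrative--not what the Leaf (or anyone) actually runs:

```python
# A motion-only tracker flags objects by change between frames, so anything
# standing perfectly still produces no signal at all.
def moving_objects(prev_frame, curr_frame, threshold=0):
    """Return positions whose value changed between two 1-D 'frames'."""
    return [i for i, (a, b) in enumerate(zip(prev_frame, curr_frame))
            if abs(a - b) > threshold]

# Positions 2-3 hold a walking pedestrian (values shift between frames);
# position 5 holds a pedestrian standing still (value constant).
prev = [0, 0, 1, 0, 0, 7, 0]
curr = [0, 0, 0, 1, 0, 7, 0]
print(moving_objects(prev, curr))  # [2, 3] -- the stationary object at 5 is never seen
```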


----------



## Tom Robertson

James Long said:


> Rule of thumb: If you have money, you can be sued. Some will sue even if you do not have money. Whether or not the suit is successful is a separate issue. But if you have any connection to the accident someone could sue you.
> 
> Umbrella coverage helps in those situations. Insurance to cover what other insurance does not cover (or when claims exceed coverage).
> 
> I would not count on not being liable for a stolen car. As noted above, if you have any connection to an accident someone could try to hold you responsible. Your insurance may not cover it ... but not being covered by insurance does not mean you are not liable.
> 
> Can an autonomous car be stolen? Especially one without a manual mode? What is the thief going to do, hold a gun to the car's processor and shout "drive"? (Or threaten the car with a large magnet or EMP?)
> 
> ...


[Trimmed the post only for brevity.]

All well said.

As we've been discussing liability one of the thoughts that occurred to me from the discussion was the liability for "being there." In situations where you've hired a taxi when the weather is less than ideal, as James points out, you can be sued for your part in the decision to be there even though you weren't driving. Granted, it isn't likely, yet that is what insurance does--covers us from catastrophic albeit unlikely events.

My hope is manufacturers take on the bulk of the liability costs, at least for autonomous mode. Partly because I think they can negotiate a better rate for all of us. Partly as a marketing effort. Partly as a reasonable thing to do in the transition while they are proving their technology. 

James, you do bring up another aspect from another angle I hadn't considered. How will this affect umbrella policies? Usually they are pretty inexpensive and I don't expect them to radically change, but a couple thoughts: they might be more common as insurers try to find products to sell and that initially they might be slightly more expensive for people who have autonomous cars.

Part of the autonomous car umbrella surcharge comes from how umbrella policies are loosely based on perceived pocket depth. People who own a boat bigger than a rowboat are perceived as better lawsuit targets. Or who have an RV. Or other toys. So an autonomous car, initially, might be a "deep pocket" signal, as the first models are likely going to be from the more expensive lines.

Peace,
Tom


----------



## Tom Robertson

James Long said:


> "It's the non-verbal communications among drivers and pedestrians that's proving the toughest challenge for engineers. If you come to an odd three-way stop with a blinking red light, you can likely figure out quickly who's supposed to do what based on other vehicles, a trick that self-driving cars haven't mastered yet. The sensors in the Leaf that track moving objects rely on movement; a pedestrian who's standing still at a crossing may get overlooked by software."
> 
> Just one of the many driving situations faced daily. Programming the car to be timid means that the car doesn't move (unless overridden). Imagine your car stopped at a stop sign because a tree too close to the road is waving in the wind. Program the car to be aggressive and you hit someone. Blame it on the other person, but you still hit someone.


I'm glad you brought this up. I found several jewels in the article and didn't want to end up quoting the whole thing. 

Obviously one approach would be yet another set of sensors to distinguish trees from poles from people. Which only adds to the expense. 
Another would be more analysis--which might take more CPU and definitely more development time--though that is why Google has been working on this since 2009 and Nissan has been since at least 2013 (likely longer, as that's when they announced their timeline and had some testing underway.)

Peace,
Tom


----------



## Tom Robertson

Toyota announced a reversal in their stance: http://news.yahoo.com/toyota-set-artificial-intelligence-research-unit-u-041307394--finance.html



> "I used to say, quite until recently, that we will go ahead with automated drive only if they beat humans in a 24-hour car race," Toyota President Akio Toyoda told a news conference.
> "But I changed my mind after I got involved with planning of the 2020 Olympic and Paralympic Games (in Tokyo)," he said, explaining it opened his eyes to the need for cars for the disabled and elderly.


They are planning to put $1B into the project over 5 years.

Peace,
Tom


----------



## James Long

Make this work autonomously ...
http://www.rcgroups.com/forums/showthread.php?t=2252792


----------



## Stewart Vernon

Just to be contrary... what would happen to Nascar?


----------



## James Long

Stewart Vernon said:


> Just to be contrary... what would happen to Nascar?


How about each autonomous car maker enter one car and see who wins? 
Win on Sunday - sell on Monday.


----------



## Drucifer

Stewart Vernon said:


> Just to be contrary... what would happen to Nascar?


I can see 'em making the pace vehicle autonomous.


----------



## phrelin

I just got around to reading *What it's like riding in a million-dollar autonomous Nissan Leaf*, which offers the best candid evaluation by a tech writer I've seen:



> Here's the ironic thing about today's autonomous car development programs: The latest prototypes actually require that the person behind the wheel concentrate more, not less.
> 
> That's because while self-driving vehicles like this Nissan Leaf Piloted Drive 1.0 prototype can do a remarkable job negotiating roads on their own most of the time, a drive of any length and complexity almost always carries with it the specter of an occasional flub or near miss. By contrast, were a human behind the wheel, most of these situations would have never escalated to the point where a need for a momentary swerve or panic braking resulted.
> 
> ...Even with the current state-of-the-art tech's momentary autonomous foibles, it's easy to see the promise such vehicles have for greatly decreasing accident rates and traffic congestion, not to mention for restoring autonomy to the world's elderly and infirm. Autonomous technology isn't just a game-changer for personal transportation, it's poised to usher in a whole new game.


Because I am who I am, I have to say that the computer on my desktop, the tablets I use, and the phone in my hand are all wonders for a guy who began his tech experience with a room-sized computer with a whopping 16k of RAM. Nonetheless, all of these 21st Century wonders not only occasionally have hardware failures but also glitch. I'll use "glitch" because as it is defined by Wikipedia it expresses what happens most accurately: "A glitch is a short-lived fault in a system. It is often used to describe a transient fault that corrects itself, and is therefore difficult to troubleshoot."

The problem is that engineering relies on the idea of "tolerance," well expressed as follows:



> A tolerance is the limit of acceptable unintended deviation from a nominal or theoretical dimension. Therefore, a pair of tolerances, upper and lower, defines a range within which an actual dimension may fall while still being acceptable.


We do attempt to define "tolerance" in terms of risk for human drivers. Those who fall outside the "acceptable" range lose their driver's license. But we know that every driver represents some level of risk to others on the road with them.

Now we're talking about engineering autonomous vehicles. What the articles are telling us about prototype autonomous car systems is that right now the level of risk is unacceptable, so the "real human" driver is expected to be super-attentive and wrestle the vehicle out of harm's way.

In the future the car is to be driving itself with no super-attentive human driver, maybe no human driver at all. There will be a risk, literally an engineered tolerance: X failures per million miles is acceptable. Those failures include an expected Y deaths per million miles.

We avoid talking about the failures of the human driver licensing system in engineering-speak. We talk about it in legal-speak or simply by hand-wringing when tragedy happens. I can't get anyone to speak honestly about it. People will get angry and shout fearfully about child murderers as a risk to their kids even though numerically the number of child deaths annually from those killers is a minuscule fraction of the number of child deaths in automobile accidents each year.

So how many child deaths per million miles will be an allowable tolerance in the design of autonomous vehicles?


----------



## Tom Robertson

Stewart Vernon said:


> Just to be contrary... what would happen to Nascar?


Same thing that happened to the Kentucky Derby. 

Peace,
Tom


----------



## yosoyellobo

Interesting article, Phrelin. I got a chuckle out of the line "it changed lanes nearly every time."


----------



## Tom Robertson

phrelin said:


> I just got around to reading this What it's like riding in a million-dollar autonomous Nissan Leaf which offers the best candid evaluation by a tech writer I've seen:


It's interesting to read these articles for the biases everyone brings to the table: the writers, projecting whether or not we'll ever be fully autonomous; the companies, most still thinking humans can be kept aware while the cars either assist with or actually perform the driving; and everyone's sense of which problems are hard to solve. Also interesting is where each company is in the process. Google seems to be farthest along with testing, as the problems identified in this article mirror some Google solved months ago.

Specifically about keeping humans aware, awake, and able to take over--why do the companies think they can get humans to stay aware with the car doing more of the driving when humans aren't aware now whilst doing all the driving? Google saw the driver of a normal car playing a trumpet!



phrelin said:


> ...We do attempt to define "tolerance" in terms of risk for human drivers. Those who fall outside the "acceptable" range lose their driver's license. But we know that every driver represents some level of risk to others on the road with them.
> 
> Now we're talking about engineering autonomous vehicles. What the articles are telling us about prototype autonomous car systems is that right now the level of risk is unacceptable, so the "real human" driver is expected to be super-attentive and wrestle the vehicle out of harm's way.
> 
> In the future the car is to be driving itself with no super-attentive human driver, maybe no human driver at all. There will be a risk, literally an engineered tolerance: X failures per million miles is acceptable. Those failures include an expected Y deaths per million miles.
> 
> We avoid talking about the failures of the human driver licensing system in engineering-speak. We talk about it in legal-speak or simply by hand-wringing when tragedy happens. I can't get anyone to speak honestly about it. People will get angry and shout fearfully about child murderers as a risk to their kids even though numerically the number of child deaths annually from those killers is a minuscule fraction of the number of child deaths in automobile accidents each year.
> 
> So how many child deaths per million miles will be an allowable tolerance in the design of autonomous vehicles?


As you probably know, generally there are metrics describing the risk and tolerance factors in most aspects of driving, from tire failure to human failures (broken down into many categories), to weather, etc. We already have today's rate. So it seems to me the question is not necessarily how many child deaths per million miles, but rather what level of improvement is necessary before we'll implement?

Thinking out loud, coupling the two thoughts together, I want the measurement of improvement to include results from the whole solution. For instance, if it does require an aware human, the tests should be after the drivers have accepted and relaxed. As Google and Tesla have discovered, we humans tend to stop paying attention pretty quickly. 

Peace,
Tom


----------



## James Long

Tom Robertson said:


> Specifically about keeping humans aware, awake, and able to take over--why do the companies think they can get humans to stay aware with the car doing more of the driving when humans aren't aware now whilst doing all the driving?


Isn't focusing on the worst in human driving just as bad as focusing on the worst in autonomous driving?

The human record is 1.27 deaths per 100 million miles traveled, or one death for each 6,200 driver's licenses. The accident rate was 185 per 100 million miles. (Figures from 2008 or 2009.) A really lousy record if you are or know one of the dead, or were part of one of the accidents. But the numbers have been declining over the past few decades.
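(As a back-of-the-envelope cross-check of those two figures -- purely illustrative, and assuming both describe the same period -- they imply roughly 12,700 miles driven per license:)

```python
# Cross-check the quoted safety figures (illustrative arithmetic only).
deaths_per_100m_miles = 1.27
licenses_per_death = 6_200

# Miles driven per death, then miles per license: if both figures cover the
# same period, each license accounts for roughly this many miles.
miles_per_death = 100_000_000 / deaths_per_100m_miles
miles_per_license = miles_per_death / licenses_per_death

print(round(miles_per_license))  # roughly 12,700 miles per license
```

That lands close to typical annual mileage per driver, which suggests the two quoted figures are at least mutually consistent.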

Distracted driving is a problem ... one (along with DUI) that will be "solved" by autonomous cars. But the "fall back to human" mode cars are introducing a new distraction - don't pay attention, the car will drive for you - until it doesn't. And in the examples given (such as the car straightening out in the middle of a turn) the moments that the human needs to take control come with little warning and plenty of risk.

Good news for the "autonomous" car proponents who can blame the human driver for not being alert or sober when the car needs human guidance at a moment's notice. Although I do appreciate recognition that keeping the driver engaged is a challenge. Autonomous cars are going to need to get past the problem of needing humans before they will be safe.


----------



## Drucifer

Anyone know how the Space Shuttle landed?


----------



## yosoyellobo

Drucifer said:


> Anyone know how the Space Shuttle landed?


Carefully.


----------



## phrelin

Drucifer said:


> Anyone know how the Space Shuttle landed?


Not by looking to see if the crossing guard was about to step out waving her sign....


----------



## phrelin

Tom Robertson said:


> As you probably know, generally, there are metrics describing the risk and tolerance factors in most aspects of driving from tire failure to human failures (broken down into many categories), to weather, etc. We already have today's rate. So it seems to me the question is not necessarily how many child deaths per million miles but rather what level of improvement is necessarily before we'll implement?


I agree but....

Tire failures are absent in our consideration of risks while strapping the kid into the car seat. But _other_ _drivers_ are not absent from our consideration.

After all we're going to have two cars driven by a myriad-of-electronic-1's-and-0's-inside-a-bunch-of-Chinese-manufactured-chips traveling towards each other at 60 mph with only a line painted on the asphalt separating them and no human being responsible. I'm not sure we will be able to make the transition from "I'm a responsible driver" and "he is an irresponsible driver" to "I'm not the responsible driver" and "the guy in the other car is not the responsible driver."

Of course, I've always said an alien race visiting right now would dismiss us as lacking intelligence once they saw that we choose to travel in cars hurtling towards each other at 60 mph with only a line painted on the asphalt separating them. :sure:


----------



## James Long

phrelin said:


> Of course, I've always said an alien race visiting right now would dismiss us as lacking intelligence once they saw that we choose to travel in cars hurtling towards each other at 60 mph with only a line painted on the asphalt separating them. :sure:


The AI team is working on that. I suspect the aliens would be impressed by a system where autonomous cars talked to each other and avoided each other (which will not happen until all cars are at least partially autonomous). But anything less than what they have on their home planet would probably not impress them. (And it is likely they have done better, since they have also managed to travel through space to visit our planet.)

BTW: Promos for CSI:Cyber Sunday night show remote control cars being used.


----------



## Tom Robertson

James Long said:


> Isn't focusing on the worst in human driving just as bad as focusing on the worst in autonomous driving?


Potentially. There are definitely times when both should be discussed, and in this example about distracted driving, some anecdotes seem reasonable. Besides--if an autonomous car makes a mistake, the fix can be rolled out to every car within minutes. Humans will collectively keep making the same mistakes.



James Long said:


> Isn't focusing on the worst in human driving just as bad as focusing on the worst in autonomous driving?
> 
> The human record is 1.27 deaths per 100 million miles traveled or one death for each 6,200 drivers licenses. The accident rate was 185 per 100 million miles. (Figures from 2008 or 2009.) A really lousy record if you are or know one of the dead or were part of one of the accidents. But numbers that have been declining over the past few decades.
> 
> Distracted driving is a problem ... one (along with DUI) that will be "solved" by autonomous cars. But the "fall back to human" mode cars are introducing a new distraction - don't pay attention, the car will drive for you - until it doesn't. And in the examples given (such as the car straightening out in the middle of a turn) the moments that the human needs to take control come with little warning and plenty of risk.
> 
> Good news for the "autonomous" car proponents who can blame the human driver for not being alert or sober when the car needs human guidance at a moment's notice. Although I do appreciate recognition that keeping the driver engaged is a challenge. Autonomous cars are going to need to get past the problem of needing humans before they will be safe.


Completely agree that we need to get beyond needing humans as participants in driving.

Peace,
Tom


----------



## Stewart Vernon

Tom Robertson said:


> Same thing that happened to the Kentucky Derby.


Not quite apples to apples, though... People didn't replace horses with cars because cars were safer... Arguably riding a horse is safer than driving a car when you consider that a horse is a co-pilot. You can drive your car off a cliff, but you'd be hard pressed to ride a horse off a cliff! Horses also aren't known for having collisions with other horses and they don't go nearly as fast as a car. Horses were replaced by faster/stronger cars and trucks because other features that the horse couldn't provide were deemed to outweigh the dangers added.

So, what features would an automatic car provide over a human-driven car that would outweigh any new risks? And... would human drivers even be allowed? People are still allowed to ride horses, and some do it for fun... the same could be true for cars as well... but only if it wasn't outlawed. I could see a scenario where humans might become forbidden from driving cars due to "Safety concerns."

All that said... here's another gear (see what I did there?)

I don't think computers have to be perfect to replace humans. That's a common fallacy. I'm not even sure they have to be better to replace human drivers either. Humans aren't perfect, yet we get on the roads with millions of them every day! Hundreds and perhaps thousands in your local area even... so I'm not going to argue that the computer needs to be perfect. BUT the companies developing them sure seem to want to do that... or at least they want to argue they are better than humans. I think that's a moot argument, BUT if the companies want to argue that to support their vehicles, I don't see why they'd fight the liability part of it. I grant you that we'd still pay for it in the form of the cost of the car... but as long as I couldn't be sued for an accident, I'd take that concern off the table.

I don't feel comfortable right now with the idea of the computer car driving in certain conditions, as we have discussed like snow and rain and such... but to be fair, I don't just automatically trust other people either! There are people I wouldn't want to drive me around, and others that I'm okay with but not at night or in certain weather scenarios... so again, I'm not expecting perfection out of the computer.

I'll throw a bone out there too... "What if" you could have a learning mode for the computer where you'd be allowed to drive and the car would monitor your control and learn from that. Obviously you'd have to have some check/balance algorithm to account for people who disobey traffic laws and whatnot... but your car could be programmed to learn from the good things you do as a driver to make itself safer. IF they also made such a thing a networked feature where all the cars could learn from all the drivers and share that info, you could probably make some huge leaps in safety once you get a significant number of them on the road.
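A minimal sketch of that learning idea, purely hypothetical (the function names and the speed-only "check/balance" filter are invented here for illustration): each car records the human's observed speed choices, discards samples that break the posted limit, and the fleet pools the lawful samples into one shared value.

```python
from statistics import mean

def filter_lawful(samples, speed_limit):
    """Check/balance step: keep only speed observations that obey the posted limit."""
    return [s for s in samples if 0 < s <= speed_limit]

def fleet_speed_model(per_car_samples, speed_limit):
    """Pool lawful observations from many cars into a single shared speed value."""
    lawful = []
    for samples in per_car_samples:
        lawful.extend(filter_lawful(samples, speed_limit))
    return mean(lawful) if lawful else None

# Three cars report what their human drivers did in a 35 mph zone;
# the 48 mph sample is rejected as a traffic-law violation.
fleet = [[33, 34, 48], [30, 32], [35]]
print(fleet_speed_model(fleet, 35))  # 32.8
```

A real system would obviously learn far richer behavior than a single speed, but the shape is the same: local observation, a legality filter, and networked aggregation.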


----------



## yosoyellobo

Could you drive a self driving car to water and make it drink it?


----------



## James Long

Stewart Vernon said:


> I could see a scenario where humans might become forbidden from driving cars due to "Safety concerns."


I hope not. So far horses have not been banned from roads despite "safety concerns". If horse-drawn vehicles and farm implements are allowed on roads, I do not see how human-driven cars could be banned. Perhaps by type of road? There are freeways that have a minimum speed, minimum vehicle requirements, and restrictions on trucks to make travel safer. But I do not see a total ban.



Stewart Vernon said:


> Humans aren't perfect, yet we get on the roads with millions of them every day!


Humans are aware that they are not perfect. Is a computer? Fault detection needs to be built in to the point where the computer car knows that it did something wrong and then corrects for the error. Humans have trouble recognizing errors as well - some have more problems than others. But the design for operating a car is closer to developing an AI than a simple program to keep a vehicle between lines on marked pavement.



Stewart Vernon said:


> ... but as long as I couldn't be sued for an accident, I'd take that concern off the table.


I would not expect a complete waiver of liability for a car that is owned. A self driving cab or shared vehicle would be the responsibility of the owner and LESS the responsibility of the user. But as long as there are lawyers there will be someone claiming the owner and user is liable.


----------



## yosoyellobo

James Long said:


> I would not expect a complete waiver of liability for a car that is owned. A self driving cab or shared vehicle would be the responsibility of the owner and LESS the responsibility of the user. But as long as there are lawyers there will be someone claiming the owner and user is liable.


It really should not be any different than it is now. If I have an accident, I notify the insurance company and let them worry about it. In this case I'd notify the car company's insurer.

PS: I don't expect to pay a deductible.


----------



## Tom Robertson

Regarding liability in general, a couple of thoughts: 1) I'm still responsible for my decisions; 2) in a study done by the NHTSA, 94% of accidents were attributed to the human, 2% to the vehicle, 2% to the environment, and 2% to unknown causes; 3) driver alcohol is involved in 36% of fatalities; 4) fatalities and accidents per 100M miles driven continue to go down. NHTSA report

So if I tell the car to drive in marginal conditions, I could be held liable for my decision. So I'd need to carry some insurance just to be safe. Or if I don't reasonably maintain the vehicle. But this is the small part of the liability pie.

If the manufacturers can eliminate a good chunk of human error in the driving, insurance should drop a lot. So the manufacturers, especially with their economy of scale, might find it very inexpensive to carry the liability for us. Wouldn't that be nice. 

Peace,
Tom


----------



## Drucifer

In a little over a week, 12 pedestrians have been killed in NYC alone. None were violating any traffic laws, nor were the drivers under the influence.


----------



## peds48

Drucifer said:


> In a little over a week, 12 pedestrians have been killed in NYC alone. None were violating any traffic laws, nor were the drivers under the influence.


So if no violations occurred, how did those accidents happen?

Sent from my iPhone using Tapatalk


----------



## Drucifer

peds48 said:


> so if no violations occured, how did those accidents happen?
> 
> Sent from my iPhone using Tapatalk


One was a bus making a turn, with the driver claiming he didn't see anyone. Same story from the cab driver who had worked 15 of the previous 16 hours. Two others were cars that lost control because the drivers had health issues. And the last was a senior hitting the gas instead of the brake; she plowed into a large group.


----------



## Stewart Vernon

Those all sound like they SHOULD have been moving violations... negligent driving and failure to yield to a pedestrian cover a lot of these scenarios.


----------



## yosoyellobo

A cop pulled over a Google self-driving car and found no driver to ticket... http://www.cnn.com/2015/11/13/us/google-self-driving-car-pulled-over/index.html
Boy, would I have loved to be there.


----------



## James Long

24 in a 35 ... that is the kind of "timid" driving that I would expect from an autonomous car. 

It is a shame that the car did not get a ticket for impeding traffic. Despite Google's joke, I suspect that people have been pulled over for driving too slowly.


----------



## Drucifer

Cops were probably on their donuts run and were upset with the speed of traffic


----------



## Stewart Vernon

Well... that's a driverless car with no passenger... but what if someone was riding in the back seat? Would that person have been written a ticket instead?


----------



## Drucifer

Yeah, driving more than ten miles per hour under the speed limit is a violation in most states. The vehicle should never allow itself to be driven in such a manner.


----------



## James Long

Stewart Vernon said:


> Well... that's a driverless car with no passenger... but what if someone was riding in the back seat? Would that person have been written a ticket instead?


There was a passenger in the car that was stopped. I believe all of the cars on public roads have "pilots" during the tests. The article mentions how the "passenger" explained the situation to the officer, who had questions about how the car chose a speed to drive.


----------



## Tom Robertson

Drucifer said:


> Yeah, driving more than ten miles per hour under the speed limit is a violation in most states. The vehicle should never allow itself to be driven in such a manner.


In California, this class of car is allowed to drive up to 25mph in zones with speed limits up to 35mph. It was totally legal. The officer was informing the safety drivers that if traffic backed up behind the car, it would need to pull over to let the traffic pass.

No ticket necessary, fully legal.

What I don't know is why the officer picked this car, on this day, to notice that they go slower than 35. Google has regular meetings with the Mountain View police department, and these cars have been around Mountain View for quite some time. Did he not know about Google cars in his city?

Peace,
Tom


----------



## Tom Robertson

James Long said:


> There was a passenger in the car that was stopped. I believe all of the cars on public roads have "pilots" during the tests. The article mentions how the "passenger" explained the situation to the officer, who had questions on how the car choose a speed to drive.


You are correct. My understanding is the cars are required to have 2 safety drivers while operating under testing licenses in California. (And I think they use 2 drivers in Austin, Texas as well.)

Peace,
Tom


----------



## James Long

"This afternoon a Mountain View Police Department traffic officer noticed traffic backing up behind a slow moving car traveling in the eastbound #3 lane on El Camino Real, near Rengstorff Ave. The car was traveling at 24 mph in a 35 mph zone. As the officer approached the slow moving car he realized it was a Google Autonomous Vehicle. The officer stopped the car and made contact with the operators to learn more about how the car was choosing speeds along certain roadways and to educate the operators about impeding traffic per 22400(a) of the California Vehicle Code. The Google self-driving cars operate under the Neighborhood Electric Vehicle Definition per 385.5 of the California Vehicle Code and can only be operated on roadways with speed limits at or under 35 mph. In this case, it was lawful for the car to be traveling on the street as El Camino Real is rated at 35 mph.

The Mountain View Police Department meets regularly with Google to ensure that their vehicles operate safely in our community."

http://mountainviewpoliceblog.com/2015/11/12/inquiring-minds-want-to-know/


----------



## Tom Robertson

James Long said:


> 24 in a 35 ... that is the kind of "timid" driving that I would expect from an autonomous car.
> 
> It is a shame that the car did not get a ticket for impeding traffic. Despite Google's joke, I suspect that people have been pulled over for driving too slow.


This was a totally legal situation for this class of car, hence no tickets. Now, if there had been a row of traffic backed up behind the car, the safety driver(s) could have been ticketed if they did not let the traffic pass.

The cars are legally prohibited from driving faster than 25mph and may only drive in zones where the speed limit is 35mph or less.

Peace,
Tom


----------



## Tom Robertson

James Long said:


> The article noted that the officer didn't notice it was a Google car until he stopped it. I thought the cars were more obvious than that.


Actually the phrase was "as he approached the ... car". He realized before the stop, but decided to remind the safety drivers of the specifics of the law about traffic backing up behind a slower-moving vehicle.

Peace,
Tom


----------



## James Long

The scene ...

https://www.google.com/maps/@37.3956878,-122.1015606,3a,75y,306.9h,87.66t/data=!3m7!1e1!3m5!1sXe-ErZQM7jsd_T3flioigw!2e0!6s%2F%2Fgeo0.ggpht.com%2Fcbk%3Fpanoid%3DXe-ErZQM7jsd_T3flioigw%26output%3Dthumbnail%26cb_client%3Dmaps_sv.tactile.gps%26thumb%3D2%26w%3D203%26h%3D100%26yaw%3D344.29419%26pitch%3D0!7i13312!8i6656

I'm not sure how the road is numbered ... which is the "#3 lane" as noted in the police report?

BTW: This car would not get me to work. There is no route where the speed limit stays under 35 MPH.


----------



## Drucifer

Tom Robertson said:


> *In California*, this class of car is allowed to drive up to 25mph in zones with speed limits up to 35mph. It was totally legal. The officer was informing the safety drivers that if traffic backed up behind the car, it would need to pull over to let the traffic pass.
> 
> No ticket necessary, fully legal.
> 
> What I don't know is why the officer picked this car on this day to notice they go slower than 35. Google has regular meetings with the Mountain View police department, and these cars have been around Mountain view for quite some time. Did he not know about Google cars in his city?
> 
> Peace,
> Tom


I stated 'most' and not 'all', as I know firsthand that you can drive as slow as you want on any Vermont state highway. My uncle says it is the same for the interstate highways in Vermont, but I just can't believe that.


----------



## phrelin

Tom Robertson said:


> In California, this class of car is allowed to drive up to 25mph in zones with speed limits up to 35mph. It was totally legal. The officer was informing the safety drivers that if traffic backed up behind the car, it would need to pull over to let the traffic pass.
> 
> No ticket necessary, fully legal.
> 
> What I don't know is why the officer picked this car on this day to notice they go slower than 35. Google has regular meetings with the Mountain View police department, and these cars have been around Mountain view for quite some time. Did he not know about Google cars in his city?
> 
> Peace,
> Tom


The officer did what he was supposed to do. He noticed traffic backed up behind the vehicle. Under California law, if you have more than five vehicles backed up behind you, you must pull over and let them pass. All that has to happen is for somebody to program a routine that counts the vehicles behind the car and directs it to pull over as soon as it is safe. It also needs to tell the car to speed up if it can't pull over. At this point Mountain View may be tolerating things a little too much, IMHO. The car should know these kinds of rules by now.
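That routine is simple enough to sketch (a hypothetical illustration only -- `pullover_decision` and its parameters are invented here, and a real car would need far more context than a vehicle count):

```python
def pullover_decision(vehicles_behind, safe_to_pull_over, speed, speed_limit, cap=25):
    """Sketch of the rule described above: yield when traffic queues up.

    Under the California rule quoted in the post, a slow vehicle with more
    than five vehicles backed up behind it must let them pass. This car
    class is also capped at 25 mph, so "speed up" can only reach the lower
    of the cap and the posted limit.
    """
    if vehicles_behind > 5:
        if safe_to_pull_over:
            return "pull over and let traffic pass"
        target = min(cap, speed_limit)
        if speed < target:
            return f"speed up toward {target} mph"
        return "keep looking for a safe place to pull over"
    return "continue"

print(pullover_decision(6, True, 24, 35))   # pull over and let traffic pass
print(pullover_decision(6, False, 24, 35))  # speed up toward 25 mph
```

Even this toy version shows the wrinkle the officer raised: with the 25 mph cap, "speed up" barely helps in a 35 mph zone, so pulling over is the only real remedy.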


----------



## phrelin

By the way, another article on the same subject at Information Week headlined *Daimler Tests Self-Driving Truck On German Autobahn* confirms my suspicions:



> The company pointed out that autonomous driving has considerable advantages, especially in the road freight transport sector.
> 
> Among the many benefits is safety, especially for long-haul trucking. With the Highway Pilot system always active, the system never suffers fatigue or becomes distracted. A study by Daimler Trucks indicated that driver fatigue decreases by 25% if the operators are relieved of monotonous lane-keeping.
> 
> In addition to safety, autonomous driving can also free up the driver to do other tasks, such as completing documents on a tablet while on the road, which improves productivity, according to the company. The truck can also burn less fuel thanks to optimum gear-shifting, acceleration, and braking.


IMHO it is in trucking that the significant benefits and high profits from these driving enhancements will be found.


----------



## inkahauts

In California, if conditions are safe, the minimum speed on a freeway is 45 mph. There is no minimum on non-freeways that I have ever seen, just that you shouldn't hold up traffic and should pull over for faster drivers. But I believe you can drive 10 mph down a road with a 55 mph speed limit if you want.

And I believe that's the case in part because of our farming and tractors on the road etc.

Edit:

After further reading, the 45 mph minimum applies only where actually posted after a traffic engineering survey. Otherwise the rule is simply that you can't impede other drivers who are driving properly and within the speed limit.


----------



## James Long

phrelin said:


> By the way, another article on the same subject at Information Week headlined *Daimler Tests Self-Driving Truck On German Autobahn* confirms my suspicions:
> 
> IMHO it is in trucking that the significant benefits and high profits from these driving enhancements will be found.


I agree. An "advanced cruise control" is a good use of the technology - once it gets up to speed.


----------



## James Long

BTW: Here are the rules that the Google Car was operating under ---

Neighborhood Electric Vehicle (NEV)/Low-Speed Vehicle (LSV) and Golf Cart Registration (FFVR 37)
https://www.dmv.ca.gov/portal/dmv/?1dmy&urile=wcmath:/dmv_content_en/dmv/pubs/brochures/fast_facts/ffvr37


----------



## phrelin

The New York Times piece *The Dream Life of Driverless Cars* offers an interesting take on the complexities of the type of 3-D scanning used in autonomous cars. It's a good read.


----------



## James Long

I noticed on "The Good Wife" last night that one of the stories was about an Autonomous car with a test pilot. The car was in an accident where the test pilot claimed the car caused the accident - there was some discussion of AI and what decisions the car could make.

Spoiler Alert: The car was hacked by the pilot's friends.


----------



## Stewart Vernon

It was a timely episode for this conversation! They also took into account other things we have talked about here, like the car needing to drive to learn, the experience needed in weather conditions, the ability for a human driver to override the computer in some cases, and attempts to program the car to be "more human" by making it obey fewer traffic laws in some cases. They brought up the 4-way stop as an example: the computer car would keep waiting indefinitely, because the rules say all cars must come to a complete stop at the sign before yielding right of way, and since so many human drivers never technically stopped fully, the computer car would just sit there forever. So they programmed in rolling "stops" to try to make it more compatible with human driver interaction.


----------



## Tom Robertson

Shoe's take on the subject: http://www.gocomics.com/shoe/2015/11/23

Peace,
Tom


----------



## Stewart Vernon

I'm with Shoe... Roomba technology ought to be perfected before the self-driving car!


----------



## phrelin

The Atlantic offered up this article earlier this week *The High-Stakes Race to Rid the World of Human Drivers* which begins with these paragraphs...



> The race to bring driverless cars to the masses is only just beginning, but already it is a fight for the ages. The competition is fierce, secretive, and elite. It pits Apple against Google against Tesla against Uber: all titans of Silicon Valley, in many ways as enigmatic as they are revered.
> 
> As these technology giants zero in on the car industry, global automakers are being forced to dramatically rethink what it means to build a vehicle for the first time in a century. Aspects of this race evoke several pivotal moments in technological history: the construction of railroads, the dawn of electric light, the birth of the automobile, the beginning of aviation. There's no precedent for what engineers are trying to build now, and no single blueprint for how to build it.
> 
> Self-driving cars promise to create a new kind of leisure, offering passengers additional time for reading books, writing email, knitting, practicing an instrument, cracking open a beer, taking a catnap, and any number of other diversions. People who are unable to drive themselves could experience a new kind of independence. And self-driving cars could re-contextualize land-use on massive scales. In this imagined mobility utopia, drone trucks would haul packages across the country and no human would have to circle a city block in search of a parking spot.


...and goes on to provide a surprisingly thorough overview and status report.


----------



## phrelin

In an *Associated Press story* appearing in hundreds of newspaper websites, California's rules for "self-driving cars" have been released and do require a driver behind the wheel to take over. But the more interesting element has to do with initial permitting of these cars and licensing of their driver-owners (note that they don't call them "driverless" or "autonomous"):



> The draft sets out the framework for how the state's Department of Motor Vehicles wants to move from the current small-scale testing of prototypes on roads and highways to giving consumers access to the fast-evolving technology. The DMV can change the rules over the coming months before they are finalized, and the industry is likely to contest them as overly burdensome.
> 
> Under the draft rules, even if Google thinks its car is ready for sale, that wouldn't be immediately possible. Initially, manufacturers would receive a permit for three years, during which time consumers could lease the cars but manufacturers would be required to keep tabs on how safely they are driving and report that performance to the state.
> 
> Before granting that initial permit, both the manufacturer and an independent certifier would need to sign off that the car has passed safety testing. Any person who wants to lease or use one of the cars would need special training provided by the manufacturer, and then receive a special certification on their driver's license.


It seems like an adequately conservative approach, restraining both manufacturers and enthusiasts.


----------



## yosoyellobo

If they had been around in the early days of the automobile Henry Ford would still be waiting for approval.


----------



## dennisj00

yosoyellobo said:


> If they had been around in the early days of the automobile Henry Ford would still be waiting for approval.


I've always said if the internal combustion engine was developed today, we wouldn't have it. Can you imagine carrying 20 gallons of gasoline around in a car? !!


----------



## James Long

phrelin said:


> It seems like an adequately conservative approach, restraining both manufacturers and enthusiasts.


The challenge is staying ahead of the technology. Car manufacturers like Tesla could introduce "auto drive" one step at a time but at what point would they cross a line where compliance with the new law would be needed? It is easier with Google cars that are purpose built to be "auto drive". The cars will have to pass government inspection just like any new vehicle introduced. But how does the law handle a car that got a feature added by the manufacturer after it was purchased?


----------



## Tom Robertson

phrelin said:


> In an *Associated Press story* appearing in hundreds of newspaper websites, California's rules for "self-driving cars" have been released and do require a driver behind the wheel to take over. But the more interesting element has to do with initial permitting of these cars and licensing of their driver-owners (note that they don't call them "driverless" or "autonomous"):
> 
> It seems like an adequately conservative approach, restraining both manufacturers and enthusiasts.


Based on the real world driver experiences of Google and Tesla, the California requirements provide fake safety while actually potentially creating more danger.

By requiring human interaction in the requirements, manufacturers will not be required to be as safety conscious. They can pass that off onto the human. But the reality is the humans are not able to maintain full awareness when the autopilot is doing most of the work. Heck, we already fall asleep with all the work we have to do now. 

I can understand Google's disappointment. I am too.

While I can appreciate requiring a steering wheel and throttle in initial models, as I do expect some manual-mode driving from time to time, requiring a human to stay alert feels like walking down the center of the highway. As Mr. Miyagi says: "Walk left side, safe. Walk right side, safe. Walk middle, sooner or later [makes squish gesture] get squish just like grape."

Hopefully the public meetings will sort out these kinds of things.

The California DMV website: https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/auto

Peace,
Tom


----------



## James Long

yosoyellobo said:


> If they had been around in the early days of the automobile Henry Ford would still be waiting for approval.


I was not around to witness it, but there was an approval process for horseless carriages. Fortunately the things were slow and adoption was not immediate. They eased their way into the marketplace as people got used to their presence.

Google's 25 MPH "golf cart" licensed vehicles are easing automation's way into use. Henry Ford wasn't building vehicles that would hurtle themselves at each other at 55+ MPH separated only by a painted line (or less).



dennisj00 said:


> I've always said if the internal combustion engine was developed today, we wouldn't have it. Can you imagine carrying 20 gallons of gasoline around in a car? !!


Nope. The Model T had an eight or ten gallon tank. One can get a 16 gallon tank for a Model T - but carrying around 20 gallons of gas wasn't common (unless one had extra gas cans strapped on the car).


----------



## Rickt1962

To hell with self driving cars ! I want to see Motor homes ! Now that would be a great trip !


----------



## dennisj00

James Long said:


> Nope. The Model T had an eight or ten gallon tank. One can get a 16 gallon tank for a Model T - but carrying around 20 gallons of gas wasn't common (unless one had extra gas cans strapped on the car).


You certainly took me too literally. I was just relating to today's acceptance (or lack of) of hazardous / flammable materials - not counting the pollution and inefficient aspects.

And rightly so, leaded gas almost killed all of us.

A few years ago, there were alarming articles about the hazard of batteries in EVs in a wreck. Those fears haven't materialized.


----------



## James Long

Electric vehicles are still waiting for their Pinto moment. Hopefully it will never come and all will be bliss.


----------



## Drucifer

__ https://twitter.com/i/web/status/678320188471799809


----------



## Drucifer

__ https://twitter.com/i/web/status/677996862020960256


----------



## James Long

From the Bloomberg Business article ...
*The self-driving car, that cutting-edge creation that's supposed to lead to a world without accidents, is achieving the exact opposite right now: The vehicles have racked up a crash rate double that of those with human drivers.

The glitch?

They obey the law all the time, as in, without exception.*
Not the car's fault, of course. But double the accident rate is not good for business.

*"We end up being cautious," Rajkumar said. "We don't want to get into an accident because that would be front-page news. People expect more of autonomous cars."*


----------



## phrelin

Yeah, the amazing thing is California's DMV published their proposed rules requiring a three-year certification period and a driver, and you would think they had proposed to shut down Google. The politics of this are going to shift into the new norm of name-calling. Meanwhile our next-Governor-most-likely has already joined in the criticism.

The problem here is Google isn't even producing a car yet. On the other hand ordinary car companies can add "self-driving" features relatively easily. And then there's Tesla which released a beta version on the road. :nono2:


----------



## Drucifer

Yep, the self-driving vehicle needs huge rear-facing flashing signs.

One, 'I am computer driven, so I stop for pedestrians,' and two, 'Vehicle has rearview camera. It reports tailgaters automatically.'


----------



## phrelin

And so *Google Pairs With Ford To Build Self-Driving Cars*. The article tells us:



> Google and Ford will create a joint venture to build self-driving vehicles with Google's technology, a huge step by both companies toward a new business of automated ride sharing, Yahoo Autos has learned.
> 
> According to three sources familiar with the plans, the partnership is set to be announced by Ford at the Consumer Electronics Show in January. By pairing with Google, Ford gets a massive boost in self-driving software development; while the automaker has been experimenting with its own systems for years, it only revealed plans this month to begin testing on public streets in California. Google has 53 test vehicles on the road in California and Texas, with 1.3 million miles logged in autonomous driving.
> 
> By pairing with Ford, the search-engine giant avoids spending billions of dollars and several years that building its own automotive manufacturing expertise would require. Earlier this year, Google co-founder Sergey Brin said the company was looking for manufacturing partners that would use the company's self-driving system, which it believes could someday eliminate the roughly 33,000 annual deaths on U.S. roads.


That article is from Yahoo Autos. Given that Yahoo and Google headquarters are about 10 minutes apart, I suspect the information is accurate. And, of course, Yahoo CEO Marissa Mayer is a former Google VP.

Automotive News also reported the following:



> Ford spokesman Alan Hall neither confirmed nor denied a possible deal. "We work with a lot of tech companies all over the world. We keep these discussions private for obvious competitive reasons and we do not comment on speculation," he said.
> 
> Google has added two veteran Ford executives to its leadership team. Former Ford CEO Alan Mulally joined Google's board of directors eight days after he retired from the automaker on July 1, 2014. Then in September, Google hired John Krafcik as CEO of the company's Self-Driving Car Project. Krafcik, who most recently was president of TrueCar Inc., was CEO of Hyundai Motor America. He spent 14 years at Ford, including a stint as chief engineer during the development of the Ford Expedition SUV.


I would think they could get a production-model prototype out there soon to start the three-year clock running. But a cynic like me can now speculate about why Google was so upset - the DMV's red-tape announcement interfered with the timing of the coming CES announcement, perhaps putting a damper on the potential stock price boost.


----------



## Tom Robertson

phrelin said:


> I would think they could get a production model prototype out there soon to start the three year clock running. But a cynic like me can now speculate why Google was so upset - the red tape announcement of DMV interfered with the timing of the coming announcement at CES, perhaps putting a damper on the potential stock price boost.


It sounds like you might be misinterpreting the 3 year permit. This is not a 3 year test before deployment, it is a permit to deploy a model for 3 years after the testing has been finished and the car has been certified by the 3rd party testing, and accepted by the DMV.

And, at any time, the state could change the rules. In fact, I expect they will have to--as there is no current provision for extending the permit to deploy. 

Peace,
Tom


----------



## phrelin

Tom Robertson said:


> It sounds like you might be misinterpreting the 3 year permit. This is not a 3 year test before deployment, it is a permit to deploy a model for 3 years after the testing has been finished and the car has been certified by the 3rd party testing, and accepted by the DMV.
> 
> And, at any time, the state could change the rules. In fact, I expect they will have to--as there is no current provision for extending the permit to deploy.
> 
> Peace,
> Tom


No I meant a production model prototype. My guess is that Google can get the 3rd party testing done with DMV acceptance right now if they can get the "GooFord" ready. I could see a car in production by 2017 for the three year permit. :righton:


----------



## Tom Robertson

phrelin said:


> No I meant a production model prototype. My guess is that Google can get the 3rd party testing done with DMV acceptance right now if they can get the "GooFord" ready. I could see a car in production by 2017 for the three year permit. :righton:


 Good to see. 

Peace,
Tom


----------



## phrelin

This Fortune Magazine article *Why Google and Ford Would Want to Team Up on Self-Driving Cars* provides more insight into what's going on behind the scenes and this:



> Ford could also use a Google, self-driving car, partnership to work on launching mobility and ride sharing tech. Ford has been working on this type of technology for awhile including at its Silicon Valley lab.
> 
> Ford and Google also appear to have similar visions for the future of self-driving cars. Both Google and Ford plan to move much faster toward fully-autonomous cars, while other automakers are interested in a step-by-step approach focusing more on using computing to assist drivers.


Ford has been conservative, but that conservative approach is why they avoided having to take a government bailout back when. Partnering with an apparent technological success seems smarter than trying to compete with it directly, especially when that technology comes from a company with no automotive manufacturing experience.


----------



## phrelin

SiliconBeat, the *tech blog of the Silicon Valley Mercury News*, has another view but makes this observation:



> If the innovation coming from these and other companies continues apace, autonomous vehicles could account for 10 percent of global vehicle sales by 2035, according to a report by the Boston Consulting Group. The same report said that 44 percent of U.S. consumers polled said they'd consider buying a fully autonomous vehicle within 10 years.
> 
> And if, instead, you're someone who's fretting over sharing the roads with robotic cars, take comfort in this little peek into the future: Earlier this year, Google co-founder Sergey Brin called solving highway gridlock and cutting down on automobile accidents two of the "big challenges of our time," he added that humans will stay involved to ensure safe driving.


----------



## Tom Robertson

Another article from Fortune: Elon Musk Says Tesla Vehicles Will Drive Themselves in Two Years

Peace,
Tom


----------



## dennisj00

Another interesting read branched from Tom's link above. . . http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/

Merry Christmas everyone!


----------



## phrelin

In a piece from CityLab, *Why Aren't Urban Planners Ready for Driverless Cars?*, the discussion of the implications is interesting:



> The biggest factor, then, is not uncertainty about whether or not self-driving cars will change urban transportation. Rather, it's uncertainty over just what those changes will look like, and how these shifts will impact major planning investments already underway. One planner put it bluntly: "We don't know what the hell to do about it. It's like pondering the imponderable."
> 
> Fair enough. No one knows for sure what types of social changes will come with driverless cars, and the possible outcomes can vary dramatically. On one hand, if people buy their own autonomous vehicles, they might also choose to live farther away from work, knowing their commute will be less stressful and likely more productive. On the other hand, if people partake in shared networks of robotaxis-buying mobility by the drink instead of the bottle, as Princeton's Alain Kornhauser puts it-they might double down on the convenience of central city life.
> 
> But even the MPOs interviewed by Guerra recognize that too much hesitation over imponderables becomes its own sort of planning decision. Take a basic highway expansion plan that's in the works. Local officials might go through with the project, only to discover that the extra lanes are unnecessary in an age of driverless cars, which can safely operate closer together and thus serve as a de facto road expansion by themselves. There's only so much road money to go around: using it for expansion instead of maintenance can be a big mistake.


----------



## inkahauts

I call BS on the comment that driverless cars mean no need to expand roads. Almost every road can be expanded without it ever being a waste, for a host of reasons, not the least of which is that I doubt we will ever see a population drop in this country. There will always be more and more cars, it would take 30-plus years to move the vast majority to driverless cars anyway, and most of our roads aren't big enough for the next 30 years as it is.


----------



## Drucifer

phrelin said:


> In a piece from CityLab, *Why Aren't Urban Planners Ready for Driverless Cars?*, the discussion of the implications is interesting:


Planners never consider future tech in their plans, as they treat all future tech, even tech in development, as sci-fi.


----------



## Tom Robertson

Drucifer said:


> Planners never consider future tech in their plans. As they treat all future tech, even ones in development, as being sci-fi.


Sounds like they are taking future tech very seriously. Extra meetings specifically on this.

The issue is no one knows what will happen. Will driverless cabs replace privately owned cars? Will driverless minibuses replace big buses, reducing car traffic? They expect ridership to go up, but until the market decides what shape this will take, they don't know whether roadway surface will need to increase or potentially decrease. And which roadways? Local? Highway? Will driverless cars be able to reduce the spacing between vehicles, allowing more cars in the same space?

Merry Christmas!
Tom


----------



## Tom Robertson

inkahauts said:


> I call bs on the no need to expand roads comment because of driverless cars. Almost every road can be expanded without it ever being a waste for a large host of reasons not the least of which is I doubt we will ever see a population drop in this country. There will always be more and more cars and it'd take 30 years plus to move the vast majority to driverless cars anyway and most our roads aren't big enough for the next 30 years as it is.


The article says these plans must go out 20 years. In the next ten years, I don't see autonomous cars changing the growth pattern; I don't think enough of them will be on the roads to make a major difference. In 20 years, I think we'll have a good sense of the market's direction. So I'm guessing spending for the next 15 to 20 years should be planned at roughly the same level.

After that, I understand their predicament. They don't want to spend money on more road surface that also requires spending money on maintenance if that extra space won't be needed for 40 years instead of 20 years from now. That extra money is compounded by the extra maintenance costs.

Merry Christmas!
Tom


----------



## yosoyellobo

Tom Robertson said:


> Sounds like they are taking future tech very seriously. Extra meetings specifically on this.
> 
> The issue is no one knows what will happen. Will driverless cabs replace privately owned cars? Will driverless minibuses replace major buses reducing cars? They expect ridership to go up but until the market decides what shape this will take, they don't know if roadway surface will need to increase or potentially decrease. And which roadways? Local? Highway? Will driverless cars be able to reduce the spacing between vehicles, allowing more cars in the same space?
> 
> Merry Christmas!
> Tom


I don't think that I would feel comfortable in a driverless car that did not maintain the standard one car length for every 10 mph.
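The rule of thumb mentioned here is simple arithmetic, sketched below with an assumed 15-foot average car length:

```python
# Quick arithmetic for the "one car length per 10 mph" rule of thumb.
# The 15-foot average car length is an assumption, not an official figure.

CAR_LENGTH_FT = 15.0

def following_distance_ft(speed_mph):
    """Following distance in feet at one car length per 10 mph."""
    return (speed_mph / 10.0) * CAR_LENGTH_FT
```

At 60 mph that works out to six car lengths, or 90 feet - spacing that interconnected driverless platoons could shrink dramatically, as discussed below.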


----------



## Tom Robertson

yosoyellobo said:


> I don't think that I would feel comfortable in a driverless car that did not maintain the standard one car length for every 10 mph.


If (or perhaps when) the cars are all driverless _and_ interconnected, sharing information, we'll likely be comfortable with a much smaller space around individual cars. Groups of cars could form with spacing around the group being the safety factor instead.

Yet that is part of the problem the planners are facing through this period of transition. Some of these savings will take 20, 30, or 40 years to have a major effect on the planning. Critical mass of whatever the new market will look like takes time.

Merry Christmas!
Tom


----------



## inkahauts

If every car was truly automated and talking to each other, they could be as close together as a trailer connected to a car and be fine. That's where the real capacity and time savings of driverless cars would come from.

I suspect we will someday see auto-drive car lanes on freeways, like we have carpool lanes now, once there are enough such cars to warrant it.


----------



## Drucifer

I can see the interstates getting all sorts of new rules as self-driving trucks begin to outnumber all other vehicles on them.


----------



## Drucifer

__ https://twitter.com/i/web/status/681886155562848258


----------



## dennisj00

'Charlie Rose' had Sebastian Thrun (ex Google-X CEO, TED Talks, director of the AI Lab at Stanford) as his guest last night.

He described the development of robotic cars as very positive, and I'm paraphrasing his comment: "When a driver makes a mistake, he learns from that mistake. When a robotic car makes a mistake, ALL robotic cars - and EVERY FUTURE robotic car - learn from that mistake."

Catch it quickly online if you can.
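The fleet-learning idea paraphrased above can be sketched in a few lines. This is a toy illustration; the classes and the situation/action strings are invented, not anyone's real architecture:

```python
# Toy sketch of fleet learning: one car's corrected mistake is
# propagated to every car in the fleet at once. All names are invented
# for illustration.

class FleetCar:
    def __init__(self):
        self.policy = {}  # maps a driving situation to a learned action

    def apply_update(self, situation, corrected_action):
        self.policy[situation] = corrected_action

class Fleet:
    def __init__(self, cars):
        self.cars = cars

    def report_mistake(self, situation, corrected_action):
        # Unlike a human driver, the correction reaches every car,
        # and future cars would start from the same shared policy.
        for car in self.cars:
            car.apply_update(situation, corrected_action)

fleet = Fleet([FleetCar() for _ in range(3)])
fleet.report_mistake("cyclist_trackstand", "wait_until_clear")
```

The contrast with human drivers is exactly the broadcast in `report_mistake`: one incident updates the whole fleet instead of one driver.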


----------



## Drucifer

dennisj00 said:


> 'Charlie Rose' had Sebastian Thrun (ex Google-X CEO, Ted Talks, director of AI Lab at Stanford) as his guest last night.
> 
> He described the development of robotic cars as very positive, and I'm paraphrasing his comment: "When a driver makes a mistake, he learns from that mistake. When a robotic car makes a mistake, ALL robotic cars - and EVERY FUTURE robotic car - learn from that mistake."
> 
> Catch it quickly online if you can.


I really don't give a damn about self-driving automobiles. What I want to see automated first are all those long-distance trucks.

But I do like an electronically enhanced car that helps the driver see everything around them without losing focus on the road ahead.


----------



## phrelin

One subject I haven't written about in this Forum, but hope to, is the so-called "Smart Cities Initiative" aka "Smart Cities". Because "*AT&T Just Announced a Huge Smart Cities Initiative*", I thought the subject as it applies to self-driving cars ought to be mentioned here.

Among the series of articles *The Road Ahead: Connected Cars & The Future* is one piece - *The United States Has A Plan To Become A Nation Of Smart Cities* - that begins:



> Imagine a city where there is no traffic gridlock, driverless taxis are the most efficient way to travel and traffic signals are sequenced to respond to individual cars rather than some random pattern. A city where data, technology and innovation all exist in perfect harmony to provide an infrastructure that works.
> 
> If this sounds like a utopian dream, the United States Department of Transportation wants to make it a reality.
> 
> As part of its ongoing plans to upgrade American infrastructure, the DOT has launched a Smart City Challenge. The agency is offering mid-sized American cities-those with a population of between 200,000 and 850,000 residents-the opportunity to submit proposals for their visions of the future. The DOT will then award the winning city with $50 million of funding to implement these ideas and create a model for other cities to follow.


I've read about the "Smart Cities Initiative" over the past few years which has stirred my inherent paranoia. But I have to admit that the possibilities for self-driving vehicles within urban areas supported by traffic/street management devices that interact with those vehicles do seem endless and practical. Within that framework, it does seem that self-driving vehicles on freeways and within cities aren't as improbable as it sounded a few years ago.


----------



## Tom Robertson

Drucifer said:


> I really don't give a damn about self-driving automobiles. What I want to see automated first are all those long-distance trucks.
> 
> But I do like an electronically enhanced car that helps the driver see everything around them without losing focus on the road ahead.


Enhanced visibility would be a great safety feature and could be something added as an interim feature.

Self-parking valet could be too.

Peace,
Tom


----------



## phrelin

According to this article *Audi of America chooses Northern California venue for self-driving car testing*:



> Audi said engineers will study data amassed at Thunderhill Raceway Park, just outside Willows in Glenn County. The three-mile, 15-turn road course is a popular site for special automotive events, kart races, vintage auto races, corporate events and trade shows.
> 
> The American arm of the German automaker said its testing will be coordinated with its own engineers and those with the Electronics Research Laboratory in Belmont in San Mateo County. The ERL is a global research and development network that supports the Volkswagen Group brands, including Audi, Bentley, Bugatti, Lamborghini and VW.
> 
> Audi hopes to introduce its automated driving technology - which it calls "piloted driving" - with the next-generation Audi.


Ah yes, the Volkswagen folks who, when their cars could not meet emission standards, cleverly modified onboard computers to fool the testers. :sure:


----------



## phrelin

Meanwhile from _Road & Track_ *Volvo S90 Will Be the First Car With Standard Semi-Autonomous Technology*:



> Volvo's handsome S90 sedan will enter into a packed and high-tech market when it goes on sale in the U.S. toward the end of 2016. But it'll pack one feature nobody else offers in the U.S. market: self-driving technology as a standard feature.
> 
> The S90 will utilize the second generation of Pilot Assist, Volvo's semi-autonomous driving tech that was first featured in the XC90 SUV. In the SUV, the feature tracks a vehicle driving in front of you to know when to accelerate, brake, and steer, at speeds up to 30 mph, where lane markings are clearly visible to the system's cameras.
> 
> The second-gen Pilot Assist, standard in the S90 sedan, will be able to accomplish those same tasks without needing to follow a car in front, according to Volvo. It will also do these functions at speeds up to 80 mph.


----------



## phrelin

And with CES we can't leave out *Microsoft Partnerships Drive Connected Cars: CES 2016* which tells us (*emphasis* added):



> Microsoft is deepening its foray into connected cars, as indicated by updates on its partnerships with Volvo, Nissan, Harman, and IAV. Announcements came from the 2016 Consumer Electronics Show (CES) taking place this week in Las Vegas.
> 
> "In the near future, the car will be connected to the Internet, *as well as to other cars*, your mobile phone and your home computer," Microsoft executive vice president for business development Peggy Johnson said in a blog post. "The car becomes a companion and an assistant to your digital life. And so our strategy is to be the ultimate platform for all intelligent cars."


Now you may ask: what does Harman have to do with this? Get ready to be truly depressed:



> Without compromising safety, mobile workers will be able to hear and respond to emails, schedule meetings, join conference calls without manually inputting phone numbers, and manage tasks throughout the day. Drivers will eventually be able to hold Skype calls while in park or while driving an autonomous vehicle.
> 
> Harman notes Office 365 will be continuously updated through Harman's over-the-air incremental updates.
>
> Volvo is planning to enable users to communicate with cars via Microsoft Band 2. New concepts will integrate the latest Band with a Windows 10 smartphone and the Volvo On Call Universal app.


And so those who were thinking they could take a brief respite from work while riding in their self-driving car ...well... the car itself will become one big Office 365 office, and you'll be able to attend a Skype meeting while riding from one off-site meeting to another.

I know I'm paranoid, but there is something about all this that reminds me of this:

[youtubehd]2zfqw8nhUwA[/youtubehd]


----------



## inkahauts

phrelin said:


> According to this article *Audi of America chooses Northern California venue for self-driving car testing*:
> 
> Ah yes, the Volkswagen folks who, when their cars could not meet emission standards, cleverly modified the onboard computers to fool the testers. :sure:


Well at least we know they are pretty good programmers I guess.


----------



## James Long

inkahauts said:


> Well at least we know they are pretty good programmers I guess.


Finally, an autonomous car programmer who is willing to cheat on the rules.

Perhaps their cars will lead to fewer accidents due to being overly cautious operators.


----------



## Drucifer

__ https://twitter.com/i/web/status/685099797490089984


----------



## phrelin

Drucifer said:


>


I just loved that one in this week's _New Yorker_, particularly since it seems "they" are trying to get cars to "talk to each other."


----------



## dennisj00

"Science Friday" had a good discussion of CES and particularly autonomous cars. Your NPR station may air it again on Saturday, or catch it online.


----------



## phrelin

And now we have this from the Washington Post *Road predictions for 2050: The end of gasoline, traffic deaths and gear heads* which is interesting albeit somewhat amusing to a guy who visited the World of Century 21 exhibit at the 1962 Seattle World's Fair known as the Century 21 Exposition. Now where the heck did I park my gyrocopter.... :grin:

We will never see the end of traffic deaths and gear heads. Still, predictions notwithstanding, it looks realistic to assume new gasoline-powered autos will stop being used for most driving within the next 35 years, if someone invents an environmentally safe battery system and figures out how to generate five times the electricity in an environmentally safe manner. We aren't there yet.


----------



## Drucifer

It will be the trucking industry that will lead the automation of highways. Why? It is cheaper. And making money always leads the way.


----------



## Tom Robertson

New study finds driverless cars actually have fewer accidents: http://news.yahoo.com/crash-rates-self-driving-cars-less-conventional-cars-192159862--finance.html

The previous study was based on accident rates that are not calculated using the same criteria between driver and driverless cars. Driver rates were based on voluntary reports, whereas driverless accident reports are required. Ok, this starts to make more sense. While I generally question studies at the behest of interested parties, at least this one has some sanity in the thought process.

Many (most?) states do not require bumps and fender benders to be reported until a financial threshold is reached or an injury occurs. Even those often go unreported to avoid the insurance hassle.

Peace,
Tom


----------



## James Long

Tom Robertson said:


> New study finds driverless cars actually have fewer accidents: http://news.yahoo.com/crash-rates-self-driving-cars-less-conventional-cars-192159862--finance.html


The new study estimates driverless cars to have fewer accidents.

Adding estimates of how many accidents are not reported helps soften the numbers and explain the "apples to oranges" of trying to compare a pool of vehicles where all accidents are required to be reported against a pool where there are minimum requirements for reporting (and, quite frankly, even required to be reported accidents go unreported). But it is an opinion that such an adjustment in counting makes the cars safer.

More miles need to be logged to make such a determination.


----------



## yosoyellobo

The bottom line is that the manufacturers will not put out a car that is not an order of magnitude safer than what is now on the roads.


----------



## Tom Robertson

James Long said:


> The new study estimates driverless cars to have fewer accidents.


This statement feels disingenuous. The study doesn't "estimate" that driverless cars have fewer accidents. The study attempts to determine the correct accident rates and use those apples-to-apples rates to reach a conclusion. The semantic difference is perhaps slight, yet that is what leads to the feeling of disingenuousness. The key comes down to the methodology for determining the actual accident rates. Without access to the study materials, I don't know if the rates were estimated, more accurately determined, or a mixture of estimation and determination. (I tend toward thinking it was a mixture, though I don't know how much was estimated vs. how much was determined.)



James Long said:


> Adding estimates of how many accidents are not reported helps soften the numbers and explain the "apples to oranges" of trying to compare a pool of vehicles where all accidents are required to be reported against a pool where there are minimum requirements for reporting (and, quite frankly, even required to be reported accidents go unreported). But it is an opinion that such an adjustment in counting makes the cars safer.


While the article doesn't specifically say safer, there are a couple of important notes: all levels of accident severity were less frequent in driverless cars than in driver-operated cars, and the National Transportation authority calculated non-reporting of accidents at 60%. I do not know whether this study used the NTA numbers or not.

So if it is an option, at least it is one based upon factual data. 



James Long said:


> More miles need to be logged to make such a determination.


That feels like an opinion.  I'm curious at what point statisticians would consider the numbers significant. I presume 10,000 miles ain't enough.  I hope a trillion miles isn't required.  (No, I don't think anyone is asking for that level, simply setting a boundary where we all might agree.) Unfortunately, while I think 1 million miles is starting to be significantly representative, I really don't know.

The good news, more miles are coming every week.

As a sub-discussion: would miles be part of a certification process? I'm wondering if a car to be certified must be required to complete mileage minimums in several testing categories.

Peace,
Tom


----------



## James Long

yosoyellobo said:


> The bottom line is that the manufacturers will not put out a car that is not an order of magnitude safer than what is now on the roads.


It sounds like Google does not want to guarantee the safety of their vehicles unless nobody other than the vehicle is allowed to drive. The opposition to manual controls where a human might take over, cause an accident and blame it on the car are beyond what they want to deal with.

As Tom correctly pointed out in previous posts, the activation of a driverless feature will lead to LESS attention to the road. If the car is driving the driver becomes another passenger doing whatever passengers do in cars. Reading or texting or emailing or sleeping. Whatever the law will allow. And even though the law (as California proposes) may require an alert driver ready to step in at a moment's notice, people will become complacent about the operation of the vehicle ... and when (not if) the car needs help with driving the driver will be less ready to operate the vehicle. Over time people will lose driving hours and have less experience behind the wheel. When there is a problem one may wish for more experience behind the wheel ... that is not the way the law is going.

Best case scenario for the car manufacturers is to throw out the steering wheel. Not allow the passengers to drive. Take full control of the vehicle and not expect a driver to be ready or able to interrupt the computer's more intelligent and more experienced (in Google's opinion) thought process. But with that control comes responsibility ... so best case scenario also comes with caveats to take the manufacturer off the hook for any accidents the car does cause.

It seems that the manufacturers want the best of both worlds.


----------



## James Long

Tom Robertson said:


> So if it is an option, at least it is one based upon factual data.


It is an opinion, no joke. The latest study is giving the writer's opinion of the data they reviewed.
The previous studies were also fact based. Perhaps not adjusted to please Google, but they had their facts.


----------



## Tom Robertson

James Long said:


> It sounds like Google does not want to guarantee the safety of their vehicles unless nobody other than the vehicle is allowed to drive. The opposition to manual controls where a human might take over, cause an accident and blame it on the car are beyond what they want to deal with.
> 
> As Tom correctly pointed out in previous posts, the activation of a driverless feature will lead to LESS attention to the road. If the car is driving the driver becomes another passenger doing whatever passengers do in cars. Reading or texting or emailing or sleeping. Whatever the law will allow. And even though the law (as California proposes) may require an alert driver ready to step in at a moment's notice, people will become complacent about the operation of the vehicle ... and when (not if) the car needs help with driving the driver will be less ready to operate the vehicle. Over time people will lose driving hours and have less experience behind the wheel. When there is a problem one may wish for more experience behind the wheel ... that is not the way the law is going.
> 
> Best case scenario for the car manufacturers is to throw out the steering wheel. Not allow the passengers to drive. Take full control of the vehicle and not expect a driver to be ready or able to interrupt the computer's more intelligent and more experienced (in Google's opinion) thought process. But with that control comes responsibility ... so best case scenario also comes with caveats to take the manufacturer off the hook for any accidents the car does cause.
> 
> It seems that the manufacturers want the best of both worlds.


Well said.

If (and I'm not sure if this is Google's thinking) Google thinks normal folk will immediately buy a car without a steering wheel, I think that would be myopic. I agree with many here that most people will still want a steering wheel, regardless of how quickly they will come to actually never use it. I also agree that there are times where I might drive off road, whereby I would still want a means to efficiently and more directly control the car. Joystick? Steering wheel? I think either will work, though we're all used to steering wheels. 

Here is an example: in Green Bay (and likely many places) homeowners can use their front, side, and backyards as parking for events at Lambeau Field. So for 10 games a year, plus playoffs, an unlined field, potentially snow-covered, can be used as a temporary parking lot. While Google might find a way to program for such, it feels as though it would still be easier to let the human guide the car rather than have the car guess at the proper parking location based on verbal and visual cues from the humans.

Then again, who knows. Maybe a touch screen display of the location might let the human say, "right here," along with a, "we need you to back in." 

Peace,
Tom


----------



## Tom Robertson

James Long said:


> It is an opinion, no joke. The latest study is giving the writer's opinion of the data they reviewed.
> The previous studies were also fact based. Perhaps not adjusted to please Google, but they had their facts.


And further analysis of the other study discovered factual flaws. Leading to a flawed opinion, in my opinion.

Whether or not Google "was pleased" with the results, I think we all want the best data we can reasonably get. Not fear inciting mistakes from the first study. I was amazed at the numbers, yet didn't see the obvious flaw. (Then again, I wasn't an author to the study, responsible for accuracy.)

Peace,
Tom


----------



## James Long

Both studies "showed their work", so to speak.

"While I generally question studies at the behest of interested parties, at least this one has some sanity in the thought process." It is easier to agree with a study that agrees with your viewpoint.

The second study attempted to take the first study's "accidents per mile driven" and express it as "reported accidents per mile driven". There are still more reported accidents per mile driven on the driverless cars ... that fact has not changed and I do not believe is being disputed. The dispute is that more non-driverless car accidents should be reported. Other studies show that accidents are underreported ... so this study assumes that non-driverless accidents are underreported enough to make driverless cars safer in comparison.

Extrapolation of data. Data both surveys had.
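The extrapolation being argued about is simple arithmetic: divide each fleet's reported rate by its estimated reporting fraction. A minimal sketch, with made-up crash rates (not taken from either study) and the 60% non-reporting figure mentioned elsewhere in the thread:

```python
# Illustrative sketch of the underreporting adjustment discussed above.
# The crash rates are hypothetical; only the 60% non-reporting figure
# comes from the thread's discussion.

def adjusted_rate(reported_per_million_miles: float,
                  reporting_fraction: float) -> float:
    """Scale a reported crash rate up by the fraction of crashes reported."""
    return reported_per_million_miles / reporting_fraction

# Self-driving fleet: every incident must be reported (fraction = 1.0).
self_driving = adjusted_rate(8.0, 1.0)    # stays 8.0 per million miles

# Conventional fleet: if 60% of crashes go unreported, only 40% are counted.
conventional = adjusted_rate(4.0, 0.4)    # becomes 10.0 per million miles

# The raw reported rates favor conventional cars (4.0 < 8.0), but the
# adjusted rates reverse the comparison (8.0 < 10.0). The choice of
# reporting fraction decides the conclusion.
```

Which is the point of contention: whichever reporting fraction you assume for conventional cars determines which fleet comes out "safer."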


----------



## Tom Robertson

James Long said:


> ...The second study attempted to take the first study's "accidents per mile driven" and express it as "reported accidents per mile driven". There are still more reported accidents per mile driven on the driverless cars ... that fact has not changed and I do not believe is being disputed. The dispute is that more non-driverless car accidents should be reported. Other studies show that accidents are underreported ... so this study assumes that non-driverless accidents are underreported enough to make driverless cars safer in comparison.
> 
> Extrapolation of data. Data both surveys had.


The second study clarified that the first study was flawed by using "reported accidents per mile" as anything meaningful for safety. I can hear Bill Maher: "yeah, 'voluntarily reported accidents per mile' is a safety factor I rely on...". 

Actually, I really hear how George Carlin might say it--and know I shouldn't repeat such language... 

So the first study is flawed by using facts out of context, leading to disingenuous and misleading conclusions, and ultimately used apples-to-oranges comparisons.

Thus the second study seems to have attempted to correct the flaws of the first study. Yes, I realize that is a passive statement--I haven't read the study so I can't comment on their exact methodology. That corrections were necessary makes sense. When I use NTA self reported numbers, I get the same numbers the first study did. Now that I see the fallacy of that approach, I can make some sense of why the second study, even though commissioned by Google, has a chance of being a better analysis than the first.

Is the second study flawless? Heck, I don't know. Is it better than the first? My opinion, based on an article about the study, is that it is. It also seems to make more sense considering how underreported we all know voluntary accident reports are.

Does that mean I agree the final numbers are exact? No.  Closer and better, yes. Exact? Yeah, right.

Peace,
Tom


----------



## James Long

Tom Robertson said:


> Thus the second study seems to have attempted to correct the flaws of the first study. Yes, I realize that is a passive statement--I haven't read the study so I can't comment on their exact methodology.


It is always good to argue about what you have not read. :rolling:

As previously stated, I trust both studies are fact based interpretations. The most recent seems to be more theoretical in attempting to guess that the number of unreported accidents is high enough to tilt the scales the other way.

The bottom line for me, as stated since this thread began, more study is needed. More miles need to be driven. And quite frankly, in addition to what I have said before, I believe a driverless car needs to be at fault in an accident or cited for a moving violation.

I was a perfect driver before my first traffic citation. 100% of my miles were incident free. Google cars are still in driver's ed ... limited to certain roads at certain speeds under certain conditions. Perhaps they need to put a big "L" on the back to caution drivers that the car is still learning? Perhaps the dome does not make it obvious enough.

I became a better driver after I stopped being perfect.


----------



## Tom Robertson

James Long said:


> It is always good to argue about what you have not read. :rolling:


Right back at ya! You've read either study? :rolling:



James Long said:


> As previously stated, I trust both studies are fact based interpretations. The most recent seems to be more theoretical in attempting to guess that the number of unreported accidents is high enough to tilt the scales the other way.


Have you read the second study? Share the link?

You know they intended to tilt the scales? Share the evidence?

As previously stated, I can imagine how Bill Maher would feel about a safety comparison based on voluntary reported accident data--where there is a strong reason to under report.



James Long said:


> The bottom line for me, as stated since this thread began, more study is needed. More miles need to be driven.


Well then, I'm glad you agree that Google should continue their testing. :rolling:



James Long said:


> And quite frankly, in addition to what I have said before, I believe a driverless car needs to be at fault in an accident or cited for a moving violation.


New word: Schadenfailure? You want the driverless cars to have an accident? Isn't that clinging to horse and buggy thinking, "them fancy, new fangled automobile thingies gotta fail before we can trust them..." :grin:

One hopes "da bears" lose. One does not hope for lives to be at stake as cars fail. Bad form.

Peace,
Tom


----------



## James Long

Tom Robertson said:


> You know they intended to tilt the scales? Share the evidence?


Do you believe the claim in the title is misleading? Do you believe that the summary in the article is misleading? If so, you probably should not have shared the article.

Or at least read the underlying research before defending it so vehemently in the past few posts.

Here is your link ... have fun:
Automated Vehicle Crash Rate Comparison Using Naturalistic Data
(I'd summarize it for you, but the homework hotline is closing for the night.)

I will highlight this line from their summary:
"Low exposure for self-driving vehicles (about 1.3 million miles in this study) increases the uncertainty in Self-Driving Car crash rates compared to the SHRP 2 NDS (over 34 million miles) and nearly 3 trillion vehicle miles driven nationally in 2013 (2,965,600,000,000)."

As I stated earlier, the data is not in. More data is needed.
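That uncertainty can be made concrete with a rough Poisson confidence-interval sketch. The mileage figures come from the quoted summary; the crash counts below are invented purely to show how interval width shrinks with exposure:

```python
import math

def rate_interval(crashes: int, million_miles: float):
    """Approximate 95% interval for a crash rate per million miles,
    treating the crash count as Poisson (normal approximation)."""
    rate = crashes / million_miles
    half_width = 1.96 * math.sqrt(crashes) / million_miles
    return rate - half_width, rate + half_width

# Same underlying rate (~7.7 crashes per million miles) at both exposures:
small_fleet = rate_interval(10, 1.3)    # ~1.3 million self-driving miles
big_study = rate_interval(260, 34.0)    # ~34 million naturalistic-study miles

# small_fleet spans roughly 2.9 to 12.5; big_study roughly 6.7 to 8.6.
# The small fleet's interval is about five times wider, so its true rate
# could plausibly sit either well above or well below the comparison rate.
```

At 1.3 million miles the interval is simply too wide to settle the argument either way, which is the study's own caveat.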

No computer is flawless ... the relentless belief that self driving cars will never fail is troublesome. It is the kind of belief that gets people hurt. We learn from failures.


----------



## inkahauts

I think that if all cars were self-driving the accident rate would fall tremendously. The problem is we will never see that day. So it will always be very difficult to simply say one way or the other whether self-driving cars actually get in more or fewer accidents, without all kinds of asterisks about what caused each one and what else led to an accident that wouldn't have happened had all the other cars been self-driving as well.


----------



## Tom Robertson

inkahauts said:


> I think that if all cars were self-driving the accident rate would fall tremendously. The problem is we will never see that day. So it will always be very difficult to simply say one way or the other whether self-driving cars actually get in more or fewer accidents, without all kinds of asterisks about what caused each one and what else led to an accident that wouldn't have happened had all the other cars been self-driving as well.


Once a significant level of population uses driverless technology, we'll see the accident rate fall tremendously. 

Peace,
Tom


----------



## Drucifer

inkahauts said:


> I think that if all cars were self-driving the accident rate would fall tremendously. *The problem is we will never see that day. *So it will always be very difficult to simply say one way or the other whether self-driving cars actually get in more or fewer accidents, without all kinds of asterisks about what caused each one and what else led to an accident that wouldn't have happened had all the other cars been self-driving as well.


Vehicles in the future will have so many built-in safety features that the joy of driving could disappear for most.


----------



## Tom Robertson

James Long said:


> ...
> BTW: You are welcome for my efforts in pointing you to the link you posted to the study and even mentioning it in my previous post. I try to help.


The full study is 88 pages long.  The table of tables is two pages long!  It's gonna take me a bit to digest the whole. 

Peace,
Tom


----------



## phrelin

Tom Robertson said:


> Once a significant level of population uses driverless technology, we'll see the accident rate fall tremendously.
> 
> Peace,
> Tom


Personally I would not get too invested in the idea that accident rates will fall "once a significant level of population uses driverless technology." I'm not clear on what "significant" means, or whether it will happen the way the dreamers see it. There's a reason I posted this...



phrelin said:


> And now we have this from the Washington Post *Road predictions for 2050: The end of gasoline, traffic deaths and gear heads* which is interesting albeit somewhat amusing to a guy who visited the World of Century 21 exhibit at the 1962 Seattle World's Fair known as the Century 21 Exposition. Now where the heck did I park my gyrocopter.... :grin:


...just as I've posted about the related Smart Cities framework. Gyrocopters, which were predicted at the 1962 World's Fair to become the main mode of transportation, do exist, as you can see from this YouTube video:

[youtubehd]G-4Y727dY4M[/youtubehd]

There were studies and analysis and the hardware actually existed. But it's 50+ years later and some things just never quite worked out. The self-driving car has a greater chance of success because it is just an incremental improvement on what we have. But to achieve the goals being touted will require a whole lot more than making a better crash avoidance system - interactivity is going to be important, among other things. I'm not expecting to see significant gains in what life remains to me. But maybe by 2050 there will be some.


----------



## Tom Robertson

Phrelin, since you remember the World's fair in 1962, I'm thinking it fair to say you are older than I. I was only 1 then. So I don't know if you will see significant adoption of driverless cars whilst you are still here, continuing to share your insights and wisdom. Thank you for sharing.

Yet, I do hope you will personally see at least one advantageous statistic from driverless cars. That you will always be able to hop in your car and let it drive you more safely than you could drive yourself at that point. That you might someday be sharing your insights and wisdom whilst your car is taking you to see family, friends, or something else fun. That you personally are safer even though we haven't fully hit the critical mass whereby the overall safety is significantly lower. 

And then it would be cool if you did also get the self driving gyrocopter. 

Namaste and peace,
Tom


----------



## dennisj00

Other than being visually interesting, I'm not sure the gyrocopter analogy is meaningful. General aviation has always been a niche market for those that could afford it. I got my private license in the mid 80s but quickly realized it didn't fit my budget.

And while we won't all be driving Teslas at $100k, the technology for self driving cars won't make a $15k Corolla cost $100k.

The adoption rate will be much faster.


----------



## phrelin

I truly do not know how I missed this, but CalTrans District 1 shared it on its Facebook page. From the _Huffington Post_ last July *This Fake Town Exists Solely To Test Driverless Cars* we learn:



> Mcity, an urbanized test ground for driverless cars, opened on Monday at the University of Michigan. It's a 32-acre simulated town with streets, intersections, traffic signs, buildings and sidewalks. There are also robotic human dummies, designed to anticipate all possible road obstructions and accidents. The site even incorporates faded lane markings and signs that have been scrawled over with graffiti.
> 
> With many in the industry expecting completely autonomous cars to hit the road within the decade, Mcity will play a key role in studying how these cars will be able to sense and react in time to potential pedestrian accidents. The facility will also develop ways for cars to identify nearby self-driving vehicles and traffic signals, and run them under various weather conditions, particularly the heavy snowfall common to Michigan winters, Bloomberg reported.
> 
> Mcity was developed in partnership by the university's Mobility Transformation Center, which researches technological advancements in transportation systems, and the Michigan Department of Transportation.


The referenced *Bloomberg Businessweek story* which was published in April last year tells us:



> A mother pushing a baby carriage jaywalks across a busy city street. Cutting between two parked cars and partially obscured by a bus, she edges her stroller into traffic before freezing as a speeding car bears down on her. Will the car stop in time? Or will it mow down mother and child? It doesn't really matter: The mom is a robot, and the car is a driverless vehicle cruising down a fake street in a mock town.
> 
> In coming years, federal, state, and city officials will have to decide how roadways should be designed, lighted, and controlled in a world with self-driving cars. Will road signs and traffic lights be necessary? What happens in a power failure? The search for answers is what led the Michigan Department of Transportation to pay $3 million of M City's construction costs. The university picked up the rest.
> 
> Already the university, automakers, and the National Highway Traffic Safety Administration are testing 3,000 Web-connected cars on regular Ann Arbor roads, monitoring their ability to communicate road congestion and local weather. M City research will eventually give those cars the ability to sense one another and nearby traffic signals. "On the controlled site we can test the failure of a traffic light," says Szabo. "In the real-life situation, you are certainly not going to make that happen."


There are some great pix in the _Huffington Post_ story.

I guess I have to acknowledge that self-driving cars will be on the roads in the 2020s, particularly if a CalTrans District Facebook page is sharing stories on the subject.


----------



## Tom Robertson

Phrelin and I are both having aha moments today. (Thanks for sharing those MCity articles.)

Mine was how Ford is attacking one of the remaining big hurdles--snow, rain, fog, and other visibility problems. (And they are testing in MCity.) 

http://www.engadget.com/2016/01/11/ford-is-testing-autonomous-cars-in-the-snow/

Now, Ford's current solution relies on accurate 3D maps of the area, so the car doesn't rely on visual road markers that might be hidden by snow. This then raises James' concern about how and when all the 3D mapping will get done. What will a car do in snow in an area no one has mapped in clear weather? That's a tough set of criteria to balance: how to handle new roads that are snow-covered.

I agree with thems that the first round of consumer cars must maintain some human steering ability, some manual mode, until all the roads can be precisely mapped. Hopefully I'll never need to use it, yet it needs to be there until all the miles are mapped sufficiently. (Google probably has a very good idea of how long it would take and how much money it would cost--they've done it already though probably at not a sufficient precision for this project.) 

Peace,
Tom


----------



## Tom Robertson

Speaking of MCity, another article from Engadget: http://www.engadget.com/2015/11/13/ford-first-self-driving-mcity-michigan/

Peace,
Tom


----------



## James Long

MCity was linked earlier in the thread but after 300 posts it is easy to lose track of what has been mentioned.


----------



## phrelin

James Long said:


> MCity was linked earlier in the thread but after 300 posts it is easy to lose track of what has been mentioned.


Sorry. I did search this thread for "Michigan" and "MCity" and nothing came up.


----------



## phrelin

Here's why I'm a bit paranoid about the spin that comes from the high-volume tech retail industry when it comes to things that can kill you, like a driverless car.

From the Silicon Valley Mercury News comes a story with the headline *Elon Musk: Tesla cars will be able to drive themselves across the country in two years*, which seems like it is a pretty cool story. It begins with:



> Tesla owners will be able to summon their driverless cars from across the country within two years, Tesla Motors CEO Elon Musk predicted Sunday.


But then it explains:



> "Summon" is a new feature the electric-car maker rolled out via software update over the weekend. The company touts it as a way to avoid having to "squeeze in and out of tight parking spots."


Huh? That's a little confusing. There is, however, a buried headline at the bottom that could be pulled from this:



> So is this it - a smooth road to fully autonomous cars? Not so fast. The Tesla software update also reined in Autopilot on residential streets, after its release in October prompted some Tesla owners to do what Musk has called "crazy things," including going hands-free and riding in the back seat, and of course sharing the videos to prove it. Autopilot will be restricted on residential streets without center dividers, and it has new speed limits as well, according to the Wall Street Journal.
> 
> "I might be slightly optimistic on this," he reportedly said Sunday night.


The headline is _Elon Musk acknowledges car owners can't be trusted with new self-driving tech_. That is the problem - not the tech, but the people, the same foolish, irresponsible people the tech is trying to save from killing themselves and others on the highway. :nono2:


----------



## James Long

We certainly have a variety of approaches ...

MCity's off road controlled environment.
Google's 25 MPH street testing.
Tesla's dump it on the market and see what happens.


----------



## Tom Robertson

James Long said:


> We certainly have a variety of approaches ...
> 
> MCity's off road controlled environment.
> Google's 25 MPH street testing.
> Tesla's dump it on the market and see what happens.


I see MCity and Google as being on the same progression. Google is taking a slow approach; they did their version of MCity for years, creating many off-road testing environments, and still use them today. At some point MCity has to evolve to a real city. Google reached that point and is still taking it slow... 25mph. (Though part of that is also California law.) 

Tesla on the other hand... Egad...

Peace,
Tom


----------



## inkahauts

I need to read about what Tesla is actually doing. I thought their first step was basically pulling in and out of parking spaces for you. Obviously not....


----------



## dennisj00

Tesla updated to assisted freeway driving back in November or early December, plus parking and pickup. Other than the crazy YouTube videos, it looks pretty awesome. 

I have an acquaintance who parks his a couple of houses down the street... he owns a half dozen houses on a point on the lake. He can summon the car from that garage to his location.

For $800 he can get a robotic arm that automatically plugs and unplugs the charger. At that point, the car will drop him off, go to the garage and re-charge.


----------



## James Long

Tesla sounds like a can of worms waiting for a lawsuit. When an empty car drives itself someone must be responsible. Even if an accident is not the car's "fault" there will be a lot of questions about responsibility and accident avoidance.

So when you send your car off to park itself, are you responsible or the manufacturer?


Mythbusters did another cell phone driving episode (anti, of course) at the end of last season. They finished up with 30 test drives in a simulator. (The test was whether handsfree was as bad as handheld.) I noticed that during the simulation there were a lot of "what if" tests thrown in. What if a car passes, cuts you off and brakes? What if a bicyclist leaves the curb and crosses the center line?

If these were tests of a self-driving car I suspect the proponents would classify those actions as "not the car's fault". But avoiding the more challenging fellow drivers on the road is part of driving. If cell phone users are to blame when someone pulls out in front of them or cuts them off, then self-driving cars should be held to the same standard.


----------



## Tom Robertson

Google is looking for more partnerships: http://news.yahoo.com/google-add-more-partners-self-driving-cars-google-215204256--finance.html



> John Krafcik, the newly hired president of the Google self-driving car project, did not mention any automakers by name. However, appearing at a media conference at the Detroit auto show, Krafcik surveyed a room packed with hundreds of auto industry executives and said: "We hope to work with many of you guys."
> 
> ...
> "No one goes this alone," Krafcik said. "We are going to be partnering more and more and more." He said he hopes to form more alliances this year.


Peace,
Tom


----------



## Tom Robertson

GM is saying fully driverless cars are most likely (my paraphrase of the updated version) to appear in urban ride-share programs before the showroom: http://mashable.com/2016/01/13/gm-lyft-autonomous-car-austin/#p01XnFrEeSqC

He's probably right, as a ride share could justify the costs of the necessary hardware. Yet well-to-do early tech adopters might see it as cheaper than a chauffeur. 
Peace,
Tom


----------



## phrelin

I was thinking that a self-driving vehicle for the Meals-on-Wheels program could use a *Segway Butler* variation to deliver the food right to the door. :grin:


----------



## inkahauts

phrelin said:


> I was thinking that a self-driving vehicle for the Meals-on-Wheels program could use a *Segway Butler* variation to deliver the food right to the door. :grin:


Now this is an idea worth looking into.


----------



## phrelin

InformationWeek has this article *Obama Proposes $4 Billion Budget For Self-Driving Cars* summarizing information gathered mostly from other sources:



> President Obama proposed Thursday a $4 billion budget to accelerate pilot program testing of self-driving vehicles over the next decade, in a move to spur acceptance of these vehicles on the nation's highways.
> 
> Under the proposal, Obama is aiming to bring federal regulators, state government officials, and car manufacturers together to craft a national policy that could fast-track the arrival of driverless cars on the nation's roads, according to a Washington Post report.


The referenced *Washington Post article* is lengthy but worth a read telling us among other things:



> "We are bullish on automated vehicles," [Transportation Secretary Anthony] Foxx said. "Today's actions and those we will pursue in the coming months will provide the foundation and the path forward for manufacturers, state officials and consumers to use new technologies and achieve their full safety potential."
> 
> The plan laid out by Foxx in a speech at the Detroit Auto Show foresees an active federal role in promoting high-tech innovations in an evolution toward self-driving cars that will take several decades to complete.
> 
> Foxx said the National Highway Traffic Safety Administration will work with automakers and state governments to develop prototype laws and regulations for state lawmakers to consider.


This isn't the kind of initiative that will get undone when a new President arrives next January. The lobbying combination of the auto and tech industries will make sure it continues no matter what party the new President comes from.


----------



## SayWhat?

Skipping 300+ posts since I wasn't really interested, but have come up with a question that may or may not have been addressed.

How are they dealing with drivers? I can see this coming in handy for older people (or anyone) with less than ideal vision who may not be able to get a standard driver's license. People who may be otherwise mobile and functional once they get to the store or workplace, but can't see well enough to drive.


----------



## phrelin

SayWhat? said:


> Skipping 300+ posts since I wasn't really interested, but have come up with a question that may or may not have been addressed.
> 
> How are they dealing with drivers? I can see this coming in handy for older people (or anyone) with less than ideal vision who may not be able to get a standard driver's license. People who may be otherwise mobile and functional once they get to the store or workplace, but can't see well enough to drive.


As someone who could face that problem, I hope assisted driving systems or self-driving vehicles will provide options, and we have discussed the subject here. But right now I'm not holding my breath, even here in California.


----------



## SayWhat?

Guess they're probably looking at it as advanced Cruise Control where a licensed driver has to be there to take over if some problem develops?


----------



## Tom Robertson

SayWhat? said:


> Skipping 300+ posts since I wasn't really interested, but have come up with a question that may or may not have been addressed.
> 
> How are they dealing with drivers? I can see this coming in handy for older people (or anyone) with less than ideal vision who may not be able to get a standard driver's license. People who may be otherwise mobile and functional once they get to the store or workplace, but can't see well enough to drive.


This is a hot topic, both in this thread and in the transportation industry. Google has determined that the only solution is one where the car can be fully autonomous, at least during most phases. Any plan that requires a human as a "safety net" ignores human behavior. 

California has decided to ignore human behavior initially and require the humans be present, awake, and aware during the next phase of implementation. And separately certified for being present, awake, and aware. Though these new rules haven't been carved in stone yet. Public hearings coming soon.

Peace,
Tom


----------



## Tom Robertson

SayWhat? said:


> Guess they're probably looking at it as advanced Cruise Control where a licensed driver has to be there to take over if some problem develops?


That is the current draft of the State of California plan. Problem is Google already discovered that doesn't work. Humans don't stay awake and alert if they have nothing to do but sit and watch. If they aren't engaged, they will go to sleep. Google is pushing California to find a way to certify cars to truly be self-driving so the passengers don't have to fight normal human behavior.

Peace,
Tom


----------



## phrelin

TechRepublic offers a look at *Photos: A list of the world's self-driving cars racing toward 2020 and beyond.* The one without a real picture is:



> According to Elon Musk, Apple's plan for developing an autonomous car (dubbed "Project Titan") is an "open secret." What we do know is that Apple has purchased a 2100-acre testing ground, and according to The Wall Street Journal, plans to ship electric cars by 2019.


I'm not quite sure how Apple will ship those electric cars FedEx Air from their contract Chinese manufacturer's plant, but if they are shiny enough they may actually sell.


----------



## inkahauts

Haha! I wonder where they will be made...


----------



## phrelin

inkahauts said:


> Haha! I wonder where they will be made...


Maybe nowhere as according to the _Wall Street Journal_ *Apple Veteran Overseeing Electric-Car Project Leaving Company*.

What's interesting in the article is not that Steve Zadesky, who was an engineer at Ford until 1999 when he joined Apple, is leaving for personal reasons (which may be true); it's that we are also offered some analysis:



> Mr. Zadesky, who worked on the iPod and the iPhone during his career, was given permission in 2014 to start investigating Apple's entry into the electric car market. Last year, Apple designated the initiative - code-named "Titan" - a committed project and set a ship date of 2019.
> 
> In Apple's parlance, a "ship date" doesn't necessarily mean the date that customers receive a new product; it can also mean the date that engineers sign off on the product's main features. Some team members expect that it might take several more years to get a differentiated electric car ready, the people familiar with the matter said.
> 
> The team has encountered some problems, according to people familiar with the matter, in laying out clear goals for the project. Apple has urged the team to push ahead with ambitious deadlines even though some on the team felt that those targets weren't attainable, these people said.


Learning to build an automobile isn't quite the same as building consumer electronics. Even Google now has an alliance with Ford. Whether the Apple culture could tolerate working with a normal auto manufacturer has been a question in my mind since I first heard about them getting into the self-driving vehicle business. Maybe they could take some of that offshore cash they have and buy Mitsubishi Motors from the Mitsubishi group.


----------



## phrelin

The Santa Rosa, California, _Press Democrat_ offered up this opinion piece *Close to Home: Allowing self-driving cars is a civil rights issue*: It's an interesting piece and, because it's California, the viewpoint will likely gain traction. Here are the two concluding paragraphs:



> The government has good reasons to proceed with caution. Just as no one wants drones interfering with airports or firefighting efforts, so no one wants driverless cars careening around out of control on our roads. We surely need a balance between caution and innovation.
> 
> What will surely push us toward innovation, however, is not the prospect of speeding up traffic jams or eliminating drunken driving - fine as those results might be. What we cannot ignore, or put off long, are the claims of disabled people to a life of as much independence as possible.


I know in this thread we've touched on the potential benefits to the disabled and elderly. But I have to admit, I never considered the legal and political aspects in the context of DMV rule-making.


----------



## Tom Robertson

phrelin said:


> The Santa Rosa, California, _Press Democrat_ offered up this opinion piece *Close to Home: Allowing self-driving cars is a civil rights issue*: It's an interesting piece and, because it's California, the viewpoint will likely gain traction. Here are the two concluding paragraphs:
> 
> I know in this thread we've touched on the potential benefits to the disabled and elderly. But I have to admit, I never considered the legal and political aspects in the context of DMV rule-making.


Great find, Phrelin. Those are interesting aspects.

Will California see a lawsuit to let handicapped people use driverless cars? Or will the courts require all townships and cities to provide economical paratransit everywhere, anytime?

Peace,
Tom


----------



## James Long

Any handicapped person should have the same right to drive a "driverless" car as they do any other vehicle. If the vehicle is not within their physical capacity to operate, that is not a case of discrimination.

Cars that are modified for the handicapped are still regulated by the state and their drivers still must maintain a valid driver's license (with appropriate restrictions).


----------



## Tom Robertson

James Long said:


> Any handicapped person should have the same right to drive a "driverless" car as they do any other vehicle. If the vehicle is not within their physical capacity to operate, that is not a case of discrimination.
> 
> Cars that are modified for the handicapped are still regulated by the state and their drivers still must maintain a valid driver's license (with appropriate restrictions).


Interesting but rather unhelpful point of view. Like saying people in wheelchairs can use the front steps of buildings like anyone else, so we don't need an ADA to force accommodations for others. "Don't know about them wheelchair ramps... they aren't as safe as steps..." 

At some point driverless cars will prove themselves to be as safe or safer than humans driving cars. At that point, if the government won't allow people to use driverless cars, they are inadvertently discriminating against people who could use the technology. (At least I hope it is inadvertent.)

One could say governments should wait until the cars prove themselves. My point is they should have all the regulations ready for the moment it happens--not start the final set of political negotiations until after the cars prove themselves. California is showing some ability to anticipate with regulations--yet they seem to be saying it will take 3 years--without actual evidence that it will take 3 years.

Peace,
Tom


----------



## James Long

Tom Robertson said:


> Interesting but rather unhelpful point of view. Like saying people in wheelchairs can use the front steps of buildings like anyone else, so we don't need an ADA to force accommodations for others.


No, it is not the same.

First of all, the ADA does not require equal access. The front steps of the Lincoln Memorial and other monuments in Washington DC have no ramps. The steps of the Supreme Court of the US have no ramps. There is ADA access to all of these buildings ... but the front steps? No. I am sure you can find public and private structures in your own city that are fully compliant with the ADA but have no accommodation for a wheelchair to use the front steps. Your example fails.

The proposed California law requires (my summary) that a driver be capable of taking control of the vehicle at any moment. That law is non-discriminatory. It does not require male drivers to be able to take control but not female drivers. It does not require black drivers to be able to take control but not Asian drivers. It requires a licensed driver to be behind the controls ready to take control of the vehicle at any time.

For ADA compliance a car that is capable of being operated by a handicapped driver could have driverless technology added in addition to any modifications needed for that specific driver. No discrimination in the law. The rule that a licensed driver capable of operating the vehicle be behind the controls applies to all. No discrimination.


----------



## Tom Robertson

James Long said:


> No, it is not the same.
> 
> First of all, the ADA does not require equal access. The front steps of the Lincoln Memorial and other monuments in Washington DC have no ramps. The steps of the Supreme Court of the US have no ramps. There is ADA access to all of these buildings ... but the front steps? No. I am sure you can find public and private structures in your own city that are fully compliant with the ADA but have no accommodation for a wheelchair to use the front steps. Your example fails.
> 
> The proposed California law requires (my summary) that a driver be capable of taking control of the vehicle at any moment. That law is non-discriminatory. It does not require male drivers to be able to take control but not female drivers. It does not require black drivers to be able to take control but not Asian drivers. It requires a licensed driver to be behind the controls ready to take control of the vehicle at any time.
> 
> For ADA compliance a car that is capable of being operated by a handicapped driver could have driverless technology added in addition to any modifications needed for that specific driver. No discrimination in the law.


Did I say the front steps had to be replaced? Please, read more carefully. My comment was as if someone said the front steps alone were sufficient in providing access to the inside, not that they needed to be replaced. The ADA does require equal access--not necessarily the same doorways--as that is not the access to be equalized. The resources within are the real access to be gained.

Similarly, to deny any sufficiently proven technology because "the front steps are good enough for most people" is exactly the same thing. "All handicapped people have access to the front steps" is not giving them access to the resources. "All handicapped people must be capable of [needlessly] taking over the actual control of the car" is just as useless. They can't use the front steps so they are denied real access. They can't drive the car so they are also being denied real access.

There is a clause about affordability in the ADA--but since the government is not paying for the cars, that weasel clause doesn't apply.

Yes, *today* it is reasonable to require a capable driver--the cars are still in development. They won't be in development mode forever.

Peace,
Tom


----------



## inkahauts

Well, do we actually have any laws saying it's illegal in any way to use a driverless car today? Seriously, I don't know.


----------



## James Long

Tom Robertson said:


> Yes, *today* it is reasonable to require a capable driver--


Thank you for your agreement.

When the time comes where a capable driver is no longer required I am sure the law will apply equally to the handicapped. No ADA red herring arguments needed.


----------



## James Long

inkahauts said:


> Well, do we actually have any laws saying it's illegal in any way to use a driverless car today? Seriously, I don't know.


I believe California may be the first.

Most are operating in a gray area where what they are doing isn't specifically illegal. Most companies are protecting themselves from liability by having a licensed driver monitor the vehicle's operation and/or proactively clearing their vehicles for use on roads.

Tesla seems to be the "rogue" with automated features being released before specific laws permit or restrict such features. They have scaled back their releases to limit their liability. But if one tells their Tesla to go park itself, who is responsible for any accident the car is involved in while unoccupied?

I'd be responsible for my vehicle if I took my hands off the wheel or walked away from it while it was running. I expect that Tesla drivers would be held responsible for their vehicle's operation ... even without specific new laws.


----------



## phrelin

Tom Robertson said:


> Will California see a lawsuit to let handicapped people use driverless cars? Or will the courts require all townships and cities provide economical paratransit everywhere, anytime?


One of the more effective lobbying groups here is the Californians for Disability Rights. Then there's Disability Rights Advocates, one of the leading nonprofit disability rights legal centers in the nation, based in Berkeley, California, and New York City. My guess is that the DMV will not want to alienate either. State and local government here are well aware of them.

Frankly I was surprised I was reading a piece in the _Press Democrat_. So I did a Google search, only to discover that Disability.gov, the U.S. federal government website for information on disability programs and services, has a link to a report from the National Council on Disability: a 2015 report on self-driving cars and what they could mean for people with disabilities, *Self-Driving Cars: Mapping Access to a Technology Revolution*, which discusses the potential legal barriers.

Seems the matter is already under scrutiny.


----------



## Tom Robertson

James Long said:


> Thank you for your agreement.
> 
> When the time comes where a capable driver is no longer required I am sure the law will apply equally to the handicapped. No ADA red herring arguments needed.


Seems like we already have an example of California disagreeing with your assessment that when capable drivers are no longer required they will change their laws. They've actually decided to ignore any evidence and make the driver required regardless of driverless capability. So while the law applies equally (a red herring of its own) it does not affect everyone equally. *Today* you aren't affected by being required to be driver capable. A blind person is greatly affected.

A _reductio ad absurdum_ example would be a law that all green-eyed people can't drive. The law applies to everyone yet only affects green-eyed people (who simply need to get cornea transplants if they want to drive...)

Another red herring is suggesting regulation requiring two capable drivers, one human and one machine, is safer.

Peace,
Tom


----------



## Tom Robertson

phrelin said:


> One of the more effective lobbying groups here is the Californians for Disability Rights. Then there's Disability Rights Advocates, one of the leading nonprofit disability rights legal centers in the nation, based in Berkeley, California, and New York City. My guess is that the DMV will not want to alienate either. State and local government here are well aware of them.
> 
> Frankly I was surprised I was reading a piece in the _Press Democrat_. So I did a Google search, only to discover that Disability.gov, the U.S. federal government website for information on disability programs and services, has a link to a report from the National Council on Disability: a 2015 report on self-driving cars and what they could mean for people with disabilities, *Self-Driving Cars: Mapping Access to a Technology Revolution*, which discusses the potential legal barriers.
> 
> Seems the matter is already under scrutiny.


Great link to the November, 2015 report. The executive summary and table of contents look very promising.

One quote from the Attitudinal Barriers section of the summary: 


> As required by Title II of the ADA, restrictions on AVs must be based on evidence of actual risk, not unsupported generalizations about the capabilities of people with disabilities.


Peace,
Tom


----------



## James Long

Tom Robertson said:


> Another red herring is suggesting regulation requiring two capable drivers, one human and one machine, is safer.


The car is not a capable driver. Perhaps in the investor's or futurist's dreams, but not today.
The day when a car is considered a capable driver by an authority isn't close. Not when the laws being introduced are becoming restrictive instead of permissive.


----------



## Tom Robertson

James Long said:


> The car is not a capable driver. Perhaps in the investor's or futurist's dreams, but not today.
> The day where a car is considered a capable driver by an authority isn't close. Not when the laws being introduced are becoming restrictive instead of permissive.


No one but you is arguing the car is or is not a capable driver *today*. 

Alas, California is proposing regulations that will require two capable drivers even after the car has become capable. The regulations don't determine that the car is incapable; they define it as such in spite of what will, by then, be reasonable evidence.

Which, by the way, opens California up to an ADA lawsuit by judging handicapped persons as incapable of using an autonomous vehicle after evidence has shown they can. (Again, this is the future, when cars will be capable, yet the way the regulations are drafted, they won't be permitted.)

I'm curious how the California workshops will go this week and next. They could prove interesting. 

Peace,
Tom


----------



## James Long

Tom Robertson said:


> No one, but you, is arguing the car is or is not a capable driver *today*.


The phrasing of your statements seems to make that claim.
Thank you for agreeing that cars are not capable of driving themselves.


----------



## phrelin

Personally, I'm still grimacing when I read some writing using the term "autonomous" vehicle. But I have no doubt that we'll reach a time when a vehicle can do the "driving" between point A and point B in many areas. However, I do want someone to be in charge of determining what location constitutes point B and when the vehicle is supposed to start going there from point A, even if that someone monitors the vehicle remotely. I really don't want my car going out for a quick charge without my permission. I don't want to deal with another teenager.


----------



## James Long

phrelin said:


> Personally, I'm still grimacing when I read some writing using the term "autonomous" vehicle. But I have no doubt that we'll reach a time when a vehicle can do the "driving" between point A and point B in many areas. However, I do want someone to be in charge of determining what location constitutes point B and when the vehicle is supposed to start going there from point A, even if that someone monitors the vehicle remotely. I really don't want my car going out for a quick charge without my permission. I don't want to deal with another teenager.


I see your point!

I suppose the "autonomous" part will be you tell it where to go and it decides how to get there. Kinda like a glorified cab ride ... "take me to work, but avoid the expressway". Or "take me to church" ... no, don't play the song! DRIVE me to the church!

Once the vehicles learn what you mean they will be better at getting you where you want to go. And hopefully the conversation with the car will be clearer than the funny Siri conversations one can find online. And the GPS will follow actual roads and not instruct the car to turn where there isn't a road.

"Autonomous" is scary ... the machines need to be under human control. Even after we let them drive.


----------



## Tom Robertson

phrelin said:


> Personally, I'm still grimacing when I read some writing using the term "autonomous" vehicle. But I have no doubt that we'll reach a time when a vehicle can do the "driving" between point A and point B in many areas. However, I do want someone to be in charge of determining what location constitutes point B and when the vehicle is supposed to start going there from point A, even if that someone monitors the vehicle remotely. I really don't want my car going out for a quick charge without my permission. I don't want to deal with another teenager.


As the technology evolves, and even more so as we get used to it, we'll decide what we'll let the cars do "on their own", via general controls. While today you say you don't want the car to go get a charge without your knowing about it, someday you might be grateful your car remembered to "fill up" cuz you were too busy to.

Much like we tell our family, "I'll be right back," you'll ride somewhere and tell your car, "I'll be here a while." It will do the right thing based on knowing what you mean by that phrase. Which might be different from my "I'll be here a while." 

Does the car go park farther away when I'll be here a while? Get a charge? Pick up my wife from some trip she's on? Remind me she is scheduled for a trip, thus allowing me to release it to take her and come back to get me? We'll all have a new language with our car. Could be interesting and fun to watch. 

Peace,
Tom


----------



## Tom Robertson

James Long said:


> I see your point!
> 
> I suppose the "autonomous" part will be you tell it where to go and it decides how to get there. Kinda like a glorified cab ride ... "take me to work, but avoid the expressway". Or "take me to church" ... no, don't play the song! DRIVE me to the church!
> 
> Once the vehicles learn what you mean they will be better at getting you where you want to go. And hopefully the conversation with the car will be clearer than the funny Siri conversations one can find online. And the GPS will follow actual roads and not instruct the car to turn where there isn't a road.
> 
> "Autonomous" is scary ... the machines need to be under human control. Even after we let them drive.


Sounds like the scary part is giving up control. We think we have control over technology now, and many don't want to give it up. It takes time to trust some changes. Other changes, I'm guessing ones we predict or want, we embrace. I loved DVRs immediately--I saw how they were cool. For some technologies I see the advantage, yet am not sold that the bugs are worked out. It will take me a few minutes to really embrace time travel, for instance. 

Autonomous/driverless/whatever they get called cars are closer to DVRs for me. While I recognize there are some major challenges still, the advantages powerfully speak to me. The Google videos have shown me how much closer they have come to being ready than I expected. I understand the technology (and, perhaps just as importantly the approach), so real concerns don't turn into fears--they fall away. I can mark them complete on my punchlist as each is solved, tested, verified.

No, my punchlist is not completely checked off. And in fact, if the only time I can't let the car drive itself is snow or fog for a couple years, I'm ok with that.  Though Ford is making progress on them too.

After all, we already are comfortable with driving around with 15-40 gallons of high explosive--because we solved most of the problems with it. Yet yesterday a friend stopped a car-b-que from becoming explosive by happening to be in the right place to spot it and extinguish the fire in time. Which, by the way, is another way we thought we were in control of the car, yet a broken fuel line and a spark reminds us we aren't in control--we have merely reduced the risk to acceptable levels.

Peace,
Tom


----------



## dennisj00

Wife doesn't use / trust the backup camera in our Leaf. Actually it's a 360 degree view and works great for parallel parking or any close maneuvering.

She may get used to it, but it probably doesn't matter. Our next car will park itself! (and more!)


----------



## James Long

Tom Robertson said:


> It will take me a few minutes to really embrace time travel, for instance.


And then you will get those minutes back (depending on one's theory of time travel).

Or there could be consequences ...
http://www.dbstalk.com/topic/219079-time-travel-every-second-counts-or-does-it/

New technology is easier to accept when it is not a life or death decision. Sure, it is not a good day when the DVR dumps shows you have not watched (and the VCR never did that?). But nobody got hurt.



Tom Robertson said:


> And in fact, if the only time I can't let the car drive itself is snow or fog for a couple years, I'm ok with that.


At the times when the car needs a driver more skilled than itself (or at least more willing to take risks), it will turn over control to a human who is getting less driving experience on a regular basis.

The risk is the key. As we discussed months ago when we were talking about liability ... manufacturers and owners (responsible parties) are going to need to decide where to draw the line and say "you drive". Actuarial decisions based on a balance sheet of risks.


----------



## Tom Robertson

James Long said:


> And then you will get those minutes back (depending on one's theory of time travel).
> 
> Or there could be consequences ...
> http://www.dbstalk.com/topic/219079-time-travel-every-second-counts-or-does-it/
> 
> New technology is easier to accept when it is not a life or death decision. Sure, it is not a good day when the DVR dumps shows you have not watched (and the VCR never did that?). But nobody got hurt.
> 
> The times when the car needs a driver more skilled than the car (or at least more willing to take risks) it will turn over control to a human who is getting less driving experience on a regular basis.
> 
> The risk is the key. As we discussed months ago when we were talking about liability ... manufactures and owners (responsible parties) are going to need to decide where to draw the line and say "you drive". Actuarial decisions based on a balance sheet of risks.


Yes, for some the decision to ditch horse and buggy technologies was very difficult. Them cars go too fast, use explosives, catch fire, break down, etc. 

As for risk--as soon as autonomous cars prove themselves to be safer than human cars, they will go on sale--at least somewhere. (Probably not California, sadly.) 

As for liability--the insurance companies will sort it out. On day one, my insurance company will likely charge me exactly what they charge me the day before. I can live with that--I want the features, not worried about the financial savings initially. The savings will come soon enough as the cars save money and as they work out the liability split.

Peace,
Tom


----------



## phrelin

I must note that properly equipped airplanes can fly themselves, even take off and land. But pilots are still required. Sure it's scarier at 150+ mph or at 20,000 feet. But whether your plane flies into a cliff or your car drives over a cliff you're just as dead. The thing is that on the highway someone has to be monitoring what the self-driving car is doing, just like someone should be monitoring what the plane is doing on autopilot.

The definition not related to government at Dictionary.com for "autonomous" is "not subject to control from outside; independent." If we get a car that is autonomous, basically you have a really heavy robot with AI. I know that part of the hope from the tech research for these vehicles is to move closer to AI. But I think I'd rather have a humanoid robot driving my car at that point - perhaps one that can be plugged into the sensors on the vehicle.


----------



## Tom Robertson

phrelin said:


> I must note that properly equipped airplanes can fly themselves, even take off and land. But pilots are still required. Sure it's scarier at 150+ mph or at 20,000 feet. But whether your plane flies into a cliff or your car drives over a cliff you're just as dead. The thing is that on the highway someone has to be monitoring what the self-driving car is doing, just like someone should be monitoring what the plane is doing on autopilot.


The difference is that currently autopilots can merely fly--they can't see. The human is needed to see because the necessary sensors aren't in planes yet. And 500 mph probably requires an order of magnitude (or more) better sensing/computing than 80 mph does. (Then the sensors need to be integrated into the autopilot as comprehensively as Google does with cars.)



phrelin said:


> The definition not related to government at Dictionary.com for "autonomous" is "not subject to control from outside; independent." If we get a car that is autonomous, basically you have a really heavy robot with AI. I know that part of the hope from the tech research for these vehicles is to move closer to AI. But I think I'd rather have a humanoid robot driving my car at that point - perhaps one that can be plugged into the sensors on the vehicle.


Remember, dictionary definitions are snapshots of human usage and are already obsolete by the time the linguists decide what to include. If no one is directly controlling their instantaneous movements, only their larger contextual movements, is that not autonomous?

Peace,
Tom


----------



## Cholly

Umm...it's now 2016 and the topic of this thread is about driverless cars hitting the road in 2015. Time to move on? :coffee


----------



## Tom Robertson

Cholly said:


> Umm...it's now 2016 and the topic of this thread is about driverless cars hitting the road in 2015. Time to move on? :coffee


We could start a new thread quarterly. Or annually.

Peace,
Tom


----------



## yosoyellobo

Cholly said:


> Umm...it's now 2016 and the topic of this thread is about driverless cars hitting the road in 2015. Time to move on? :coffee


Hey, I've been waiting for my flying car since 1955.


----------



## phrelin

Tom Robertson said:



> As for risk--as soon as autonomous cars prove themselves to be safer than human cars, they will go on sale--at least somewhere. (Probably not California, sadly.)


I wouldn't rule out California just yet, although we won't pander to some automaker but probably would to Google. From the_ Sacramento Bee_ this morning *AM Alert: California jump-starts discussion on self-driving car rules*:



> With dozens of manufacturers pushing to get self-driving cars onto the road, California has grappled for the last several years to craft rules that protect public safety without hindering the development of a potentially life-saving technology.
> 
> When the Department of Motor Vehicles finally unveiled draft regulations in December, they significantly slowed the timeline for public availability of autonomous vehicles until the state is confident that they are safe. Most notably, the agency included a requirement that the cars have a steering wheel and a licensed driver ready to take over if they fail.
> 
> The DMV is set to hold one of two workshops to get public input on the draft, including training and privacy rules, starting at 10 a.m. at the Harper Alumni Center at Sacramento State.


As noted further in that article, the California-based group Consumer Watchdog, which will be at that meeting, is backing the DMV rules, as stated in its news release today *Bikes, Pedestrians, Other Cars and Tree Branches Among Real-Road Scenarios Robot Cars Can't Handle*:



> "The need to require a licensed driver behind the wheel is obvious after a review of the results from seven companies that have been testing since September 2014: Robot cars are still not capable of dealing reliably with real-life situations," said John M. Simpson, Consumer Watchdog's Privacy Project director.
> 
> Under the autonomous car testing regulations, the companies were required to file "disengagement reports" explaining when a human test driver had to take control. The reports show that the cars are not always capable of "seeing" pedestrians and cyclists, traffic lights, low-hanging branches, or the proximity of parked cars, suggesting too great a risk of serious accidents involving pedestrians and other cars. The cars also are not capable of reacting to reckless behavior quickly enough to avoid the consequences, the reports showed.
> 
> "The companies' own evidence makes clear that a human driver able to take control of the vehicle is necessary to ensure the safety of both robot vehicles and other Californians on the road," Simpson said at a DMV workshop on autonomous vehicle regulations.


As a side comment I find it amusing that these folks call them "robot cars" and "autonomous vehicles" which allows them to create a more dangerous image than a "self-driving vehicle."

I see the real possibility of, say, a Ford vehicle with mostly standard "driver's side" controls using Google self-driving technology on California roads in the relatively near future. Someone is going to have to sit in that driver's seat for now and I think Google can make sure that "beta testers" actually understand what that term means.

It's an evolutionary process. Who knows how it will have evolved by 2030.


----------



## dennisj00

I have seen the future and the future is now!

Spent a thoroughly enjoyable couple of hours this morning test driving a Tesla Model S. I have never been more impressed with the quality and design of a car, how it drives and the AutoPilot was very impressive. It was much further along than I expected.

It did a great job at interstate speeds or 4 lane stop and go. It did feel weird to let the car take over but I'm sure after a week or so it would be very comfortable.

A friend is considering one or the Model X and we're lusting very heavily for one after driving it. It may work by dropping down to one car. We live about 10 minutes from a free charger and have 3 rental agencies within a few minutes on the few times we need two cars.

It was also a pleasant experience with the dealer - no high pressure 'what do we need to get you in one today' or 'let me go talk to the manager' -- just the facts on the car.


----------



## phrelin

dennisj00 said:


> I have seen the future and the future is now!
> 
> Spent a thoroughly enjoyable couple of hours this morning test driving a Tesla Model S. I have never been more impressed with the quality and design of a car, how it drives and the AutoPilot was very impressive. It was much further along than I expected.
> 
> It did a great job at interstate speeds or 4 lane stop and go. It did feel weird to let the car take over but I'm sure after a week or so it would be very comfortable.
> 
> A friend is considering one or the Model X and we're lusting very heavily for one after driving it. It may work by dropping down to one car. We live about 10 minutes from a free charger and have 3 rental agencies within a few minutes on the few times we need two cars.
> 
> It was also a pleasant experience with the dealer - no high pressure 'what do we need to get you in one today' or 'let me go talk to the manager' -- just the facts on the car.


They are remarkable automobiles, albeit still expensive.

When I can plug one into my home for charging and have it reverse the flow to power my home during a power outage--that would make it irresistible. Frustratingly, the *Tesla Powerwall* home battery system is still available only on a "reserve and we'll let you know" basis.


----------



## dennisj00

phrelin said:


> They are remarkable automobiles, albeit still expensive.
> 
> When I can plug one into my home for charging but which will reverse the flow and power my home during a power outage - that would make it irresistible. Frustratingly the *Tesla Powerwall* home battery system is still available on a "reserve and we'll let you know" basis.


There have been articles over the last decade (or more) suggesting that EVs' battery storage could be a great supplement to the power grid, since most of our current generation cannot be stored. Tesla's Powerwall could also help with that.

I've looked into the PW, but at 10.5 cents per kWh here, it doesn't compute yet. I'm giving a few kWh back to my utility on sunny days from my solar panels because I've reduced my base load (LEDs, fewer computers) since I installed them.

I'm sure the PW will become more available once their battery plant is in production. That's also a big card in their hand to lower the cost of the vehicles.

Another problem here on the East coast is gas at $1.80 doesn't offset the cost like it did at $3 or $4. But as always, that won't last long.

One note about the dealership in Charlotte. They apparently took over a defunct dealership on East Independence Auto Row ... several acres of parking (empty), a nice dealership with a half dozen demo cars, a chassis and charging station to see, and a service department (also EMPTY).


----------



## Drucifer

One of the tech mags was reporting that this changeover in the auto industry is like what happened when the telephone replaced the telegraph.


----------



## James Long

dennisj00 said:


> It was also a pleasant experience with the dealer - no high pressure 'what do we need to get you in one today' or 'let me go talk to the manager' -- just the facts on the car.


Dave Moody and callers were talking about Tesla test drives early this week (before the news about Tony Stewart took over the station). Perhaps their associates have become accustomed to the non-serious buyer doing a test drive. A lot of people like to give a Tesla a drive just to say they have driven one ... and getting more people acquainted with the product can't hurt. Perhaps it won't make this sale - but it will build toward the actual sales down the road.


----------



## inkahauts

I believe a lot also has to do with them being company owned. I don't believe they are all on commission. They are there more to present the car and let the car sell itself. Kind of like how Saturn approached the sales with no haggling. 

They realize the entire experience needs to be great to get word of mouth to push their cars for them.


----------



## dennisj00

James Long said:


> Dave Moody and callers were talking about Tesla test drives early this week (before the news about Tony Stewart took over the station). Perhaps their associates have become accustomed to the non-serious buyer doing a test drive. A lot of people like to give a Tesla a drive just to say they have driven one ... and getting more people acquainted with the product can't hurt. Perhaps it won't make this sale - but it will build toward the actual sales down the road.


Trust me, I'm serious about buying one of these cars. As an owner of an EV for over 4 years, I'll never buy an ICE vehicle again.

You have to realize these cars can drive around 20 or more miles on the electricity used in refining a single gallon of gas.

If half of us jumped on board, it would do a big favor for the planet.


----------



## James Long

Where does electricity come from? (2014)
Coal = 39%
Natural gas = 27%
Nuclear = 19%
Hydropower = 6%
Other renewables = 7%
(Biomass = 1.7%, Geothermal = 0.4%, Solar = 0.4%, Wind = 4.4%)
Petroleum = 1%
source: US Energy Information Administration

"In 2013, the electricity sector was the largest source of U.S. greenhouse gas emissions, accounting for about 31% of the U.S. total. Greenhouse gas emissions from electricity have increased by about 11% since 1990 as electricity demand has grown and fossil fuels have remained the dominant source for generation."
source: United States Environmental Protection Agency

Hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), and all-electric vehicles (EVs):
"PHEVs and EVs typically have a well-to-wheel emissions advantage over similar conventional vehicles running on gasoline or diesel. In regions that depend heavily on conventional fossil fuels for electricity generation, PEVs may not demonstrate a well-to-wheel emissions benefit."
source: United States Department of Energy

There is a calculator at the third site where one can determine if they are in a region where PEVs demonstrate a benefit or not. The national average shows an emissions benefit.

BTW: Here is an article from 2010 about what can be done with the batteries:
http://www.nytimes.com/2010/06/13/automobiles/13RECYCLE.html
And the Department of Energy's current information on batteries:
http://www.afdc.energy.gov/vehicles/electric_batteries.html
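For anyone who wants to plug their own region into a comparison like the DOE calculator's, the arithmetic can be sketched roughly as follows. The grid shares are the 2014 figures quoted above, and 8,887 g CO2/gallon is the EPA's standard tailpipe figure for gasoline; the per-source emission factors, the EV efficiency, and the gasoline-car MPG are illustrative assumptions, not agency data:

```python
# Back-of-envelope well-to-wheel CO2 comparison using the 2014 US grid mix.
# Emission factors, EV efficiency, and gas-car MPG are rough assumptions.

GRID_MIX = {                     # share of US generation, 2014 (quoted above)
    "coal": 0.39, "natural_gas": 0.27, "nuclear": 0.19,
    "hydro": 0.06, "other_renewables": 0.07, "petroleum": 0.01,
}
G_CO2_PER_KWH = {                # approximate combustion-only factors (assumed)
    "coal": 1000, "natural_gas": 450, "petroleum": 900,
    "nuclear": 0, "hydro": 0, "other_renewables": 0,
}

# Weighted average emissions per kWh of grid electricity
grid_g_per_kwh = sum(share * G_CO2_PER_KWH[src] for src, share in GRID_MIX.items())

EV_MILES_PER_KWH = 3.5           # assumed typical EV efficiency
G_CO2_PER_GALLON = 8887          # EPA tailpipe figure for a gallon of gasoline
GAS_CAR_MPG = 25                 # assumed typical gasoline car

print(f"grid average: {grid_g_per_kwh:.0f} g CO2/kWh")
print(f"EV:  {grid_g_per_kwh / EV_MILES_PER_KWH:.0f} g CO2/mile")
print(f"gas: {G_CO2_PER_GALLON / GAS_CAR_MPG:.0f} g CO2/mile")
```

Even with a coal-heavy national mix, the EV comes out ahead per mile under these assumptions; in a region that is almost entirely coal, the gap narrows, which is the DOE's point about regional variation.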


----------



## phrelin

James Long said:


> Hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), and all-electric vehicles (EVs):
> "PHEVs and EVs typically have a well-to-wheel emissions advantage over similar conventional vehicles running on gasoline or diesel. In regions that depend heavily on conventional fossil fuels for electricity generation, PEVs may not demonstrate a well-to-wheel emissions benefit."
> source: United States Department of Energy
> 
> There is a calculator at the third site where one can determine if they are in a region where PEVs demonstrates a benefit or not. The national average shows a benefit for emissions.
> 
> BTW: Here is an article from 2010 about what can be done with the batteries:
> http://www.nytimes.com/2010/06/13/automobiles/13RECYCLE.html
> And the Department of Energy's current information on batteries:
> http://www.afdc.energy.gov/vehicles/electric_batteries.html


I hadn't seen the calculator before and it is good to confirm that in my part of the world we can have a substantial well-to-wheel emissions advantage. Now the only problem is that most of what little mileage we put on our cars involves round trips in excess of 30 miles one way.

I have concerns about the batteries.

In my years of experience with various kinds of solid waste and recycling, I have seen a lot of abuse to cut cost corners particularly when the recycling product is in surplus on the market. Where there is no real enforcement system, problems arise. Of course in our country, due diligence is the watchword unless, of course, you're drinking water in Flint or the over 1,000 cities in violation of EPA drinking water standards.

And as with all new environmental situations we really don't know much about our new battery technology. In the past it has taken decades to understand what the impact of our decisions have on the Earth. Here's one example story *A lack of battery recycling could be hurting this important bacterium*.

I've never really understood our need to assume that every new technology development is a perfect solution to all relevant problems particularly when the key word is "new" and the promoters' primary objective is to make a lot of money. We really have no idea what the long-term impact of using large quantities of lithium will be on our environment. We just know it will be better than what our great-grandfathers thought would be the perfect solution to....

I know, I know, I'm just a grumpy old man who doesn't appreciate the brilliance of the next generation. :sure:


----------



## yosoyellobo

You are not grumpy Phrelin.


----------



## phrelin

Today's Sacramento Bee - the Capitol's politically influential newspaper - has a very good, comprehensive story *Sacramento Bee's transportation reporter puts Google's self-driving car to the test* that I think indicates the future pace of implementing self-driving technology here. It's pretty positive in a realistic way.


----------



## dennisj00

phrelin said:


> Today's Sacramento Bee - the Capitol's politically influential newspaper - has a very good, comprehensive story *Sacramento Bee's transportation reporter puts Google's self-driving car to the test* that I think indicates the future pace of implementing self-driving technology here. It's pretty positive in a realistic way.


Good article and in 25 words or less, pretty much matches my time with the AutoPilot on the Tesla. Of course, I didn't leave the driver's seat and hands and feet were at the ready while the car was driving.

We did have a car pull out of a side road about 50' in front of us and the car made a nice slow-down and transition back to following it about 2 car lengths (~45 mph).

I'd love to have a week or so with one. Maybe in 3 or 4 months.

As far as embracing new technologies, we certainly substitute a new set of problems with each major change- hopefully a smaller, more manageable set of problems, but I can't imagine our country with wood or coal fired heating in every home - I remember a lot of that as a child - or what our cities would look like if we still used horse and buggies. We'd be recycling horse manure and worrying about their methane problem!

I've had a hybrid with metal hydride batteries since 2007, and at maintenance last month they continued to pass with no major degradation at ~90,000 miles. Tesla warrants batteries for 8 years or 80,000 miles. Both EVs (2.5 years and 2 years old) did very well with lithiums.

All of these batteries could have a second life with their reduced capacity in your home storage. Or re-cycled back into new technology.

Recycling sounds like the answer to everything until I read that only 50% of what is recycled in New York City is actually re-cycled!


----------



## phrelin

dennisj00 said:


> Recycling sounds like the answer to everything until I read that only 50% of what is recycled in New York City is actually re-cycled!


About 25 years ago I was the first chairman of the first solid waste planning agency in Mendocino County, California. We began the process of establishing governmental implementation policies at the local level for both recycling as well as landfill/disposal sites. When I made the comment about me being a grumpy old man, in the case of recycling it is because I've watched the process become a battle between the regulatory/subsidies effort versus economic pressures. And let's be honest, the various "Silicon Valleys" represent a huge source of funds for lobbying and disinformation for the new corporations which have replaced the old steel, textile, etc., economic interests of the early-to-mid 1900's.

IMHO either an iPhone should cost at least 50% more in taxes, or a big chunk of that cash Apple has stored offshore to avoid taxes should be "obtained", to be used to understand the long-term biological impacts of Li-ion batteries and to assure their proper handling. (I'm using Apple as a stand-in for all companies that produce products using these batteries, which range from cell phones, computers and other "personal devices" to vehicles, power tools, aviation systems, etc.) If you look at the *Wikipedia entry* on the subject, you almost get the impression that these batteries are benign:



> Since Li-ion batteries contain less toxic metals than other types of batteries which may contain lead or cadmium they are generally categorized as non-hazardous waste.


That they are "generally categorized as non-hazardous waste" by governmental regulators around the world is alarmingly true. And we are regularly assured everything is just fine. This lie is being propagated, IMHO, for economic reasons.

In 2013 Dr. Daniel Hsing Po Kang (of the School of Social Ecology, University of California, Irvine), Dr. Mengjun Chen (of the Key Laboratory of Solid Waste Treatment and Resource Recycle, Southwest University of Science and Technology, Mianyang, China) and Dr. Oladele A. Ogunseitan (Department of Population Health and Disease Prevention, University of California, Irvine) published a study funded by the American Chemical Society entitled *Potential Environmental and Human Health Impacts of Rechargeable Lithium Batteries in Electronic Waste* (note: the link is to a PDF summary file) which concluded:



> Using standardized leaching tests, hazard assessment models and other methods for evaluating hazardous waste, the scientists showed that Li-ion batteries from cell phones would meet federal government definitions of hazardous waste because of lead content. California standards would classify them as hazardous due to cobalt, copper and nickel content.


They didn't address the problem of lithium because "the relative contribution of aluminum and lithium to human toxicity and ecotoxicity could not be estimated due to insufficient toxicity data in the models." But it isn't like we know nothing about lithium.

The more common worries (other than explosion) about lithium from an industrial processing standpoint are well-known. Breathing lithium dust or lithium compounds (which are often alkaline) initially irritate the nose and throat, while higher exposure can cause a buildup of fluid in the lungs, leading to pulmonary edema. The metal itself is a handling hazard because of the caustic hydroxide produced when it is in contact with moisture. From a physiological standpoint, there is significant research regarding lithium because of the lithium salts used as a psychiatric medication. Common side effects include increased urination, shakiness of the hands, and increased thirst. Serious side effects include hypothyroidism, _diabetes insipidus_, and lithium toxicity. If blood levels become too high diarrhea, vomiting, poor coordination, sleepiness, and ringing in the ears may occur. And, of course, they may increase the risk of developing Ebstein's cardiac anomaly in infants born to women who take lithium during the first trimester of pregnancy. Given this information it's startling that we don't have major research programs on the addition of large volumes of lithium in our environment.

The standard American approach to implementing new technology is to (i) embrace the new and (ii) ignore the long term "side" effects and regulatory problems until ...ah... the long term has passed when, if we are going to have created a problem, we will have created a problem. And we've pretty much done this in the case of lithium ion batteries. The researchers point out that:



> ...Li-ion batteries have become mainstays for powering everything from smart phones to components in new jetliners, with global sales approaching $8 billion annually. They realized that the short life span (2-4 years) of Li-ion batteries in portable electronic devices would make a huge contribution to the electronic waste problem, which already is the fastest growing form of solid waste.


I believe I should be a grumpy old man about this subject. Adding millions of vehicles powered by these batteries is a legitimate worry.


----------



## dennisj00

phrelin said:


> I believe I should be a grumpy old man about this subject. Adding millions of vehicles powered by these batteries is a legitimate worry.


These batteries will be 'contained' and re-cycled / re-used much more than personal item batteries (phones, laptops, etc.)

I don't think it will ever approach the problems of gasoline - remember the lead problem of the 50s/60s? Or the ground contamination at every corner gas station. Go get an appraisal at any gas station or truck stop property and see how much cleanup is required. Or even a junk yard with lead-acid batteries.

Off to watch the Super Bowl - Go Panthers!!


----------



## Tom Robertson

phrelin said:


> ...[Excellent commentary redacted for space reasons--please read phrelin's original post!]
> The standard American approach to implementing new technology is to (i) embrace the new and (ii) ignore the long term "side" effects and regulatory problems until ...ah... the long term has passed when, if we are going to have created a problem, we will have created a problem. And we've pretty much done this in the case of lithium ion batteries. The researchers point out that:
> 
> I believe I should be a grumpy old man about this subject. Adding millions of vehicles powered by these batteries is a legitimate worry.


First off: thanks for the excellent history lessons.

My general feeling was that a full ecology will develop around the batteries as more and more cars use them. You've slightly adjusted my thinking by reminding me that one healthy role for government would be to facilitate that ecology's development through a mixture of taxes, incentives, and regulations. Let's make it happen.

Peace,
Tom


----------



## James Long

dennisj00 said:


> I don't think it will ever approach the problems of gasoline - remember the lead problem of the 50s/ 60s ? Or the ground problem at every corner gas station. Go get an appraisal at any gas station or truck stop property and see how much cleanup is required. Or even a junk yard with lead-acid batteries.


Society has made a few mistakes. Hopefully society is getting better at seeing the mistakes before they are made. It certainly is getting harder to throw anything away without being considered a criminal.


----------



## Drucifer

James Long said:


> Where does electricity come from? (2014)
> Coal = 39%
> Natural gas = 27%
> Nuclear = 19%
> Hydropower = 6%
> Other renewables = 7%
> (Biomass = 1.7%, Geothermal = 0.4%, Solar = 0.4%, Wind = 4.4%)
> Petroleum = 1%
> source: US Energy Information Administration
> 
> [SNIP]


It also takes energy to produce gasoline from oil.

That's never taken into consideration when vehicles are compared.


----------



## phrelin

The American love affair with the automobile is truly an irrationality. Over the years I've said and posted this, but I'll do it again.

Imagine you are an intelligent species from another solar system looking for other intelligent species. You arrive on Earth, specifically the United States, and notice that individuals of the most intelligent species on the planet climb alone or with their offspring into a vehicle weighing many tons and travel in it on hardened paths where such vehicles move in opposite directions, towards each other, at a mile a minute, separated only by a line painted on the path.

I'm sure electric vehicles will improve the environmental impact of the individual commuter over gasoline powered vehicles. Of course, both are significantly more harmful than walking, riding a bicycle, or taking mass transit. Unfortunately, like most Americans I like driving my car to places and always have. In the big long term picture it's a poor choice. But I have made poor choices regarding the environment regularly over my 70+ years. I have a feeling my grandchildren - and definitely my great-grandchildren should there be any - are going to feel some resentment about me and my generation of Americans. I have to recognize that the argument between the two types of vehicles is one that can be summarized as "I intend to harm the environment, but my method is less harmful than yours."

I didn't drive anywhere today, so I guess that's good. In the meantime I have to pay my really, really big electric and propane bills. What can I say?


----------



## dennisj00

Drucifer said:


> It also takes energy to produce gasoline from oil.
> 
> That's never taken into consideration when vehicles are compared.


I've mentioned this earlier in this thread and in others on this board. A quick Google shows a range of 3 to 7.5 kWh to produce a gallon of gas. Elon Musk used 5 as an average, and his Tesla will go about 20 miles on 5 kWh.

My Leaf currently averages 4.2 miles per kWh, so I can drive 21 miles on that amount of electricity. That's probably more than the average MPG of cars on the road!

The other crazy statistic is that less than 10% of the energy in that gallon of gas goes to propel the car! The rest is waste heat, which is contributing greatly to our global warming problem.
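The refining-energy comparison in this post can be sketched in a few lines. The 5 kWh/gallon figure and the 4.2 mi/kWh Leaf average are the numbers quoted above; treat them as rough averages, not measurements:

```python
# Rough comparison: how far an EV can drive on the electricity
# reportedly used to refine one gallon of gasoline.
# Both figures are the approximate ones cited in this thread.

KWH_TO_REFINE_ONE_GALLON = 5.0   # midpoint of the 3-7.5 kWh range cited
LEAF_MILES_PER_KWH = 4.2         # reported Leaf average

ev_miles = KWH_TO_REFINE_ONE_GALLON * LEAF_MILES_PER_KWH
print(f"EV miles on one gallon's refining electricity: {ev_miles:.0f}")
# 5.0 kWh x 4.2 mi/kWh = 21 miles, roughly the fleet-average MPG
```

In other words, under these assumptions the Leaf matches an average gasoline car's mileage before the gallon of gas is even burned.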


----------



## James Long

dennisj00 said:


> My Leaf currently averages 4.2 Miles per KwHr so I can drive 21 on that amount of electricity. That's probably more than the average MPG of cars on the road!


The average for all "light duty vehicles" on the road has been just over 21 MPG for about a decade.
The average for model year 2014 passenger cars was 36.4 MPG.
(Your Leaf would probably be better compared against a new car than every car on the road.)

BTW: In 1980 the average light duty vehicle on the road was 14 MPG with 24.3 MPG for 1980 model year cars. We (as a country) are showing improvement.

Do you burn more electricity on cold days to keep yourself warm in your Leaf? Redirecting engine heat helps keep my vehicle warm in some pretty severe temperatures.
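For comparing a miles-per-kWh figure against MPG numbers like those above: the EPA's MPGe rating uses an energy-equivalence constant of 33.7 kWh per gallon of gasoline. A quick sketch of the conversion, using the 4.2 mi/kWh Leaf average reported earlier in the thread:

```python
# Convert an EV's miles-per-kWh into MPGe, the EPA's
# miles-per-gallon-equivalent metric.

KWH_PER_GALLON_EQUIVALENT = 33.7  # EPA energy-equivalence constant

def mpge(miles_per_kwh: float) -> float:
    """MPGe = miles per kWh times kWh per gallon-equivalent."""
    return miles_per_kwh * KWH_PER_GALLON_EQUIVALENT

print(f"{mpge(4.2):.0f} MPGe")  # 4.2 mi/kWh -> ~142 MPGe
```

Note the gap versus the Leaf's official 114 MPGe rating: the EPA measures electricity at the wall, so charging losses are included, while an in-car mi/kWh readout measures from the battery.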


----------



## inkahauts

You know who needs electric cars more than anyone? China.... Just sayin' that what we do here has to translate to other countries or it's all wasted in terms of helping the planet, because we're just a blip by comparison to what China is doing to the planet right now.


----------



## Tom Robertson

inkahauts said:


> You know who needs electric cars more than anyone? China.... Just sayin' that what we do here has to translate to other countries or it's all wasted in terms of helping the planet, because we're just a blip by comparison to what China is doing to the planet right now.


Excellent point. Another one used to be (and perhaps still is?) Brazil.

Though one aspect is "will it play in Peoria?" If the US makes them right, we can sell them to China and help there too.

Peace,
Tom


----------



## dennisj00

James Long said:


> The average for all "light duty vehicles" on the road has been just over 21 MPG for about a decade.
> The average for model year 2014 passenger cars was 36.4 MPG.
> (Your Leaf would probably be better compared against a new car than every car on the road.)
> 
> BTW: In 1980 the average light duty vehicle on the road was 14 MPG with 24.3 MPG for 1980 model year cars. We (as a country) are showing improvement.
> 
> Do you burn more electricity on cold days to keep yourself warm in your Leaf? Redirecting engine heat helps keep my vehicle warm in some pretty severe temperatures.


The 2014 Leaf had an EPA rating of 114 MPGe. It has a heat pump / air conditioner that reduces range by about 5% if run for the entire trip. It also has seat heaters - front and back - and a steering wheel heater that use much less and do well alone in milder temps.

Anything below freezing also affects range so your neck of the woods isn't the target market. Years ago I was in Wisconsin in the winter and cars were plugged in to block heaters and started to run while we ate breakfast to warm up!
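That ~5% heat-pump penalty translates directly into range; a minimal sketch (the 80-mile charge is a made-up example, not a Leaf spec):

```python
# Range left when cabin heat runs the whole trip, per the ~5% figure above.

HEAT_PUMP_PENALTY = 0.05  # ~5% range loss if run for the entire trip (per the post)

def range_with_heat(base_range_miles: float, penalty: float = HEAT_PUMP_PENALTY) -> float:
    """Remaining range when the heat pump runs the entire trip."""
    return base_range_miles * (1.0 - penalty)

# e.g. a hypothetical 80-mile charge leaves 76 miles with the heat on
print(range_with_heat(80.0))  # 76.0
```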


----------



## James Long

That is better than the Tesla ... although the Tesla has a better range. Trade-offs.

In my case I like to go on long drives on the weekends so I'd need a hybrid or reasonably located charging stations. Charging stations are becoming more available.


The "long drives" and vacations are why I would consider an assisted driving vehicle. An "autonomous" car that takes me from point A to point B would be useless for most of my trips. It could get me to the next city I felt like going to and it could get me home when I'm done wandering, but too much of my driving is unplanned.

The interface for an unplanned trip would be interesting. Perhaps something like MapQuest with a "you are here" and "you want to go here" and the ability to select alternate routes or pull the route over to another road while enroute.
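That kind of interface could be sketched as a small data structure; every name here is hypothetical, just to make the "pick an alternate route while en route" idea concrete:

```python
from dataclasses import dataclass, field

# A toy model of the in-car interface described above: a current position,
# a destination, and alternate routes the rider can switch between mid-trip.

@dataclass
class Route:
    name: str
    roads: list  # ordered list of road names

@dataclass
class TripPlanner:
    here: str
    destination: str
    routes: list = field(default_factory=list)
    active: int = 0  # index of the route currently being driven

    def add_route(self, route: Route) -> None:
        self.routes.append(route)

    def switch_to(self, name: str) -> Route:
        """Pull the trip onto a different road while en route."""
        for i, r in enumerate(self.routes):
            if r.name == name:
                self.active = i
                return r
        raise ValueError(f"no such route: {name}")

trip = TripPlanner(here="Point A", destination="Point B")
trip.add_route(Route("interstate", ["I-80"]))
trip.add_route(Route("scenic", ["US-20", "SR-2"]))
trip.switch_to("scenic")
print(trip.routes[trip.active].name)  # scenic
```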


----------



## inkahauts

One giant gain would hopefully be no more drunk drivers.


----------



## phrelin

This morning I read an article with a somewhat misleading headline, *Feds Find Fiat Chrysler gear shifters can confuse drivers* - misleading because apparently there are other manufacturers with problem electronic controls. In the final paragraphs we get to the crux of the evolving problem:



> "The shift knob is a real problem," wrote another driver from Enumclaw, Washington, who reported two unintentional roll-away incidents in a 2015 Grand Cherokee. "I am not a complainer, however this is a major safety issue. It terrifies me to drive this vehicle."
> 
> Fiat Chrysler is not alone with the tricky shifters. Fisher says BMW and Mercedes-Benz have similar gearshifts. He said the government has a thin line to walk between stifling innovation and keeping people safe.
> 
> "I think the best thing for consumers isn't that legislation comes," he said. "The best thing is that automakers really do not start adding features that are really confusing to people and cause accidents."


I'm kind of inclined to think that when electronic operating controls replace mechanical controls, the new controls should be designed and located so that folks don't try to follow their old habits. Perhaps in this case electronically "changing gears" should be accomplished through some type of safely located button or switch. After all, gear shift levers were originally designed when you had to change gears mechanically - when changing gears literally meant moving a toothed gear in the transmission via that lever, "shifting" its location. Over the years those gear shift levers were moved by manufacturers from the center floor to the steering wheel column and then back to the center console.

In any event, if the new electronic systems are not going to be intuitively familiar, then they need to be distinctly unfamiliar, forcing us to relearn elements of the operation of the vehicle.


----------



## phrelin

The _Silicon Valley Mercury News_ reported today:



> The federal government's highway safety agency has told Google that its computers can be considered the driver of the self-driving cars the company is testing, in what appears to be a boost for the Silicon Valley company's longtime efforts after a recent setback in California.
> 
> The NHTSA's determination comes after the California DMV's proposed rules, unveiled in December, included a requirement that a licensed driver must be behind the wheel of self-driving cars.
> 
> Google and other players and advocates of autonomous vehicles have railed against those rules. A coalition of tech groups is pushing the state - which is still considering public comment on the issue - to reconsider the regulations, which are being criticized as anti-innovation.


In other not-so-new news, the Center for Responsive Politics reported in 2012 that, among the organizations that bundled together many individual contributions for candidates for office to avoid limits, the second-largest corporate bundler for the Obama campaign was Google, 1.4% behind Microsoft. Not that this in any way supports my cynical view expressed in post 362 above, nor was there any prescience on my part....


----------



## djlong

Well, I'll be doing my part on March 31st. Tesla has announced that they will unveil their Model 3 - the car 'for the rest of us' - on that date and you can start putting down $1,000 reservations. The car is "on schedule" to hit the road at the end of next year (2017). On 3/31, you can put your reservation down in person at a Tesla store. On 4/1 (yes, I know the significance of that date), you will be able to do it online.

What is the Model 3? It's the BMW 3-series-sized, $35,000 (before tax incentives) sedan that is supposed to bring Tesla to the mainstream. It's been in their business plan since the get-go - each car they produce helping to finance the development of the next one - from the Roadster to the Model S to the Model X and now the Model 3.

I've been waiting for this a long time. I'm hoping my '02 Camry with 267,000 miles on it lasts long enough to allow this to be my next car. I've already started saving up money. The base car is supposed to have "over 200 real world miles" on a single charge, plus free access to their Supercharger high-speed charging network for long-distance travel, for life. They said there will be no "Signature Series" cars like in their previous debuts, although the first cars out of the factory will be "heavily optioned".

My car will have many features that, I think, will be optional. 4WD (two electric motors), as big a battery as I can get, Autopilot hardware, tech package, sound package and maybe a few more things.

I'm looking forward to buying NO MORE GAS. No more inhaling those fumes every fill-up. No more oil changes. No more replacing catalytic converters at $1,000 a pop (done 3 times in this car). No timing belts, water pumps, head gaskets, radiators, starters, alternators, spark plugs, distributor caps or plug wires to replace (all of which I've had to have done at one time or another). No more exhaust systems to replace because I'm not passing the emissions test. My brakes will last AT LEAST 4 times longer than normal because the car will use regenerative braking to help recharge the battery, only occasionally having to depend on the brake rotors and pads to stop the car.

The only fluid to really check is the windshield washer fluid. The only real remaining parts to watch out for are in the suspension system.

Oh yes.. I'm looking forward to this.


----------



## dennisj00

I'm in, unless I've finagled an S by then!


----------



## Tom Robertson

djlong said:


> Well, I'll be doing my part on March 31st. Tesla has announced that they will unveil their Model 3 - the car 'for the rest of us' - on that date and you can start putting down $1,000 reservations. The car is "on schedule" to hit the road at the end of next year (2017). On 3/31, you can put your reservation down in person at a Tesla store. On 4/1 (yes, I know the significance of that date), you will be able to do it online.
> 
> What is the Model 3? It's the BMW 3-series-sized, $35,000 (before tax incentives) sedan that is supposed to bring Tesla to the mainstream. It's been in their business plan since the get go - each car they produce helping to finance the development of the next one - from the Roadster to the Model S to the Model X and now the Model 3.
> 
> I've been waiting for this a long time. I'm hoping My '02 Camry with 267,000 miles on it lasts long enough to allow this to be my next car. I've already started saving up money. The base car is supposed to have "over 200 real world miles" on a single charge. Free access to their Supercharger high-speed charging network for long-distance travel for life. They said there will be no "Signature Series" cars, like in their previous debuts, although the first cars out of the factory will be "heavily optioned".
> 
> My car will have many features that, I think, will be optional. 4WD (two electric motors), as big a battery as I can get, Autopilot hardware, tech package, sound package and maybe a few more things.
> 
> I'm looking forward to buy NO MORE GAS. No more inhaling those fumes every fillup. No more oil changes. No more replacing catalytic converters at $1000/pop (done 3 times in this car). No timing belts, water pumps, head gaskets, radiators, starters, alternators, sparkplugs, distributor caps or plug wires to replace (all of which I've had to have done at one time or another). No more exhaust systems to replace because I'm not passing the emissions test. My brakes will last AT LEAST 4 times longer than normal because the car will use regenerative braking to help recharge the battery, only occasionally having to depend on the brake rotors and pads to stop the car.
> 
> The only fluid to really check is the windshield washer fluid. The only real remaining parts to watch out for is the suspension system.
> 
> Oh yes.. I'm looking forward to this.




May you be even more pleased when you actually get it.

Peace,
Tom


----------



## dpeters11

Hopefully those incentives don't go away. I think the biggest issue is that most people have a short memory. Gas is very cheap right now.

I know for you and others that's not the main thing, but I can see more people wanting actual gas guzzlers.


----------



## dpeters11

dennisj00 said:


> I'm in, unless I've finagled an S by then!


----------



## dennisj00

dpeters11 said:


>


I can afford that one!! Wonder what the range is?


----------



## dennisj00

dpeters11 said:


> Hopefully those incentives don't go away. I think the biggest issue is, most people have a short memory. Gas is very cheap right now.
> 
> I know for you and others that's not the main thing but I can see more wanting actual gas guzzlers.


There's a very disturbing 'Explorer' episode on Nat Geo that I just got to yesterday . . . recorded in November. 'Bill Nye's Global Meltdown' - and I can't find any replays / streaming . . . but he interviews an author (I'll leave out the details) who thinks we'll be extinct by 2030 . . . and Bill says buy an EV.

One stat from the program. . . it takes 2 barrels of oil to process 3 barrels of the Canadian sand stuff. Does that make sense?
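Taken at face value, the quoted figure works out to an energy return on investment (EROI) of 1.5; a quick sanity check (the 2-in/3-out numbers are the program's claim, not mine):

```python
# "2 barrels of oil to process 3 barrels" as an energy-return calculation.

def eroi(energy_out: float, energy_in: float) -> float:
    """Energy return on investment: units produced per unit invested."""
    return energy_out / energy_in

def net_barrels(energy_out: float, energy_in: float) -> float:
    """Net energy gained after subtracting the processing cost."""
    return energy_out - energy_in

print(eroi(3, 2))         # 1.5
print(net_barrels(3, 2))  # 1
```

In other words, if the stat is right, two-thirds of the energy produced goes right back into producing it.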


----------



## Drucifer

I wonder when the first trailer truck train will take to the highway?

And I wonder how long the individual state governments will allow them to be? 20 trailers? 50?

I expect they will have at least one engineer-type human driver and a conductor-type driver.


----------



## inkahauts

Drucifer said:


> I wonder when the first trailer truck train will take to the highway?
> 
> And I wonder how long will the individual state governments allow them to be? 20 trailers? 50?
> 
> I expect they will have at least one engineer-type human driver and a conductor-type driver.


Never. That's not going to happen. Not in California anyway. And doubtful you'll ever see that elsewhere either.


----------



## Drucifer

inkahauts said:


> Never. That's not going to happen. Not in California anyway. And doubtful you'll ever see that elsewhere either.


Haven't you learned never to say never?

You do realize these will be smart and will open to allow a vehicle to enter the highway in the middle of the trailer train.


----------



## James Long

I thought you were referring to currently operating road trains, not a future plan.

https://en.wikipedia.org/wiki/Road_train


----------



## inkahauts

Drucifer said:


> Haven't you learned never to say never?
> 
> You do realize these will be smart and will open to allow a vehicle to enter the highway in the middle of the trailer train.


Nope. You don't seem to understand why they aren't allowed in California. First and foremost, they don't physically fit in our urban cities. You can't turn a three-trailer rig around. Heck, we hardly have any two-trailer rigs for most things because they don't fit well - and that's a choice by the carriers, not even law. Roads are not designed for them at all. Add in their inability to maneuver in the close quarters we have and it's never going to happen. That's why they won't show up here.

And if it can open and split in two in the middle of a freeway at 65 mph, it's not a train. That suggests all the cars are under power.

That leads me to....

I'd suggest thinking in a totally new direction. Think of a truck that has no cab at all. The entire length is the trailer and all the mechanics are underneath it. Nothing in front, nothing in back. That'd be far more efficient and easier to maneuver as well. Now that's something I can see coming, and it'd be great. They could also route them to travel at off-peak times of day to minimize traffic. Just think: load it up during the day, it travels to its destination at 3am, then gets unloaded during the day and sent back that night. The reason we don't do that as much now is the manpower cost of running the trucks at such spread-out hours. But if the trucks were all self-driving, you could have a larger fleet and have them driving while traffic is lightest and loaded and unloaded when it's heaviest, unlike today's trucks, which are often run in the opposite fashion.

And top that off with a solar roof and electric motors for all the short-haul around-town versions.


----------



## Drucifer

Interstate. No one would run a train in a city.


----------



## Tom Robertson

Drucifer said:


> Interstate. No one would run a train in a city.


Therein lies one of the problems. It eliminates everything east of Ohio, much of California, and the Rockies.

Nothing on the roads, even in Australia, will work as a road train that long in the US. Winds push longer trailer rigs too far off course and/or make them sway over. And to "split" to let other traffic through means multiple drive engines. And if humans are in front and back, a split means another human or two for the split rig. (Or a fully autonomous arrangement.)

Now, none of this means it can't be done. Active controls in the suspension and some ability to "steer" the individual trailers when they start to blow out of lanes could enable longer and safer trains. Who knows, maybe Walmart already has them in the works. 

Peace,
Tom


----------



## James Long

The road trains that run through my state (Indiana) on I-80/90 are supported by the road design. There are transfer stations at select exits where two long trailers or three trailer loads can exit through a toll gate to a parking lot where the loads are split into sizes legal on lesser highways. These transfer lots are before the vehicles reach normal roads.

Long combination vehicles (LCVs) have their place. It is not in congested areas. But there is a workable solution.

How about running these "trains" on dedicated right-of-ways? Run 100 or so trailers connected together at speeds where they cause minimal interruption to other traffic (60 or 70 MPH should do) and instead of breaking the train stop the other traffic until the 100 trailer trains pass. The "trains" could be operated by two people ... plus controllers in dispatch centers monitoring the road to make sure there were no conflicts between trains. Much of the system could be automated.

This idea should also work with different types of trailers ... not just boxes but oil, coal and grain could be transported in connected trailers on the dedicated right of way. And one could string together cars for a "train" a couple of miles long if the cars are loaded correctly and arranged correctly.

Just like with the LCVs, there would need to be transfer stations where the trailers are taken off of the "train" and driven to their final location by a short haul driver. For special commodities such as oil, coal and grain the dedicated right-of-way could be extended from source to destination.

I think it is a workable idea ... but some may not think it is innovative enough. I believe it would work a lot better than autonomous road trains.
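For a sense of scale, here's a rough sketch of how long such a 100-trailer train would block a crossing at the speeds mentioned; the ~65 ft per trailer (a 53 ft box plus coupling) is my assumption, not from the post:

```python
# How long would cross traffic be stopped while the 100-trailer "train" passes?

def blocking_time_seconds(trailers: int, feet_per_trailer: float, mph: float) -> float:
    """Seconds for a train of the given length to clear a fixed point."""
    length_miles = trailers * feet_per_trailer / 5280.0  # feet -> miles
    return length_miles / mph * 3600.0                   # hours -> seconds

t = blocking_time_seconds(100, 65.0, 65.0)
print(f"{t:.0f} seconds")  # ~68 seconds, about a minute per train
```

So stopping cross traffic for each passing train, as suggested above, would cost other drivers only about a minute per train.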


----------



## inkahauts

Drucifer said:


> Interstate. No one would run a train in a city.


In California that doesn't work. Almost all our interstates are in the city. We may be spread out, but we don't have much in the way of giant open roads that are busy. Heck, we sometimes have gridlock for almost 120 miles from Los Angeles to San Diego on the 5 fwy. The logistics of it all and how our roads are set up just don't cater to those road trains.

Heck, they wouldn't even be safe going up and down the Grapevine or Cajon Pass. So if they did hook up for the haul from Southern to Northern Cal, they'd only be able to run about 4 hours' worth. But those interstates wouldn't be safe for a road train either. They are way too small and busy for that.


----------



## inkahauts

James Long said:


> The road trains that run through my state (Indiana) on I-80/90 are supported by the road design. There are transfer stations at select exits where two long trailers or three trailer loads can exit through a toll gate to a parking lot where the loads are split into sizes legal on lesser highways. These transfer lots are before the vehicles reach normal roads.
> 
> Long combination vehicles (LCVs) have their place. It is not in congested areas. But there is a workable solution.
> 
> How about running these "trains" on dedicated right-of-ways? Run 100 or so trailers connected together at speeds where they cause minimal interruption to other traffic (60 or 70 MPH should do) and instead of breaking the train stop the other traffic until the 100 trailer trains pass. The "trains" could be operated by two people ... plus controllers in dispatch centers monitoring the road to make sure there were no conflicts between trains. Much of the system could be automated.
> 
> This idea should also work with different types of trailers ... not just boxes but oil, coal and grain could be transported in connected trailers on the dedicated right of way. And one could string together cars for a "train" a couple of miles long if the cars are loaded correctly and arranged correctly.
> 
> Just like with the LCVs, there would need to be transfer stations where the trailers are taken off of the "train" and driven to their final location by a short haul driver. For special commodities such as oil, coal and grain the dedicated right-of-way could be extended from source to destination.
> 
> I think it is a workable idea ... but some may not think it is innovative enough. I believe it would work a lot better than autonomous road trains.


I can see how that might work elsewhere but in California not so much...

Quite simply, the cost of building a dedicated roadway for the gain there would be makes it economically impossible. It'd literally cost billions to build an entirely new dedicated highway, because there is no right of way that exists today that could simply be used for it. And what right of way does exist would still require hundreds of millions in work - at least if it's going to link up with somewhere in a city near where there are large amounts of warehousing today. The cost of purchasing the right of way alone is mind-boggling.

Autonomous road trains really aren't the answer. But autonomous "trailers" could be in so many more ways. You don't need trains so much as you need efficiency. That could easily be gained with autonomous single trailers for a lot less money.


----------



## James Long

inkahauts said:


> I can see how that might work elsewhere but in California not so much...


Companies such as UP and BNSF may disagree. They are running "trains" such as I described on dedicated right of ways every day. In my part of the country the major companies are NS and CSX. We also have CN and CP based out of Canada running "trains". Not so much "pie in the sky" as proven technology.

Autonomous regional or local delivery vehicles may gain traction after the technology is proven on smaller vehicles with a lower risk if they fail. I believe that it is more likely that the "self driving" risk will be taken on interstates with a monitor driver present for when the trucks leave the highway (advanced cruise control).

But long distance "trains" are better off on the dedicated right of ways that exist today. Commonly known as railroads.


----------



## phrelin

The Washington Post offered this article *The big question about driverless cars no one seems able to answer* which explores the issue of liability associated with driverless cars. It's an interesting read.


----------



## yosoyellobo

phrelin said:


> The Washington Post offered this article *The big question about driverless cars no one seems able to answer* which explores the issue of liability associated with driverless cars. It's an interesting read.


I have always believed, from the beginning, that the manufacturers will assume legal liability for the operation of the driverless car. That is the only way they could convince me that these cars are safe.


----------



## James Long

yosoyellobo said:


> I have always believed, from the beginning, that the manufacturers will assume legal liability for the operation of the driverless car. That is the only way they could convince me that these cars are safe.


And if the manufacturers do not step up and take responsibility, in advance, I expect that will hurt their sales.


----------



## inkahauts

James Long said:


> Companies such as UP and BNSF may disagree. They are running "trains" such as I described on dedicated right of ways every day. In my part of the country the major companies are NS and CSX. We also have CN and CP based out of Canada running "trains". Not so much "pie in the sky" as proven technology.
> 
> Autonomous regional or local delivery vehicles may gain traction after the technology is proven on smaller vehicles with a lower risk if they fail. I believe that it is more likely that the "self driving" risk will be taken on interstates with a monitor driver present for when the trucks leave the highway (advanced cruise control).
> 
> But long distance "trains" are better off on the dedicated right of ways that exist today. Commonly known as railroads.


As usual you ignore why I said it might work in your state but won't here. You can't build a dedicated road here that gets you anywhere near a city center for any reasonable amount of money. We are too spread out. That's why we use so many trains - that's a lot cheaper. They have been doing massive upgrades to our freight rail system, since they already own the right of way for that. Heck, we can't even get a freeway built that's almost out in the middle of open country, because all the open areas left are considered protected lands most of the time now.


----------



## inkahauts

yosoyellobo said:


> I have always believed, from the beginning, that the manufacturers will assume legal liability for the operation of the driverless car. That is the only way they could convince me that these cars are safe.


I think mandatory no-fault insurance won't be just a California thing anymore (if it isn't already), and I think the manufacturers will figure out how to only be responsible to a small degree if their car is defective and does something it shouldn't have. Just like it works today.


----------



## James Long

inkahauts said:


> As usual you ignore why I said it might work in your state but won't here.


RAILROADS do not work in California? Really? They seem to exist there. 

My entire description of "dedicated right of ways" and the traffic that can run on those roads is a railroad. The rest of your post (beyond the attack) confirms that railroads work in California. Perhaps you should go back to my original description and laugh along with the concept instead of not paying attention to what was written.

We don't need road trains ... we have rail trains.


----------



## phrelin

The liability problem needs someone taking a big-picture view. One thing my gut tells me is that tech corporations like Google, Apple and Tesla are under tremendous pressure to succeed this morning, if not last night. They have created an environment that puts pressure on auto manufacturers. But I have this vision regarding the normal behavior of corporations, lawmakers, Californians and their cars.

It is 2042. I see 22-year-old high school graduate Jobs Gates Smith headed to his assistant manager job at McDonald's, two miles from Google's new 573-story headquarters, in his recently acquired 2027 self-driving Ford Goog lacking a steering wheel, brake pedal, etc. His 15-year-old Ford Goog hasn't seen the inside of a maintenance shop in 8 years, the last time the third owner had it serviced.

Unfortunately, two days ago on his way home from Slimy's Used Cars, some gravel tossed up by a truck, unbeknownst to Smith, severely scratched the covers of a couple of key left-side sensors and left them slightly misaligned. On his way to work, his car correctly senses a bicycle moving out of the bicycle lane in front of him, but because of the scratches and misalignment it doesn't correctly locate the center line, crosses it by a foot, and sideswipes a new $1.8 million 2042 Tesla Hover coming the other way, causing it to hit two parked cars and a pedestrian.

Having thought about this kind of possibility back in 2019, the tech and auto industries used their billions to lobby Congress to use its Interstate Commerce authority to pass a law regarding product liability for self-driving cars. Of course the probable financial interests of individual car owners were represented by two of the 435 members of the House of Representatives and, of the 100 Senators, only Bernie Sanders and Elizabeth Warren. The following year the two House members were voted out of office for being anti-business, while Sanders and Warren retired.

Who do you think Congress will make liable in 2019 for damage and injuries caused by a self-driving car: the car's fourth owner or the manufacturer?


----------



## Tom Robertson

phrelin said:


> The liability problem needs to have someone taking a big picture view. One thing my gut tells me is that tech corporations like Google, Apple and Tesla are under tremendous pressure to succeed this morning if not last night. They have created an environment that puts pressure on auto manufacturers. But I have this vision regarding the normal behavior of corporations, lawmakers, Californian's and their cars.
> 
> It is 2042. I see 22-year-old high school graduate Jobs Gates Smith headed to his assistant manager job at MacDonalds two miles from Google's new 573-story headquarters in his recently acquired 2027 self-driving Ford Goog lacking a steering wheel, brake pedal, etc. HIs 15-year-old Ford Goog hasn't seen the inside of a maintenance shop in 8 years, the last time the third owner had it serviced.
> 
> Unfortunately, two days ago on his way home from Slimy's Used Cars some gravel on the road was tossed up by a truck unbeknownst to Smith which severely scratched the cover of a couple of key only slightly misaligned left-side sensors. While on his way to work his car correctly sensed a bicycle moving out of the bicycle lane in front of him but doesn't correctly locate the center line because of the scratches and misalignment and thus crosses the line by a foot sideswiping a new $1.8 million 2042 Tesla Hover coming the other way causing it to hit two parked cars and a pedestrian.
> 
> Having thought about this kind of possibility back in 2019, the tech and auto industries using their billions lobbied Congress to use its Interstate Commerce authority to pass a law regarding product liability for self-driving cars. Of course the probable financial interests of individual car owners were represented by two of the 438 members of the House of Representatives and of the 100 Senators, Bernie Sanders and Elizabeth Warren. The following year the two House members were voted out of office for being anti-business while Sanders and Warren retired.
> 
> Who do you think Congress will make liable in 2019 for damage and injuries caused by a self-driving car: the car's fourth owner or the manufacturer?


Sounds like a product defect if the car can't self-calibrate or sense the conflicting information from the sensors. Standard product liability rules would apply. And companies can be sued even 60 years after a product has been manufactured under some circumstances. (It happened to my grandfather's company.)

And while the car in your scenario might not have received a software upgrade for a year or perhaps even two, it likely continues to receive data updates - so it could receive any software upgrades Ford Goog would make. At some point in the 15-year history of the car, and all the other cars Ford Goog has manufactured, how many scenarios would there be that haven't been seen in general before? Sure, they might not have ever seen a scratch in that exact way, yet scratches and dinged sensors will be old hat by then. The most recent upgrade, even two years ago, would have solved "the real problem" that covers this specific situation.

Peace,
Tom


----------



## inkahauts

James Long said:


> RAILROADS do not work in California? Really? They seem to exist there.
> 
> My entire description of "dedicated right of ways" and the traffic that can run on those roads is a railroad. The rest of your post (beyond the attack) confirms that railroads work in California. Perhaps you should go back to my original description and laugh along with the concept instead of not paying attention to what was written.
> 
> We don't need road trains ... we have rail trains.


The discussion has been about road trains. And people have said they have some in other parts of the country now, but smaller versions. It's not my fault you tried to change the subject but weren't clear about it. We are discussing road trains, not railroads.


----------



## James Long

inkahauts said:


> The discussion has been about road trains. And people have said they have some in other parts of the country now but smaller versions. Not my fault you tried to change the subject but wasn't clear about it. We are discussing road trains not railroads.


I am sorry you failed at reading the post. If you do not understand the statement _"But long distance "trains" are better off on the dedicated right of ways that exist today. Commonly known as railroads."_ I cannot help you.

We are actually discussing self-driving cars ... and we got to the "road train" (with trailers that separate on the move to allow traffic to pass through the "train"), which, while an interesting concept from a design perspective, is less useful due to the design of the roads regular trucks operate on. I offered a simple and workable solution for the "trains" as well as returning to the self-driving discussion for local/regional delivery. And I brought it all back to the aspect of liability for the vehicles. If you missed any of that, please re-read the posts above.


----------



## inkahauts

Exactly. We are all discussing road trains to replace regular trucks that are currently on our freeways, and you veered off topic and didn't make it clear till your last sentence. If anything, your quotation marks led me to believe you were talking about road trains even more than railroads, since you ended by saying you think railroads are better.

Don't blame me for your troubles in being misinterpreted. 

With that said railroads really have nothing to do with making our cities roads safer with self driving trucks at all. That's what the entire point was when the concept of road trains was brought up. 

I tend to think self driving trucks could have a massive benefit in big cities like Los Angeles for the reason I mentioned earlier.


----------



## phrelin

I'm a little concerned here. My daughter is a truck driver who for more than a decade has been driving tractor-trailer rigs interstate to, from, and through California to earn a living. Road trains reduce the work available. Let's not encourage them until she is old enough to retire. :eek2:


----------



## dennisj00

No job is safe in today's economy.


----------



## James Long

inkahauts said:


> Exactly. We are all discussing road trains ...


Actually you yourself veered off to discuss train trains in a post where you apparently failed to read what you quoted. So apparently you are ripping me a new one for a "crime" you committed. 

Anyways ... back to autonomous vehicles.

Phrelin, I would not worry about autonomous "road trains" affecting your daughter. Cars are barely gaining acceptance. I doubt trucks will gain acceptance before she retires.


----------



## 4HiMarks

James Long said:


> Phrelin, I would not worry about autonomous "road trains" affecting your daughter. Cars are barely gaining acceptance. I doubt trucks will gain acceptance before she retires.


I think self-driving trucks are going to be common long before the cars are. Companies have the financial resources and incentives that private individuals don't have. Robot trucks don't fall asleep at the wheel, don't get sick, drink or do drugs, don't need to be paid overtime (or at all), and don't need retirement or health insurance benefits. They won't go on strike for better working conditions. They drive over a set route day after day. Every foot of the route can be recorded and programmed in, so the driving software only needs to worry about when something is different from what it expects, rather than trying to anticipate every possible scenario. "Just-in-time" inventory would rise to the next level, saving on storage space.
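The "recorded route" idea above--drive the same route daily and only flag what differs from expectation--can be sketched in a few lines. This is purely illustrative; the route points, tolerance, and function names are assumptions for the sketch, not any real trucking system:

```python
import math

# Hypothetical sketch: a truck compares each live position fix against a
# pre-recorded route and raises a flag only when reality deviates from
# what the recording says to expect. Coordinates are in meters for simplicity.
RECORDED_ROUTE = [(0.0, 0.0), (0.0, 100.0), (50.0, 100.0), (50.0, 200.0)]

def distance_to_segment(p, a, b):
    """Shortest distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def off_route(position, route=RECORDED_ROUTE, tolerance_m=5.0):
    """True when the live position strays past the tolerance from every route segment."""
    nearest = min(distance_to_segment(position, route[i], route[i + 1])
                  for i in range(len(route) - 1))
    return nearest > tolerance_m

print(off_route((0.0, 50.0)))   # on the recorded route: False
print(off_route((30.0, 50.0)))  # 30 m off to the side: True
```

The point of the sketch is the economics 4HiMarks describes: most of the computation is a cheap comparison against a known baseline, and only the deviations need the expensive "anticipate every possible scenario" machinery.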


----------



## yosoyellobo

4HiMarks said:


> I think self-driving trucks are going to be common long before the cars are. Companies have the financial resources and incentives that private individuals don't have. Robot trucks don't fall asleep at the wheel, don't get sick, drink or do drugs, don't need to be paid overtime (or at all), and don't need retirement or health insurance benefits. They won't go on strike for better working conditions. They drive over a set route day after day. Every foot of the route can be recorded and programmed in, so the driving software only needs to worry about when something is different from what it expects, rather than trying to anticipate every possible scenario. "Just-in-time" inventory would rise to the next level, saving on storage space.


Have you ever seen The Terminator?


----------



## 4HiMarks

yosoyellobo said:


> Have you ever seen The Terminator?


Also Colossus: The Forbin Project; Demon Seed; Westworld; 2001: A Space Odyssey; I, Robot; Ex Machina, and at least a dozen more I can't recall at the moment. But also The Bicentennial Man.

But seriously, I predict that within 20 years, it will be illegal to drive a car manually, except on private property, race tracks, or in specially designated preserves, where you sign a waiver before entering.

First you will get a discount on your auto insurance for having some self-drive (=accident avoidance) features, like they do now for anti-theft devices and air bags.

Next you will need one to use the HOT "Express" lanes to enable better traffic management and on-time guarantees. This will prove to be so popular, self-drive lanes will be added or designated on all commuter routes.

Shortly thereafter, it will extend to all lanes of all "controlled-access highways" (e.g. Interstates), although a few states will stubbornly refuse to go along on state highways. But when highway fatalities plummet, they will be forced to implement it too. Then before you know it, all public streets and roads will require it.

My GF just bought a 2016 Honda. By the time it is paid off and I am ready to replace my car, I am certain it will have self-driving capability. There are toddlers running around today who will probably never get a driver's license or learn to drive. I just renewed my license and they are increasing the registration period to 8 years. So I won't have to renew again until 2024, by which time I will be 65, and may never need to drive again, if I don't want to.


----------



## Tom Robertson

4HiMarks said:


> Also Colossus: The Forbin Project; Demon Seed; Westworld; 2001: A Space Odyssey; I, Robot; Ex Machina, and at least a dozen more I can't recall at the moment. But also The Bicentennial Man.
> 
> But seriously, I predict that within 20 years, it will be illegal to drive a car manually, except on private property, race tracks, or in specially designated preserves, where you sign a waiver before entering.
> 
> First you will get a discount on your auto insurance for having some self-drive (=accident avoidance) features, like they do now for anti-theft devices and air bags.
> 
> Next you will need one to use the HOT "Express" lanes to enable better traffic management and on-time guarantees. This will prove to be so popular, self-drive lanes will be added or designated on all commuter routes.
> 
> Shortly thereafter, it will extend to all lanes of all "controlled-access highways" (e.g. Interstates), although a few states will stubbornly refuse to go along on state highways. But when highway fatalities plummet, they will be forced to implement it too. Then before you know it, all public streets and roads will require it.
> 
> My GF just bought a 2016 Honda. By the time it is paid off and I am ready to replace my car, I am certain it will have self-driving capability. There are toddlers running around today who will probably never get a driver's license or learn to drive. I just renewed my license and they are increasing the registration period to 8 years. So I won't have to renew again until 2024, by which time I will be 65, and may never need to drive again, if I don't want to.


One of the key Google leaders hopes his son never has to get a driver's license--he's eligible in about 3.5 years.

I agree with every aspect of your timeline except I think it will be a bit longer than 20 years before computer-driven cars are required everywhere in the US. While I do see that day coming (and love your description of how it will happen), I think too many 2016 Hondas and Camaros will still be around that can't be converted to fully autonomous. I have a 21-year-old Camaro that might not be driving in 20 more years--then again it might.

Peace,
Tom


----------



## James Long

Tom Robertson said:


> One of the key Google leaders hopes his son never has to get a driver's license--he's eligible in about 3.5 years.


He could do that now if he moved to a less car-dependent city. Many get a license for ID purposes or opt out of the license and just get an ID card. One can do that with a good network of public transit supplemented with car services (friends, taxis, Ubers).


----------



## Tom Robertson

James Long said:


> He could do that now if he moved to a less car dependent city. Many get a license for ID purposes or opt out of the license and just get an ID card. One can do that with a good network of public transit supplemented with car services (friends, taxis, ubers)


Indeed, but wouldn't that be intentionally missing the whole point? Does one really think everyone would give up their cars to public transportation?

So perhaps we'll go back to topic--which would be autonomous cars. 

Peace,
Tom


----------



## James Long

You are reading more than what was written. I am not suggesting "everyone" give up their cars - only that it is already possible to live without a driver's license in certain cities. One does not need a Google Auto Auto for that.

If anything I am against people giving up their cars and their ability to drive. But I live in an area with limited public transit and few decent jobs where a driver's license and reliable transportation are not part of the interview process.

It is also an area where Google faces challenges such as speed limits greater than 35, poorly or unmarked roads, gravel roads and snow. Challenges I doubt will be mastered in 3 1/2 years.


----------



## Tom Robertson

James Long said:


> You are reading more than what was written. I am not suggesting "everyone" give up their cars - only that it is already possible to live without a driver's license in certain cities. One does not need a Google Auto Auto for that.
> 
> If anything I am against people giving up their cars and their ability to drive. But I live in an area with limited public transit and few decent jobs where a driver's license and reliable transportation are not part of the interview process.
> 
> It is also an area where Google faces challenges such as speed limits greater than 35, poorly or unmarked roads, gravel roads and snow. Challenges I doubt will be mastered in 3 1/2 years.


I think we agree upon more than we disagree. And the disagreements are more a matter of degree than out and out disagreement.

I don't see anyone being forced to give up driving for a long time. I do see many people voluntarily giving up most of their driving. 
Where do you put something like people semi-voluntarily giving up driving to save on insurance? I think my thinking at that time would depend on how aggressive the pressure is to give up driving without making it fully mandatory. 
I also have concerns about snow and gravel in the next 3-ish years--yet Ford is working on it in Mcity. Perhaps they will crack that sooner than I thought?
Though the Ford answer is hyper-mapping an area. Will the cars be able to hyper-map for themselves during good driving conditions? 
I agree with your statements that cars without some steering mechanism won't sell to individuals initially. We'll need a generation of self-driving cars before people will want to give up all control.

Here is an interesting milestone I bet we both can appreciate--when a self driving car finds itself stuck in the snow and yet can rock itself out faster and better than you or I can. 

Speed limits greater than 35 are already accomplished, actually--in highway conditions. So now they need to move beyond California's golf cart rules into faster city driving. I don't see 35 as a magic number for the technology itself. A simple milestone of achievement, not a technological breakthrough of its own.

Peace,
Tom


----------



## Drucifer

Tom Robertson said:


> Indeed, but wouldn't that be intentionally missing the whole point? Does one really think everyone would give up their cars to public transportation?
> 
> *So perhaps we'll go back to topic--which would be autonomous cars*.
> 
> Peace,
> Tom


TS here.

I purposely wrote vehicle and not automobile because I strongly suspect the big freight companies are going to be all over automation because the driver is the weakest link in their business. BTW, they're the ones paying the taxes on the Interstates.


----------



## Tom Robertson

Drucifer said:


> TS here.
> 
> I purposely wrote vehicle and not automobile because I strongly suspect the big freight companies are going to be all over automation because the driver is the weakest link in their business. BTW, they're the ones paying the taxes on the Interstates.


LOL <at myself> You caught me. Mea Culpa,

I think you are absolutely correct about trucks. Volvo clearly has that in mind. One of these days I'll chat with some truckers I know--though I'm not sure I really want to hear their colorful feelings...

Peace,
Tom


----------



## James Long

BTW: The 35 MPH comes from the California limit on "golf cart" cars (25 MPH on a max 35 MPH road). And yes, other developers seem to take the opposite tack and introduce auto pilot features without permission. Who gave Tesla permission to introduce auto lane change and driverless parking? Their tack has been "who told us not to do that?"

I feel that we are living in a bubble. So far so good on developing autonomous vehicles ... but hold your breath, because the bubble will eventually pop. A failure of a vehicle will be the cause of an accident. And while the "always blame human error" mantra will follow such an incident, there is human involvement throughout the process of designing, building, and maintaining vehicles.

A small car has the potential to cause damage. A small truck can cause more damage. A large truck carries more risk. In a world run by actuaries putting a dollar sign on the risk is a limit. Who wants to take responsibility for larger vehicles? When an unmanned truck hits someone will the courts accept a driverless vehicle?

The industry has come a long way ... there is a long way to go.


----------



## Tom Robertson

James Long said:


> BTW: The 35 MPH comes from the California limit on "golf cart" cars (25 MPH on a max 35 MPH road). And yes, other developers seem to take the opposite tack and introduce auto pilot features without permission. Who gave Tesla permission to introduce auto lane change and driverless parking? Their tack has been "who told us not to do that?"
> 
> I feel that we are living in a bubble. So far so good on developing autonomous vehicles ... but hold your breath because the bubble will eventually pop. A failure of a vehicle will be the cause of an accident. And while the "always blame human error" mantra will follow such an incident there is human involvement throughout the process of designing, building and maintaining vehicles.
> 
> A small car has the potential to cause damage. A small truck can cause more damage. A large truck carries more risk. In a world run by actuaries putting a dollar sign on the risk is a limit. Who wants to take responsibility for larger vehicles? When an unmanned truck hits someone will the courts accept a driverless vehicle?
> 
> The industry has come a long way ... there is a long way to go.


What is the fear about an autonomous vehicle accident? Is it somehow more horrendous than a human driver accident? Is there greater liability? Is the concern really about to whom to assign the blame? Or is it some other fear?

Vehicles break down. Sometimes because of poor maintenance. Is a death from such better or worse than a driver driving when he/she shouldn't?

Tires sometimes get dust caught in the manufacture of the treads and explode. Is a death from those causes better?

My point is risk is measured and managed. Nothing will be perfect. But each incremental step that reduces the risk, at a reasonable cost, is a good thing. It lowers our insurance rates (or at least slows the rate of increases.) What is the real fear of adopting new technology? (My guess is loss of control--that is a common fear for people.)

Peace,
Tom


----------



## phrelin

Tom Robertson said:


> Nothing will be perfect. But each incremental step that reduces the risk, at a reasonable cost, is a good thing.


Who could argue with that, particularly with an emphasis on "incremental" so we get it right?


----------



## James Long

phrelin said:


> Who could argue with that particularly with an emphasis on "incremental' so we get it right.


Incremental is the key. Google has racked up a lot of miles but every day humans drive where Google refuses to go and humans drive in weather that Google has not been able to master. Their cars are learning ... but there are plenty of milestones left to pass.


----------



## Tom Robertson

James Long said:


> Incremental is the key. Google has racked up a lot of miles but every day humans drive where Google refuses to go and humans drive in weather that Google has not been able to master. Their cars are learning ... but there are plenty of milestones left to pass.


Seems to me the only refusals by Google are to hurry the process, to abandon safety, or to give up on the dream. They aren't _refusing_ to drive anywhere, they are taking things along a progression from milestone to milestone. Isn't that exactly how we'd want a company to proceed?

Peace,
Tom


----------



## inkahauts

James Long said:


> Actually you yourself veered off to discuss train trains in a post where you apparently failed to read what you quoted. So apparently you are ripping me a new one for a "crime" you committed.
> 
> Anyways ... back to autonomous vehicles.
> 
> Phrelin, I would not worry about autonomous "road trains" affecting your daughter. Cars are barely gaining acceptance. I doubt trucks will gain acceptance before she retires.


I didn't fail to read. You failed to be clear. You should stop. I won't stop defending myself from your thinly veiled attempt to discredit my opinions.

Phrelin, I wouldn't worry either. I'd guess anything like this would probably have companies manning control stations to track their trucks at home bases. Or even still send a "driver" along with them.

But I still think the real solution is not actual road trains but trucks with no cabs. That would cause other jobs to be created, which is what a lot of technology changes do. I liken it more to planes taking over for ships for travel across the oceans. Jobs moved to airplane pilots, air traffic controllers, etc., instead of as many for ships.


----------



## inkahauts

Drucifer said:


> TS here.
> 
> I purposely wrote vehicle and not automobile because I strongly suspect the big freight companies are going to be all over automation because the driver is the weakest link in their business. BTW, they're the ones paying the taxes on the Interstates.


In California, they aren't the only ones. We pay a ton of money in gas taxes to take care of our roads. Over 18 cents a gallon.


----------



## inkahauts

4HiMarks said:


> Also Colossus: The Forbin Project; Demon Seed; Westworld; 2001: A Space Odyssey; I, Robot; Ex Machina, and at least a dozen more I can't recall at the moment. But also The Bicentennial Man.
> 
> But seriously, I predict that within 20 years, it will be illegal to drive a car manually, except on private property, race tracks, or in specially designated preserves, where you sign a waiver before entering.
> 
> First you will get a discount on your auto insurance for having some self-drive (=accident avoidance) features, like they do now for anti-theft devices and air bags.
> 
> Next you will need one to use the HOT "Express" lanes to enable better traffic management and on-time guarantees. This will prove to be so popular, self-drive lanes will be added or designated on all commuter routes.
> 
> Shortly thereafter, it will extend to all lanes of all "controlled-access highways" (e.g. Interstates), although a few states will stubbornly refuse to go along on state highways. But when highway fatalities plummet, they will be forced to implement it too. Then before you know it, all public streets and roads will require it.
> 
> My GF just bought a 2016 Honda. By the time it is paid off and I am ready to replace my car, I am certain it will have self-driving capability. There are toddlers running around today who will probably never get a driver's license or learn to drive. I just renewed my license and they are increasing the registration period to 8 years. So I won't have to renew again until 2024, by which time I will be 65, and may never need to drive again, if I don't want to.


I think 20 years is way too fast. Maybe 40. Maybe. More likely 60, IMHO. The reason is not that the tech and the laws can't be there, but rather building that many new cars with the tech once it's mature enough to be standard, and everyone being able to buy them.

Heck, no business would want to replace all their trucks overnight. That isn't how the accounting works for them; they'd lose tons of money doing it.

But UPS and FedEx? Oh, that will come quick. They won't even have to turn the engine off and such--it'll save an hour every day of driving for deliveries.

But people who can't afford new cars won't be getting them until there are good used ones available. And that takes years for them to become widely available.

I am not sure we will ever really see driving a car made illegal. I kind of doubt it. What I'd prefer to see is a much harder-to-pass driver's test once self-driving cars are the norm.


----------



## James Long

The only opinion of yours I disagree with is that I didn't read your posts. I read them.

Otherwise, you seem to be agreeing with me on most of the points involving the viability of multi-trailer "road trains". We seem to come to the same conclusion that regular trains on their dedicated ROWs work well for that service.

I can imagine Walmart automating the runs from their distribution centers to their stores. Both ends are staffed by their employees. The driver doesn't do much more than pilot the vehicle. The driverless truck could work on dedicated runs where both ends are staffed to handle the load.

Where complete driverlessness would not help would be where the driver is responsible for unloading the truck. Whether this is the FedEx or UPS driver taking packages inside a business or to the porch of a home or a "less than load" (LTL) driver making sure each customer gets their delivery (all of it and not someone else's stuff). LTL drivers also work with the customers on where the trucks should be driven and parked. What is the customer supposed to do when a driverless truck pulls on to the property? Call a toll free number and tell a remote person that they need the truck moved over a couple of feet from where the computer parked? Driverless isn't as big of a benefit if a human has to go along to handle the paperwork and customer service. Driver assist would be helpful.


----------



## phrelin

According to *Google reprograms self-driving car after it hits public bus in California*:



> Google modified its vehicle software in the aftermath of the crash between its self-driving car and bus and admitted to partial responsibility.


The error was a relatively simple one:



> The vehicle and the test driver "believed the bus would slow or allow the Google (autonomous vehicle) to continue," it said.


Or to put it another way, both the driver and the car's software "assumed" something. Unfortunately, we all do that and sometimes it results in accidents.

From the *AP report*:



> Google wrote that its car was trying to get around some sandbags on a street in Mountain View, California, when its left front struck the right side of the bus.
> 
> The car's test driver - who under state law must be in the front seat to grab the wheel when needed - thought the bus would yield and did not have control when the collision happened, according to Google's report.


----------



## inkahauts

I'd like to see an animation of that. From what is said about where they hit, it doesn't make sense that it thought the bus would yield, and hello, you never start moving over before the lane next to you is clear anyway.


----------



## Tom Robertson

inkahauts said:


> I'd like to see an animation of that. From what is said about where they hit it doesn't make sense about thinking the bus would yield and hello, you never start moving over before the lane next to you is clear anyway.


In a couple days I think Google will put the whole report in their monthly report. I too am slightly confused by the descriptions I've read in the articles.

Peace,
Tom


----------



## phrelin

inkahauts said:


> I'd like to see an animation of that. From what is said about where they hit it doesn't make sense about thinking the bus would yield and hello, you never start moving over before the lane next to you is clear anyway.





Tom Robertson said:


> In a couple days I think Google will put the whole report in their monthly report. I too am slightly confused by the descriptions I've read in the articles.
> 
> Peace,
> Tom


I finally found an *explanation in the Washington Post* that makes sense:



> Google's car was attempting to make a right-hand turn on red, and moved to the right side of a wide lane on El Camino Real to pass traffic stopped at the light. But as Google's car neared the intersection of Castro Street, its path was blocked by sandbags around a storm drain, according to a report Google filed with the California DMV.
> 
> Google's car tried to go around the sandbags by cutting into the line of vehicles on the left side of the lane. Instead, it struck a metal piece connecting the two halves of an accordion-style bus, according to a Santa Clara Valley Transportation Authority spokeswoman. Google said its car was going less than 2 mph and the bus was moving at 15 mph. Both parties said there were no injuries and described the crash as minor.


There is a video that shows the damage to the car.


----------



## phrelin

As a followup to the accident, _The Atlantic_ offered this thoughtful piece *Can Google's Driverless Car Project Survive a Fatal Accident?* which discusses the history of cars (the first fatality was in 1899!) and leaves us with:



> Self-driving cars could dramatically reduce the number of deaths yet again. If, as many researchers believe, self-driving cars end up shrinking traffic fatalities by up to 90 percent this century, driverless cars could save as many lives as anti-smoking efforts have.
> 
> But none of the promise of this technology takes away from the fact that autonomous vehicles still face a thicket of difficult ethical and regulatory uncertainties. One of the biggest questions of all is social in nature: How will the public accept a car that is 100 percent autonomous but not 100 percent safe--even if it's far safer than a human-driven alternative?
> 
> It isn't Google's recent accident, but a more serious one that will reveal the answer.


The writer did, of course, focus on cars. In fact, we humans have embraced many labor/time-saving machines that occasionally were the cause of deaths. But because we are so "instant headline news" oriented today, and because of that seemingly more emotionally reactive about everything, it will be interesting to see what happens when a self-driving car is at least partially responsible for a fatal accident.

Right now I doubt it will be one of Google's test cars, even though they are putting in far more miles on public streets than any other. I suspect it will be a car or truck operated by a "beta tester-owner" from a more aggressive corporation such as Tesla or Volvo. But anything is possible.


----------



## inkahauts

The only way I see us eliminating fatal accidents in cars is to go back to the horse, or the horse-drawn buggy...

Sounds like the car slightly misjudged how much room there was, or it was just stupid, because it hit the middle of the bus!


----------



## Drucifer

Well, the accident would not have occurred if the bus was smart and could communicate with other smart vehicles.

And I suspect buses with daily routines will be the first to get upgraded to 'smart.'
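The "smart vehicles talking to each other" idea can be sketched as a tiny intent exchange: the merging car broadcasts what it wants to do and waits for the bus to confirm before moving. Everything here--the message fields, the yield rule, the function names--is an illustrative assumption for the sketch, not any real V2V protocol (real deployments use standards like SAE J2735):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class IntentMessage:
    """Hypothetical broadcast: 'I intend to do X in lane Y at speed Z.'"""
    vehicle_id: str
    action: str        # e.g. "merge_left"
    lane: int
    speed_mph: float

def bus_response(msg_json: str, bus_lane: int, bus_speed_mph: float) -> str:
    """Toy yield rule: the bus yields only to a slower vehicle entering its lane."""
    msg = json.loads(msg_json)
    if msg["action"] == "merge_left" and msg["lane"] == bus_lane:
        return "yield" if msg["speed_mph"] < bus_speed_mph else "deny"
    return "ignore"

# The Mountain View scenario: the car (under 2 mph) asks to merge into the
# lane of a bus doing 15 mph. With an exchange like this, the bus could have
# answered before the car committed to the move.
request = json.dumps(asdict(IntentMessage("google_av_1", "merge_left", 2, 2.0)))
print(bus_response(request, bus_lane=2, bus_speed_mph=15.0))  # yield
```

The sketch also shows why Drucifer's point cuts both ways: the negotiation only helps if both vehicles speak it, which is exactly why fixed-route fleets like buses are plausible early adopters.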


----------



## dennisj00

Buses in Charlotte don't even have fare counters that work!

http://www.charlotteobserver.com/news/local/article44912448.html


----------



## phrelin

When the above cover of the latest e-issue of TIME magazine first appeared on my Kindle Fire, I felt a certain level of indignation. While I always understood that a driver's license was not a right but a privilege, I have prized that privilege for 55 years. But then I read the article *Why You Shouldn't Be Allowed to Drive* and thought, oh, yeah, I get it. It's a long article but it's worth reading. (If you can't read it at the site, here's a pdf.)

I can't quite figure out why our 20-year-old granddaughter, who got her _maternal_ grandmother's car after her grandmother died* a couple of years ago, just won't get a driver's license and lets the car sit there. But in reading the article, I have to admit that the I-love-my-car culture is so 20th Century. And being someone who was born in the first half of the 20th Century, I have to admit I'm so 20th Century in most ways.

The ultimate irony I guess is that if I live long enough, I may have to own a self-driving car to remain independent for a few extra years. So get on with it, Google.

*My wife is her _paternal_ grandmother and still very much alive.


----------



## Drucifer

phrelin said:


> When the above cover of the latest e-issue of TIME magazine first appeared on my Kindle Fire, I felt a certain level of indignant. While I always understood that a driver's license was not a right but a privilege, [snip]


A privilege that is too easy to get judging by all the apparent nitwits now on the road with licenses. Hopefully, in the future, it will be a lot harder to get and keep.


----------



## James Long

> Google's car was attempting to make a right-hand turn on red, and moved to the right side of a wide lane on El Camino Real to pass traffic stopped at the light. But as Google's car neared the intersection of Castro Street, its path was blocked by sandbags around a storm drain, according to a report Google filed with the California DMV.
> 
> Google's car tried to go around the sandbags by cutting into the line of vehicles on the left side of the lane. Instead, it struck a metal piece connecting the two halves of an accordion-style bus, according to a Santa Clara Valley Transportation Authority spokeswoman. Google said its car was going less than 2 mph and the bus was moving at 15 mph. Both parties said there were no injuries and described the crash as minor.


It sounds like the car considered the "wide lane" as two lanes when wanting to pass on the right but lost lane control when it came to an obstruction.

Will Google be able to release the car's view of the accident? Showing what the car "saw" that the program considered to be a safe time to merge left?


----------



## Tom Robertson

Google's Chris Urmson and automobile executives will speak before Congress Tuesday, March 18: http://news.yahoo.com/congress-hear-head-google-self-driving-car-project-173155574--finance.html

Wonder if CSPAN will carry it.

Peace,
Tom


----------



## Tom Robertson

In the category of "well, duh" is the report from NHTSA: http://news.yahoo.com/u-says-legal-hurdles-remain-deployment-self-driving-173043039--finance.html

In the category of "if you can't beat them, buy them" is Chevrolet's purchase: https://www.yahoo.com/tech/gm-cruises-further-autonomous-vehicle-development-latest-acquisition-165613415.html

Peace,
Tom


----------



## phrelin

From _The Verge_ *Google's bus crash is changing the conversation around self-driving cars*:



> Even with high-flying names like "Autonomous Vehicles Will Remake Cities" and "Autonomous Cars Will Make Us Better Humans," the tone at SXSW's many forward-looking talks has been more subdued. Self-driving cars may be on the road today - in pilot programs in various sunny, fine-weathered locales. But the most optimistic of technologists are starting to acknowledge that the problem very well may take decades to crack.
> 
> "If you read the papers, you're going to see that it's maybe three years, maybe 30 years [before self-driving cars arrive]," Chris Urmson, the director of Google's self-driving car project, said in a high-profile talk on Friday at the Austin Convention Center. "I think it's a bit of both. This technology is almost certainly going to come out incrementally." It's a step below the rhetoric he's used in the past; in September of last year, for instance, Urmson said he hoped his 11-year-old son doesn't have to get a driver's license.
> 
> The process has already started, with Tesla's controversial Autopilot and other Advanced Driver Assistance Systems (ADAS) designed for highway driving making their way to Honda, GM, and Mercedes vehicles over the next couple years. But Urmson, a strong advocate for the transformative power of autonomous vehicles, says there's a long road ahead before the majority of cars can completely operate themselves.


As one who thinks "autonomous vehicles" is a term quite a bit over-the-top, I do like the use of the very descriptive label _Advanced Driver Assistance Systems (ADAS)_. It could lead to "acceptable uses" of "self-driving vehicle systems" for Urmson's 11-year-old son, albeit with a driver's license plus a driver's seat with physical steering, braking, and accelerator controls - which some day might be reduced to a voice control system akin to Alexa, Cortana, and Siri. Maybe people would actually get to name their car's control system. Of course, with various companies doing their own thing, the driver's control of the ADAS will have to become more uniform than we're likely to see in 2020.


----------



## Drucifer

Daimler sends autonomous Truck Platoon on Stuttgart to Rotterdam road trip
http://www.gizmag.com/daimler-connected-autonomous-trucks-challenge/42631/?utm_source=Gizmag+Subscribers&utm_campaign=1ddc27f753-UA-2235360-4&utm_medium=email&utm_term=0_65b67362bd-1ddc27f753-91460513


----------



## James Long

From the baby steps department ...

*Automatic Braking Systems To Become Standard On Most U.S. Vehicles*

Some 20 carmakers have committed to making automatic emergency braking systems a standard feature on virtually all new cars sold in the U.S. by 2022, according to a new plan from the National Highway Traffic Safety Administration and the Insurance Institute for Highway Safety.

NHTSA released a list of the car companies that have committed to the system:
"Audi, BMW, FCA US LLC, Ford, General Motors, Honda, Hyundai, Jaguar Land Rover, Kia, Maserati, Mazda, Mercedes-Benz, Mitsubishi Motors, Nissan, Porsche, Subaru, Tesla Motors Inc., Toyota, Volkswagen and Volvo."

http://www.npr.org/sections/thetwo-way/2016/03/17/470809148/automatic-braking-systems-to-become-standard-on-most-u-s-vehicles

One more step in assisted driving ... toward a day where (eventually) additional assisted driving features will become standard.


----------



## phrelin

James Long said:


> From the baby steps department ...
> 
> *Automatic Braking Systems To Become Standard On Most U.S. Vehicles*
> 
> Some 20 carmakers have committed to making automatic emergency braking systems a standard feature on virtually all new cars sold in the U.S. by 2022, according to a new plan from the National Highway Traffic Safety Administration and the Insurance Institute for Highway Safety.
> 
> NHTSA released a list of the car companies that have committed to the system:
> "Audi, BMW, FCA US LLC, Ford, General Motors, Honda, Hyundai, Jaguar Land Rover, Kia, Maserati, Mazda, Mercedes-Benz, Mitsubishi Motors, Nissan, Porsche, Subaru, Tesla Motors Inc., Toyota, Volkswagen and Volvo."
> 
> http://www.npr.org/sections/thetwo-way/2016/03/17/470809148/automatic-braking-systems-to-become-standard-on-most-u-s-vehicles
> 
> One more step in assisted driving ... toward a day where (eventually) additional assisted driving features will become standard.


IMHO this is the feature that will determine the effectiveness of assisted driving. If it doesn't encourage "driving distracted" among the Millennials then the rest of the steps will be almost undeniable for their generation.


----------



## Drucifer

Drucifer said:


> Daimler sends autonomous Truck Platoon on Stuttgart to Rotterdam road trip
> http://www.gizmag.com/daimler-connected-autonomous-trucks-challenge/42631/?utm_source=Gizmag+Subscribers&utm_campaign=1ddc27f753-UA-2235360-4&utm_medium=email&utm_term=0_65b67362bd-1ddc27f753-91460513


I was calling it a truck train, but the Germans have now termed a parade of trucks a *Truck Platoon*.


----------



## dennisj00

http://money.cnn.com/2016/04/04/technology/george-hotz-comma-ai-andreessen-horowitz/index.html

This kid has done some pretty impressive software hacks - and I really wouldn't call him a hack.

An earlier article about his Honda conversion described his software as watching him drive for 30 minutes and then taking over.

It'll be an interesting 'kit' and I've added a lot of things to my cars over the years, but we'll closely watch this one!


----------



## James Long

Drucifer said:


> I was calling it a truck train, but the Germans have now termed a parade of trucks a *Truck Platoon*


They are different concepts. The platoon are individually powered and controlled vehicles driving closely together (extremely close in the linked article as they use a data connection to warn the truck behind of any sudden braking of the truck ahead). The truck train is usually connected trailers - or a "long combination vehicle".

Perhaps the "platooning" term should have been used earlier in the thread when the concept of trailers leaving the train for their own destinations was being discussed.


----------



## phrelin

dennisj00 said:


> http://money.cnn.com/2016/04/04/technology/george-hotz-comma-ai-andreessen-horowitz/index.html
> 
> This kid has done some pretty impressive software hacks - and I really wouldn't call him a hack.
> 
> An earlier article about his Honda conversion described his software as monitoring him drive for 30 minutes and then it took over.
> 
> It'll be an interesting 'kit' and I've added a lot of things to my cars over the years, but we'll closely watch this one!


Wish I could invest in this guy.


----------



## dennisj00

phrelin said:


> Wish I could invest in this guy.


I'm really surprised Musk, Google, or Apple (if they're really developing a car) hadn't picked him up. $3 million isn't much of a price these days unless there were no strings attached. I mean Google paid a billion or so for Nest!

Here's the original article I mentioned above. . . http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/


----------



## phrelin

This story *America Has the Fewest 16-Year-Old Drivers Since the 1960s* makes me think that the demand for driverless cars may be significant in 2030 when these folks are ... 30. Maybe their demand will be in the form of ordering a driverless taxi, particularly in areas where one can't make a living as a taxi driver.


----------



## inkahauts

I can't open the link for some reason, but... it's a lot harder, and there's not as much point, to getting a license at 16 anymore unless you will have your own car to take to school and back, vs waiting till you are 18 or older... So many more laws. 16-year-old drivers are extremely limited as to when and where they can drive now in CA and other places, I've heard...


----------



## Drucifer

https://twitter.com/i/web/status/727276058215604225


----------



## dennisj00

Probably not any more than now - and will be less dangerous!


----------



## phrelin

And we have this pessimistic headline from the _Silicon Valley Mercury News_ *Self-driving cars may not be proven safe for decades: report*:



> Google's self-driving cars may have driven themselves 1.5 million miles since 2009, but it could take hundreds of years for robot-car makers to prove safety at the existing testing rate, a new study said.
> 
> "Given that current traffic fatalities and injuries are rare events compared with vehicle miles traveled, we show that fully autonomous vehicles would have to be driven hundreds of millions of miles and sometimes hundreds of billions of miles to demonstrate their safety in terms of fatalities and injuries," said the Driving to Safety report from the Rand Corporation. "Under even aggressive testing assumptions, existing fleets would take tens and sometimes hundreds of years to drive these miles - an impossible proposition if the aim is to demonstrate performance prior to releasing them for consumer use.
> 
> For the report's authors, the problem with determining robot-car safety lies in the comparatively low rate of road injuries and deaths versus miles driven. Although the 32,000 annual traffic deaths is a large number, it's small compared to the three trillion miles Americans drive every year. So to find the rate at which self-driving cars crash and cause injuries or death, which would determine whether the vehicles are safe - and safer than humans - virtually astronomical numbers of testing miles would need to be driven, according to the report.


While I understand the simple logic of this, I would also have to point out that over the millennia people rode horses many more miles and nobody waited that long to find out just how much safer that newfangled automobile might be.
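Out of curiosity, the report's arithmetic is easy to sanity-check. Here's a back-of-the-envelope sketch in Python using the figures from the quote and the statistical "rule of three" (my choice of method for illustration, not necessarily RAND's exact approach):

```python
# Back-of-envelope check of the RAND-style arithmetic quoted above.
# Illustrative only; the report's actual statistical method may differ.
deaths_per_year = 32_000          # U.S. traffic fatalities (from the quote)
miles_per_year = 3e12             # U.S. vehicle miles traveled (from the quote)

human_rate = deaths_per_year / miles_per_year   # fatalities per mile driven

# "Rule of three": if a fleet drives n miles with zero fatalities, the 95%
# upper confidence bound on its fatality rate is roughly 3/n. To claim the
# robot rate is no worse than the human rate, you need about:
miles_needed = 3 / human_rate

print(f"Human fatality rate: 1 per {1/human_rate:,.0f} miles")
print(f"Miles needed (95% conf., zero fatalities): {miles_needed:,.0f}")
```

That works out to roughly 280 million fatality-free miles just to match the human rate at 95% confidence - consistent with the report's "hundreds of millions of miles" claim, and far beyond Google's 1.5 million.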


----------



## inkahauts

Yeah, there are a lot of problems with that theory. Way too many problems with that article for it to be taken seriously.


----------



## Drucifer

Vehicles can still be made a lot smarter while not being self-driven. Like making sure the driver is sober.

Self-driving should just be one feature of a smarter vehicle.


----------



## James Long

Public acceptance always takes time. People didn't transition from horses to horseless carriages overnight. They transitioned through public transportation (trolleys and interurbans) to personal vehicles.

The eventual transition to autonomous pods does not have to follow the same path or speed. But the transition does not need to be done at breakneck speed. Especially when the risk is breaking human necks.


----------



## phrelin

The real story this week is *Google And Fiat Have a Plan to Make Self-Driving Cars Totally Uncool*:



> Google is collaborating with Fiat Chrysler Automobiles to put a fleet of self-driving minivans on the road, marking the first time the tech company has worked directly with an automaker to create its autonomous vehicles.
> 
> Google, which wants to commercialize self-driving cars by 2020, said in a blog post Tuesday that it plans to double its fleet of self-driving cars for testing by adding about 100 Chrysler Pacifica Hybrid minivans. The tech company and FCA will work closely in the coming months to design the minivans to easily accommodate Google's self-driving systems, including sensors and computers.
> 
> Google says the experience will "help both teams better understand how to create a fully self-driving car that can take people from A to B with the touch of a button." Google added that the collaboration will also give it the opportunity to test a larger vehicle that could be easier for passengers to enter and exit, hinting at one of its long-stated goals to serve disabled people who are unable to drive.


As I've said before, getting older means it would be good for me to have something like this assuming I live to 2020.


----------



## 4HiMarks

To really serve the disabled, someone should approach it from the other direction - beef up motorized wheelchairs to make them self-driving, safer, and capable of highway speed.


----------



## Drucifer

4HiMarks said:


> To really serve the disabled, someone should approach it from the other direction - beef up motorized wheelchairs to make them self-driving, safer, and capable of highway speed.


But I would say extremely dangerous for the supermarket shopping aisle.

Need cleanup on aisle six.


----------



## James Long

Drucifer said:


> But I would say extremely dangerous for the supermarket shopping aisle.
> 
> Need cleanup on aisle six.


More like "cleanup on aisle six, seven and eight" after a highway speed incident in a store.


----------



## Rich

yosoyellobo said:


> I have never been able to read in a moving vehicle so I hope I could watch movies otherwise it be sleeping and listening to music.


I have that problem too, but just in cars. I have no trouble reading in trains, planes, ships (never tried in a boat) and, oddly, buses. In a car, I really get to feeling bad quickly.

Rich


----------



## Rich

Drucifer said:


> I read somewhere, that there is a tractor-trailer now being tested on real roads.


I think they're doing that in Europe.

Rich


----------



## James Long

Rich said:


> I read somewhere, that there is a tractor-trailer now being tested on real roads.
> 
> 
> 
> I think they're doing that in Europe.

As linked earlier in this thread ...

The European tests linked have a human driver in the cab while automation is driving.


----------



## inkahauts

Rich said:


> I have that problem too, but just in cars. I have no trouble reading in trains, planes, ships (never tried in a boat) and, oddly, buses. In a car, I really get to feeling bad quickly.
> 
> Rich


I'm the same way Rich.


----------



## Rich

inkahauts said:


> I'm the same way Rich.


No problems in anything but a car? Why do you suppose that is?

Rich


----------



## inkahauts

Rich said:


> No problems in anything but a car? Why do you suppose that is?
> 
> Rich


For me, I think it has to do with the space you are in: your peripheral vision catches the movement outside the car window, and that floating motion is what does it. I think in larger spaces your vision catches enough of the inside to stem the issue. My folks have always had a Denali XL, and when I sit in the far back middle seat I can read for a few minutes without an issue. The perspective is totally different back there than in the other seats. That's why I think that's the issue.


----------



## Tom Robertson

inkahauts said:


> For me, I think it has to do with the space you are in: your peripheral vision catches the movement outside the car window, and that floating motion is what does it. I think in larger spaces your vision catches enough of the inside to stem the issue. My folks have always had a Denali XL, and when I sit in the far back middle seat I can read for a few minutes without an issue. The perspective is totally different back there than in the other seats. That's why I think that's the issue.


That is my general understanding. If the background and the body beat to a different drum, reading can exacerbate the problem. So I think you nailed the key difference--field of vision.

Generally I can read in the car, except when my inner ear is a bit off from a cold or allergies. I seem to be ok on a laptop even when I start to have problems with paper.

Peace,
Tom


----------



## Rich

inkahauts said:


> For me, I think it has to do with the space you are in: your peripheral vision catches the movement outside the car window, and that floating motion is what does it. I think in larger spaces your vision catches enough of the inside to stem the issue. My folks have always had a Denali XL, and when I sit in the far back middle seat I can read for a few minutes without an issue. The perspective is totally different back there than in the other seats. That's why I think that's the issue.


I kinda get that...but why doesn't that hold true on a bus? I realize the bus is a lot larger than a car, but you're pretty cramped in a bus and can't see much over the seats, yet I've never had a problem reading on a bus. Used to take a bus from Norfolk to NYC on weekends when I was in the Navy and never had a problem with nausea.

Rich


----------



## Rich

Tom Robertson said:


> That is my general understanding. If the background and the body beat to a different drum, reading can exacerbate the problem. So I think you nailed the key difference--field of vision.
> 
> Generally I can read in the car, except when my inner ear is a bit off from a cold or allergies. I seem to be ok on a laptop even when I start to have problems with paper.
> 
> Peace,
> Tom


I'd be surprised if I could use a laptop in a car. Never tried it tho. Planes and trains I've used laptops and tablets on without any problems, but I can also read on them. This is starting to make me nauseous... :rolling:

Rich


----------



## Drucifer

Self-Driving Trucks May Hit the Road Before Google's Cars

https://www.technologyreview.com/s/601476/self-driving-trucks-may-hit-the-road-before-googles-cars/?utm_campaign=newsletters&utm_source=newsletter-daily-all&utm_medium=email&utm_content=20160518#/set/id/601478/


----------



## phrelin

*Google patent: Glue would stick pedestrian to self-driving car after collision*.

No this is not a joke. Seems weird, somehow.


----------



## Drucifer

UK Company Launches Insurance Policy for Autonomous Cars


----------



## yosoyellobo

Bad news for those of us who would like to see this technology someday.

https://www.washingtonpost.com/news/the-switch/wp/2016/06/30/tesla-owner-killed-in-fatal-crash-while-car-was-on-autopilot/?hpid=hp_hp-top-table-main_driverless-535pm%3Ahomepage%2Fstory


----------



## James Long

The death is unfortunate, especially when the driver killed was a supporter and promoter of the technology. It appears that he relied on it often and allowed himself to become distracted. (Not that non-autonomous car drivers do not get distracted and take their hands off the wheel or look away.)

It is a setback ... but it is also a way forward. The industry needs to make sure that this does not happen again. A white truck against a bright sky is hard to discern - even when one is paying attention to what is ahead on the road. Ok Tesla, how do you do better?

With this driver's propensity to use a dash camera and post videos of his Tesla driving I wonder if there was video of this incident? No video was mentioned but it would provide excellent evidence of just how hard it would have been to see the truck before hitting it and tell the rest of the story.


----------



## James Long

BTW: After reading far too many negative comments about the deceased on the Internet I feel it is necessary to mention who the driver was and honor his life.


> Joshua D. Brown, 40, of Canton, Ohio, formerly of Westmoreland County, died in a tragic motor vehicle accident Saturday, May 7, 2016. He was born Jan. 20, 1976, in Great Lakes, Ill. Joshua graduated from Franklin Regional High School in 1994, attended the University of New Mexico and enlisted in the Navy in 1997. Joshua became a master EOD technician and due to his determination and dedication, he achieved his aspirations to be part of the Navy SEAL teams. He dedicated 11 years to the Navy and was an honored member of the elite Naval Special Warfare Development Group (NSWDG). After his discharge, he worked for Tactical Electronics and then created his own successful technology company, Nexu Innovations Inc. Joshua is survived by his parents, Warren and Sueanne Brown, of Stow, Ohio; his sister, Amanda Lee (Jeremy Lee); six nieces and nephews; and numerous aunts, uncles and cousins. His second family in the Navy now have the watch.
> 
> See more at: http://www.legacy.com/obituaries/triblive-murrysville-star/obituary.aspx?pid=179986286#sthash.XnmnGUUN.dpuf


----------



## Nick

Meanwhile, hundreds of people were also killed that same day in traffic crashes where the drivers were in control of their vehicles.

I am curious as to what the ultimate liability assessments will be.


----------



## dpeters11

It does seem he did get too complacent, though I can certainly understand how that could happen. I didn't watch the video of his close call, as at that point I didn't want to watch a video of a man who would later be killed in that car.

Inevitably, something was going to happen. It would be foolish to think that there would never be an incident. There will be a point where it is as close to 100% safe as possible, but only when a very high percentage of cars are autonomous and communicate with each other.


----------



## dennisj00

I went with a friend to pick up his Model X last Friday and drove it 30 or so miles and about 10 under the AP. It does remind you / actually require you to put your hands on the wheel every few minutes.

A day earlier, Tesla reported that in 1.3 Million miles under auto-pilot, air bag deployments were down 50%.

While this was a tragic loss, he obviously became too complacent.

There are way too many YouTube videos of people reading the paper, playing games, etc. while using AP. As noted above, it was inevitable that something would happen, and this won't be the last incident.

The visual display of objects around you and associated warnings / feedback work constantly - not just in AP - and are certainly helpful to any driver.


----------



## yosoyellobo

I am pretty sure that under autopilot I would have my hands on the wheel all the time, which would defeat the purpose of using it. As someone who doesn't even like to use cruise control, I would most likely skip it for now. I am all in on a true autonomous car.


----------



## phrelin

Based on the explanation from Tesla...



> "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla said.


...my guess is the cause was driver inattention combined with a system design flaw.

The issue is "had the driver been paying attention could he have avoided the accident?" Probably.

Unfortunately, smartphones or CD players or car radios in the 1950s or noisy kids in the back seat also distracted drivers in fatal accidents. Did having an autopilot system encourage him to not pay attention? Based on information he provided, it appears the answer is yes. But an autopilot system can and will avoid many fatal accidents when those other factors distract the driver. The difficulty over the next ten years is to avoid the temptation to "quit driving" even for a few minutes.


----------



## James Long

dpeters11 said:


> It does seem he did get too complacent, though I can certainly understand how that could happen. I didn't watch the video of his close call, as at that point I didn't want to watch a video of a man that would later be killed in that car.


I watched the "close call" video yesterday. The offending vehicle slowly moved across two lanes of traffic and "Tessy" moved halfway to the shoulder to avoid getting hit. The camera view was narrow and did not show the driver nor much of the scene. Through that tunnel vision, the first indication of a problem was the car leaving the lane to avoid the accident (other than the truck being in view earlier in the video). It would have been nice to have a wider view. I have had similar incidents to the "close call" but, being alert, I noticed the potential for a problem before it became a swerve-to-avoid situation.

The "must touch the steering wheel" detection reminds me of the alerters on trains. Train engines have buttons that must be pressed every few minutes to let the train know the engineer is alert. Operating other controls can also clear the alerter. Basically the system is designed to determine if an alert engineer is operating the train. The detection does not work very well ... and the "tap a button" response becomes a reflex that does not prove the engineer was alert. Engineers have died and killed others because they lost awareness beyond tapping a button in response to a noise.

I do not know how robust Tesla's detection is ... but it seems easy to be able to rest a hand on a wheel and be detected even if one is not paying attention to where the car is going.
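For anyone curious, an alerter is essentially a dead-man timer. A toy sketch of the logic (the timeout value is made up; real systems vary) shows why it only proves that a button was pressed, not that anyone was paying attention:

```python
import time

# Toy sketch of a train-style "alerter" (dead-man timer). Illustrates the
# point above: the check proves a control was touched within the timeout,
# not that the operator is actually alert. Timeout value is an assumption.
ALERT_TIMEOUT_S = 60.0


class Alerter:
    def __init__(self, timeout_s: float = ALERT_TIMEOUT_S):
        self.timeout_s = timeout_s
        self.last_input = time.monotonic()

    def any_control_input(self) -> None:
        """Any button press or control movement resets the timer."""
        self.last_input = time.monotonic()

    def should_penalty_brake(self) -> bool:
        """True once the operator has been silent past the timeout."""
        return time.monotonic() - self.last_input > self.timeout_s


# Demo with a tiny timeout so the behavior is visible immediately:
a = Alerter(timeout_s=0.05)
print(a.should_penalty_brake())   # False: timer just started
time.sleep(0.1)
print(a.should_penalty_brake())   # True: no input past the timeout
```

A reflexive tap calls `any_control_input()` and satisfies the system just as well as an attentive one - which is exactly the weakness described above.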


----------



## James Long

phrelin said:


> The difficulty over the next ten years is to avoid the temptation to "quit driving" even for a few minutes.


The current state of the art requires an alert driver. When the state of the art gets to the point where alert drivers are not needed people can fold up their steering wheels and quit driving.


----------



## phrelin

James Long said:


> The current state of the art requires an alert driver. When the state of the art gets to the point where alert drivers are not needed people can fold up their steering wheels and quit driving.


I don't think I'll live long enough to see that. But I am going to take a hard look at 2017 models to see what can help me reduce risks as an aging driver. And even assist features such as automatic parallel parking are starting to look awfully good.


----------



## dpeters11

There are some reports he might have actually been watching a movie.

http://www.reuters.com/article/tesla-autopilot-dvd-idUSL1N19N12U


----------



## dennisj00

phrelin said:


> I don't think I'll live long enough to see that. But I am going to take a hard look at 2017 models to see what can help me reduce risks as an aging driver. And even assist features such as automatic parallel parking are starting to look awfully good.


That's exactly why my friend bought the X. When I did the test drive several weeks ago a car pulled out from a fast-food place and the Tesla nicely slowed and resumed following the car without me touching the brake. My sister had an accident that totaled her car a few weeks ago that the Tesla would have avoided.

The parking was amazing. Pull slowly by the perpendicular slots, a prompt appears on the screen when you stop. Touch it and it backed into one of the slots, stopped halfway, pulled ahead to center and then backed within inches to the curb.


----------



## Nick

James Long said:


> [ ... ]
> 
> I do not know how robust Tesla's detection is ... but it seems easy to be able to rest a hand on a wheel and be detected even if one is not paying attention to where the car is going.


...or, worst case scenario -- driver's sound asleep, eating sandwich, with left hand resting on steering wheel and right hand loosely holding smart phone, ready to respond to next incoming text, probably from cops following close behind with red/blue lights flashing.

Don't try to challenge me on this, James -- it _could_ happen.


----------



## James Long

Nick said:


> Don't try to challenge me on this, James -- it _could_ happen.


Do I have your permission to agree with you?


----------



## Rich

I've been seeing reports of an autonomous electric car, a Tesla, going out of control and killing somebody. I don't have a link. I think I saw it on Gizmodo. I'll try to find a link.

Rich


----------



## yosoyellobo

Post no. 480 in this thread, Rich.


----------



## Rich

yosoyellobo said:


> Post no 480 Rich in this thread.


Oops, missed that post. Anyhow, here's a good _*link*_ to that story. If I could get my email notifications fixed this might not happen...

Rich


----------



## James Long

Always blame someone else. Just like blaming Tesla because the car did not save the driver or the driver for not noticing that the Tesla was not avoiding the accident.


----------



## Rich

Saw an article in the NY Daily News today that said Tesla's sales have dropped since the accident and they will not reach their forecast level this year. 

Rich


----------



## Drucifer

James Long said:


> The death is unfortunate, especially when the driver killed was a supporter and promoter of the technology. It appears that he relied on it often and allowed himself to become distracted. (Not that non-autonomous car drivers do not get distracted and take their hands off the wheel or look away.)
> 
> It is a setback ... but it is also a way forward. The industry needs to make sure that this does not happen again. A white truck against a bright sky is hard to discern - even when one is paying attention to what is ahead on the road. Ok Tesla, how do you do better?
> 
> With this driver's propensity to use a dash camera and post videos of his Tesla driving I wonder if there was video of this incident? No video was mentioned but it would provide excellent evidence of just how hard it would have been to see the truck before hitting it and tell the rest of the story.


They were using cameras instead of radar.


----------



## James Long

Drucifer said:


> They were using cameras instead of radar.


I read one report that the Tesla had radar but it was programmed to ignore overhead signs and apparently read the side of the semi as a sign.


----------



## dennisj00

Rich said:


> Saw an article in the NY Daily News today that said Tesla's sales have dropped since the accident and they will not reach their forecast level this year.
> 
> Rich


Since the death was only reported in the media this week, I doubt it affected last quarter's sales. They did miss their forecast: only 14,000+ instead of the forecast 17,000 - an average of 160 a day. And it also depends on who's making the forecast: analysts or internal?

Maybe mine will arrive a week or two early.


----------



## Drucifer

James Long said:


> I read one report that the Tesla had radar but it was programmed to ignore overhead signs and apparently read the side of the semi as a sign.


Heard the white side was picked up as sky.

Sounds like all the facts haven't been fully determined.


----------



## James Long

The camera read the white panel as sky.
The radar read the side of the truck as an overhead sign.
Both statements can be true ... one does not contradict the other.

The car went under the truck and out the other side ... which proves that there was a path under the obstruction. The path under just did not have sufficient clearance for the vehicle.

Describe an overhead sign on the highway and one may describe it as a large obstruction that one can pass under (although with adequate clearance).

It appears that the automation's problem was judging the clearance. Fix that problem and move on. Demonstrate that the car knows the difference between a low hanging sign and a not high enough trailer. And move on.
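As a thought experiment, the fix amounts to a clearance test along these lines (the vehicle height and margin are invented numbers, and real sensor fusion is vastly more involved than this):

```python
# Hypothetical sketch of the clearance judgment described above: a radar
# return should only be treated as a pass-under "overhead sign" if its
# bottom edge clears the vehicle. All numbers here are assumptions.
VEHICLE_HEIGHT_M = 1.45        # roughly a sedan's roof height (assumption)
SAFETY_MARGIN_M = 0.30         # extra clearance buffer (assumption)


def safe_to_pass_under(obstruction_bottom_m: float) -> bool:
    """Return True only if the obstruction's bottom edge clears the car."""
    return obstruction_bottom_m >= VEHICLE_HEIGHT_M + SAFETY_MARGIN_M


print(safe_to_pass_under(5.0))   # a typical highway sign: pass under
print(safe_to_pass_under(1.2))   # a trailer side at deck height: brake
```

The hard part, of course, is not the comparison but reliably estimating the obstruction's bottom edge from noisy sensor data in the first place.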


----------



## phrelin

It's interesting to watch Elon Musk operate. On July 5, *Fortune reported*:



> When Joshua Brown crashed and died in Florida on May 7 in a Tesla that was operating on autopilot-that is, Brown's hands were not on the wheel-the car company knew its duty. "Following our standard practice," Tesla said in a statement issued last Thursday, it "immediately" informed the National Highway Traffic Safety Administration about the accident.
> 
> So much for immediacy. The NHTSA sat on that news-of possible interest to the driving public, wouldn't you say?-until announcing it late last Thursday, June 30. That was almost eight weeks after the accident.
> 
> Tesla did something even more astounding. On May 18, eleven days after Brown died, Tesla and CEO Elon Musk, in combination (roughly three parts Tesla, one part Musk), sold more than $2 billion of Tesla stock in a public offering at a price of $215 per share - and did it without ever having released a word about the crash.
> To put things baldly, Tesla and Musk did not disclose the very material fact that a man had died while using an auto-pilot technology that Tesla had marketed vigorously as safe and important to its customers.


The fact that the stock tanked only temporarily - right at this moment it is at $216.95 - doesn't make the delay in making sure the public was aware of the information any less shady.

Tesla has, of course, publicly defended itself according to CNBC.

As far as the Autopilot safety issue goes, it's irrelevant. Musk's plan to combine Tesla Motors (which is also a battery manufacturer) with his solar installer SolarCity is more controversial in the world of financial analysts.

Unfortunately these financial manipulations should cause a moment's pause when looking at the brand overall. Historically, strong individual American innovators in the field of automobiles range from Henry Ford to Powel Crosley. Here's a picture of a 1939 Crosley:


----------



## dennisj00

I'm certainly not a big investor but isn't there a filing period (30 days or more) that a company / owner has to give notice of selling such a block of shares?

Edit: The CNBC article referenced above explains an SEC filing on May 10. . .


----------



## 4HiMarks

Almost all the trucks I see have at least some red and white reflective tape along the edges. Many semi trailers have it all the way around. How could the Tesla not be programmed to see that?


----------



## inkahauts

I saw an article today showing how a Tesla traveling 80 mph was launched into the air and actually rolled over. A semi evidently didn't see it and moved into its lane and hit it. It happened in Europe.

It was not using auto pilot. 

The man and his 8-year-old kid had minor bruises and scratches.

The inside was basically intact (air bags did deploy) with not a scratch while the outside was obviously messed up pretty bad after rolling over at least once. 

The guy said the car saved his life for sure... and his kids...

Musk pointed out its more proof that they are the safest car around...

They say they are extremely hard to roll over because the center of gravity is so low with the battery packs. Makes sense, and it would be a nice byproduct of batteries like that.

Look, if people freak out because one person died in a Tesla, they aren't thinking. Thousands die from car accidents every year. The question is: was he doing something he shouldn't have been? Seems the driver that died probably was, based on his videos, but until it's proven either way... we really like to rush to judge. Where are all these reports of what the radar and cameras saw coming from? Sources?


----------



## trh

http://electrek.co/2016/07/01/understanding-fatal-tesla-accident-autopilot-nhtsa-probe/

See the Tweet from Elon Musk as part of the article.


----------



## James Long

inkahauts said:


> The question is was he doing something he shouldn't have been?


The problem is that the car supported his actions. Tesla is not the only company to build in features that encourage not paying attention to the road and driving. Collision alert and automatic braking that take over control of the vehicle are nice features but they help people pay less attention to the environment. Don't forget to be the driver - not just the control passenger.

No automation is going to avoid every accident - no human is going to avoid every accident. But that doesn't mean that we should take the actuarial position, count deaths and injuries and say "good enough". We should always strive for better.


----------



## phrelin

The thing that drives many nuts, including Musk, is that while the article at the beginning says "the U.S. National Highway Traffic Safety Administration (NHTSA) launched a preliminary evaluation," the headline and a sentence near the end call it an NHTSA "_probe_", a prejudicial word whose sixth noun definition on Dictionary.com is "an investigation, especially by a legislative committee, of suspected illegal activity."

There is nothing illegal or untoward to "probe" but a lot to study and learn from. From the article, it is obvious that the unfortunately deceased driver understood this when he wrote on YouTube "A bigger danger at this stage of the development is getting someone too comfortable. You really do need to be paying attention at this point."

If we want to protect people from electronics causing accidents a good start would be to make any reception inside a vehicle of any outside radio signal of any kind impossible. That would eliminate smart phone distractions. :grin:


----------



## inkahauts

James Long said:


> The problem is that the car supported his actions. ......


Nope, that's not a problem at all. Not an excuse either. Otherwise you'd blame every manufacturer for any accident involving any car that wasn't doing the speed limit. The car shouldn't have been able to go that fast, so it's the car's fault? That would never fly.

That's almost like saying someone got killed because their seat belt wasn't on, so it's the car's fault because it shouldn't go out of park until it is sure everyone's seat belts are on.

Nope, the technology in that car is there to be used, but it isn't the final word. The driver in a Tesla is the final word.


----------



## trh

I thought the truck made a left turn in front of the Tesla and was cited for failure to yield?


----------



## James Long

phrelin said:


> .. call it an NHTSA "_probe_", a prejudicial word which the Dictionary.com sixth definition of the noun is "an investigation, especially by a legislative committee, of suspected illegal activity."


I would not get hung up on the word. Merriam-Webster has the definition as "a careful examination or investigation of something" ... so perhaps it is dictionary.com that is prejudicial.


----------



## James Long

inkahauts said:


> Nope that's not a problem at all. Not an excuse either. Otherwise you'd blame every manufacturer for any accident involving any car that wasn't doing the speed limit. Car shouldn't have been able to go that fast so it's the cars fault. That would never fly.


How does the car know the speed limit? Integrated GPS and mapping software with a governor? If a car manufacturer introduced such a feature and it failed (even through a database problem, where the speed limit changed and the database had not been updated), expect the car to be blamed.

A car without a speed limit control leaves compliance in the hands of the driver. Just like a car without lane control or automatic braking. When the manufacturer introduces safety features they accept full responsibility that those features will work.
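To illustrate the failure mode, here is a toy sketch of such a governor. Everything in it is invented for the example: the road segments, the limits, and the idea that the limit comes from a simple lookup table.

```python
# Hypothetical speed-limit governor. The "database" is a plain dict;
# a real system would use GPS plus a map provider, but the weakness
# is the same: the cap is only as good as the stored data.
SPEED_LIMIT_DB = {            # road segment -> posted limit (mph)
    "I-95 mile 12": 65,       # suppose this was lowered to 55 last month
    "Main St": 30,
}
DEFAULT_LIMIT = 55            # fallback when a segment is unmapped


def governed_speed(segment, requested_speed):
    """Cap the requested speed at the mapped limit for this segment."""
    limit = SPEED_LIMIT_DB.get(segment, DEFAULT_LIMIT)
    return min(requested_speed, limit)
```

If the posted limit on that first segment really was lowered and the table was never refreshed, the governor happily allows 65 where 55 is now the law, and the manufacturer's database, not the driver, takes the blame.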



inkahauts said:


> Nope technology in that car is able to be used but isn't the final word. The driver in tesla cars are the final word.


The introduction of the features is a promise of safety. That promise should not be broken.


----------



## inkahauts

It's not been broken. And you seem to miss my point. Cars today are developed to be able to do things, but that doesn't mean they get to make the final choices on what they do. They do a lot to help, but they are still just helping. It's still the driver's responsibility to make sure the car is driving as it should be.


----------



## James Long

It sounds like you do not believe the manufacturer bears any responsibility for their product.
I disagree with such a stance.


----------



## inkahauts

No, I didn't say that at all. I said that the manufacturer can add safety features till the cows come home and pigs fly air support; it doesn't change the fact that the driver is the one ultimately responsible for actually driving the car and making final judgments. Safety features are there to help warn you, not to control you. You control them.

This guy that got killed, in theory, decided he didn't want to be responsible and wanted the car to be, even though the car says it's not allowed to be in charge.

His judgment to let autopilot drive the car and not pay attention at all (if that's in fact what happened) is no different than letting a five-year-old drive the car and not paying attention, as far as I'm concerned. I refuse to make excuses for people who make terrible decisions. Blaming the car at all, if he wasn't paying any attention, is wrong.

That's not to say the car doesn't need to learn from this and get better. It absolutely does, but that doesn't negate whose fault it is.

Someday that will change but we aren't there yet.


----------



## phrelin

It would, of course, be entirely possible to design sensors to determine if the person in the driver's seat is in fact alert and paying attention, say once every 10 seconds. If the car determines the person is not doing what they are supposed to be doing, the car could locate a safe place, pull off the road and park. It could even determine if the person is unconscious or dead and notify someone. And if the person is not dead maybe the car could even print out a citation/notice-to-appear, emailing a copy to the court???

Golly gee, the possibilities are endless....
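Something like this toy loop, say. To be clear, the 10-second check is the only part taken from above; the sensor/car interfaces, the strike count, and every method name are pure invention:

```python
import time

ATTENTION_CHECK_PERIOD_S = 10  # check the driver every 10 seconds
MAX_MISSED_CHECKS = 3          # strikes before the car pulls over


def monitor_driver(sensor, car, period_s=ATTENTION_CHECK_PERIOD_S):
    """Poll a hypothetical attention sensor and react to inattention."""
    missed = 0
    while car.is_driving():
        if sensor.driver_is_attentive():
            missed = 0  # driver checked in; reset the strike count
        else:
            missed += 1
            car.warn_driver()
        if missed >= MAX_MISSED_CHECKS:
            # Enough strikes: find a safe spot, stop, and check on the driver.
            car.pull_over_safely()
            if not sensor.driver_is_responsive():
                car.notify_emergency_contact()
            break
        time.sleep(period_s)
```

The citation printer and the email to the court are left as an exercise for the reader.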


----------



## James Long

Tesla and other manufacturers need to be careful when adding features to their cars. When first introduced "auto drive" led to drivers exhibiting extremely poor judgement. Tesla made a responsible decision to turn off "auto drive" except on certain types of roads. Perhaps the world was not ready for that feature at all. There is also supposed to be a system in place to make sure the driver keeps a hand on the wheel. Apparently something went wrong.

I spoke earlier in this thread about the two approaches to autonomous driving. Google is one company taking the cautious approach ... asking for government permission to perform road testing and limiting the speed and operation of their vehicles until they can be proven safe. There is also off road testing underway where controlled testing environments are in use.

Tesla took the more arrogant approach of "there is no law against it" and introduced a feature that led to the death of one of their customers. They already took a PR hit when some of their customers misused the new feature. I am sure there will be further accountability for their actions.


----------



## Tom Robertson

James Long said:


> Tesla and other manufacturers need to be careful when adding features to their cars. When first introduced "auto drive" led to drivers exhibiting extremely poor judgement. Tesla made a responsible decision to turn off "auto drive" except on certain types of roads. Perhaps the world was not ready for that feature at all. There is also supposed to be a system in place to make sure the driver keeps a hand on the wheel. Apparently something went wrong.
> 
> I spoke earlier in this thread about the two approaches to autonomous driving. Google is one company taking the cautious approach ... asking for government permission to perform road testing and limiting the speed and operation of their vehicles until they can be proven safe. There is also off road testing underway where controlled testing environments are in use.
> 
> Tesla took the more arrogant approach of "there is no law against it" and introduced a feature that led to the death of one of their customers. They already took a PR hit when some of their customers misused the new feature. I am sure there will be further accountability for their actions.


I had an eye-opening moment listening to Google's experiences with their first tests, and how an employee stopped paying attention to get something from the back seat. Google and I knew then that the only real next step was fully autonomous (though some safety sensors and self-defense mechanisms could be a small interim step). Had I been Mr. Musk, I would have realized people are people. If there is nothing to hold their attention, they won't pay attention (as a general rule).

So I would have stopped the beta test even before it was released, exactly as Google did.

Now, I don't know if Mr. Musk saw the Google experiences before Tesla released their version; I don't know if he ignored it; or exactly what happened.

Peace,
Tom


----------



## inkahauts

phrelin said:


> It would, of course, be entirely possible to design sensors to determine if the person in the driver's seat is in fact alert and paying attention, say once every 10 seconds. If the car determines the person is not doing what they are supposed to be doing, the car could locate a safe place, pull off the road and park. It could even determine if the person is unconscious or dead and notify someone. And if the person is not dead maybe the car could even print out a citation/notice-to-appear, emailing a copy to the court???
> 
> Golly gee, the possibilities are endless....


Exactly. At some point people need to take responsibility no matter how many safety features are on a car. And until they say the car is fully self-driving, the buck stops with the driver.


----------



## inkahauts

James Long said:


> Tesla and other manufacturers need to be careful when adding features to their cars. When first introduced "auto drive" led to drivers exhibiting extremely poor judgement. Tesla made a responsible decision to turn off "auto drive" except on certain types of roads. Perhaps the world was not ready for that feature at all. There is also supposed to be a system in place to make sure the driver keeps a hand on the wheel. Apparently something went wrong.
> 
> I spoke earlier in this thread about the two approaches to autonomous driving. Google is one company taking the cautious approach ... asking for government permission to perform road testing and limiting the speed and operation of their vehicles until they can be proven safe. There is also off road testing underway where controlled testing environments are in use.
> 
> Tesla took the more arrogant approach of "there is no law against it" and introduced a feature that led to the death of one of their customers. They already took a PR hit when some of their customers misused the new feature. I am sure there will be further accountability for their actions.


Arrogant approach? I don't see that at all. Pushing the edge, sure, maybe. But I don't see anything wrong with what they are doing. They told people the proper way to use the autopilot. And you'd hope that someone who has the money to afford this car would pay attention to what they were told about how it works and drive properly. But there will always be people who don't care what you tell them and do what they want. Otherwise we wouldn't have accidents at all.

You just keep blaming Tesla for having this feature. It's not Tesla's fault; they created something that has been on the market for years and has caused one death because someone didn't use it properly. Look at all the people who've died from the air bags and the sudden acceleration that hit Toyota and Lexus and so forth. And the rollover bad-tire issue years ago. That's when the manufacturer is at fault: when they build something that doesn't perform as it's supposed to. Not when people don't perform as they are supposed to. Then it's their fault.


----------



## inkahauts

Tom Robertson said:


> I had an eye opening moment listening to Google's experiences with their first tests. And how an employee stopped paying attention to get something from the backseat. Google and I knew then that the only real next step was fully autonomous (though some safety sensors and self defense mechanisms could be a small interim step.) Had I been Mr. Musk, I would have realized people are people. If there is nothing to hold their attention, they won't pay attention (as a general rule.)
> 
> So I would have stopped the beta test even before it was released, exactly as Google did.
> 
> Now, I don't know if Mr. Musk saw the Google experiences before Tesla released their version; I don't know if he ignored it; or exactly what happened.
> 
> Peace,
> Tom


There's no way we get to true fully autonomous vehicles without what Tesla is doing. None. There's too much to learn and deal with in the real world that you can't learn by just trying in super-controlled environments. And yes, Google's experiment up north on regular roads is super controlled.

Plus, Tesla is pushing everyone else to get better at it, including themselves.


----------



## Tom Robertson

inkahauts said:


> There's no way we get to true fully autonomous vehicles without what tesla is doing. None. There's to much to learn and deal with in the real world you can't learn by just trying in super controlled environments. And yes Google's experiment up north on regular roads is super controlled.
> 
> Plus tesla is pushing everyone else to get better at it including themselves.


Why do you think that? Google is going to get there without doing what Tesla did. They learned from their experiences in the testing environment and realized the error of their ways. Why couldn't Tesla?

And do you think Tesla is a big enough competitor to Google that Google is at all pressured to move faster? Google wants to see safer cars first. Tesla seems to want sales first.

Peace,
Tom


----------



## James Long

inkahauts said:


> Arrogant approach? I don't see that at all. Pushing the edge sure maybe. But I don't see anything wrong with what they are doing. They told people the proper way to use the autopilot. And you'd hope that someone who has the money to afford this car would pay attention to what they were told about how it works and drive properly.


We have video evidence to the contrary. Tesla owners had to be pulled back from their own choices and have autopilot limited from the initial release.

I suppose we could install paintball launchers on vehicles and tell operators not to use them in an unsafe manner. And then excuse everyone but the operator when someone actually uses the launcher in a road rage incident. Sure, we'll provide instructions to only use water based paint that washes off and use the launcher off road with willing targets. That should be enough to avoid responsibility. 

Tesla took a risk. They lost. How much their loss will cost them is yet to be determined. Hopefully it will not affect responsible manufacturers.

Are we increasing safety or encouraging complacency?


----------



## Tom Robertson

James Long said:


> We have video evidence to the contrary. Tesla owners had to be pulled back from their own choices and have autopilot limited from the initial release.
> 
> I suppose we could install paintball launchers on vehicles and tell operators not to use them in an unsafe manner. And then excuse everyone but the operator when someone actually uses the launcher in a road rage incident. Sure, we'll provide instructions to only use water based paint that washes off and use the launcher off road with willing targets. That should be enough to avoid responsibility.
> 
> Tesla took a risk. They lost. How much their loss will cost them is yet to be determined. Hopefully it will not affect responsible manufacturers.
> 
> Are we increasing safety or encouraging complacency?


Would these be belt-fed or hopper-fed paintball launchers? 

I like your comparisons of the two companies and their approaches. Well said.

Peace,
Tom


----------



## inkahauts

Tom Robertson said:


> Why do you think thus? Google is going to get there without doing what Tesla did. They learned from their experiences in the testing environment and realized the error of their ways. Why couldn't Tesla?
> 
> And do you think Tesla is a big enough competitor to Google that Google is at all pressured to more faster? Google wants to see safer cars first. Tesla seems to want sales first.
> 
> Peace,
> Tom


In what world has Google ever had a car doing 80 come across a situation identical to how that guy ended up dead? They haven't. The only reason they may even start to really look at it is because of what happened. Frankly, everybody is likely to be helping everyone else in some ways on this technology. Everyone will benefit from everyone else's mistakes and advances, as all cars always have. Otherwise most cars wouldn't have air bags...

And Google isn't in competition with Tesla or doing what they are. Google wants a car that has no steering wheel and to license the tech to everyone. Tesla wants to build a car that's safer than everyone else's, then also add autopilot and use the feedback from real-world driving to help alter and create better software. Everyone suddenly claiming one accident means Tesla has failed and Google's method is better is absurd. There's nowhere near enough info to make that kind of judgment. Ask again in ten years and then we will see, maybe.

Heck what will happen the first time someone gets killed in a car powered by Google software after their perceived far longer gestation period? Will they call the entire concept a total failure since Google after years still couldn't prevent a death?

My real point is Google can't perfect their software until it's in real conditions, with actual regular cars on freeways and highways alongside other non-autonomous cars, no matter how much they test the way they are now. They will have issues when it gets unleashed.

Google doesn't want to release until it's a far different type of system than Tesla's is today... Tesla isn't trying to hit a home run with its first at bat. Google will be.

Personally I find it's a lot harder to hit home runs don't you?


----------



## yosoyellobo

I don't ever expect Google or any other software to be perfect. That is why I don't really fear the upcoming robot revolution. If I am wrong you could sue my life estate.


----------



## inkahauts

James Long said:


> We have video evidence to the contrary. Tesla owners had to be pulled back from their own choices and have autopilot limited from the initial release.
> 
> I suppose we could install paintball launchers on vehicles and tell operators not to use them in an unsafe manner. And then excuse everyone but the operator when someone actually uses the launcher in a road rage incident. Sure, we'll provide instructions to only use water based paint that washes off and use the launcher off road with willing targets. That should be enough to avoid responsibility.
> 
> Tesla took a risk. They lost. How much their loss will cost them is yet to be determined. Hopefully it will not affect responsible manufacturers.
> 
> Are we increasing safety or encouraging complacency?


Tesla lost? What did they lose exactly? Being the first company to have a death when the car wasn't completely dependent on the driver? You think Google and everyone else won't have that happen someday? You'd be naïve to think so.

It's a sad thing, but we all had to know someone would die someday from an autopilot feature on a car. Tesla was out first and therefore seemed like the company it'd happen to first, and it did. Was it irresponsible of the car manufacturers the first time an air bag killed a person who wasn't buckled into their seat properly because it wasn't explained well enough?

Paint guns, really? Too funny. Might as well have made an argument that any new gun control laws also need to outlaw people from buying advanced fighter jets because they can be as dangerous as a gun...

As for complacency or safety... probably both... but that began when people started relying on cell phones and smart phones over home phones, calculators, and a notepad and a pen.


----------



## inkahauts

yosoyellobo said:


> I don't ever expect Google or any other software to be perfect. That is why I don't really fear the upcoming robot revolution. If I am wrong you could sue my life estate.


How dare you be reasonable...


----------



## Tom Robertson

inkahauts said:


> In what world has Google ever had a car doing 80 come across a situation that was identical to how that guy ended up dead? They haven't. The only reason they may even start to really look at it is because of what happened. Frankly everybody is likely to be helping everyone else in some ways on this technology. Everyone will be ifit from everyone else's mistakes and advances as all cars always have. Otherwise most cars wouldn't have air bags...


Actually, it was in the real world. Google worked on highway driving first; it's much easier. That is when they realized humans generally can't pay attention if they don't need to.



inkahauts said:


> And Google isn't in competition with tesla or doing what they are. Google wants a car that has no steering wheel and to license the tech to everyone. Tesla wants to build a car that's safer than everyone else and then also add autopilot and is using the feedback to help alter and create better software from real world driving. Everyone suddenly claiming one accident means tesla has failed and Google's method is better is absurd. There's no where near enough info to make that kind of judgement. Ask again in ten years and then we will see, maybe.


'Twas you that indicated Tesla is pressuring Google and others. My reply was to that point.



inkahauts said:


> Heck what will happen the first time someone gets killed in a car powered by Google software after their perceived far longer gestation period? Will they call the entire concept a total failure since Google after years still couldn't prevent a death?
> 
> My real point is Google can't perfect their software till it's in real conditions with actual regular cars on freeways and highways with other non autonomous cars no matter how much they test the way they are now. They will have issues when it gets unleashed.
> 
> Google doesn't want to release until it's a far different type of system than teslas is today... tesla isn't trying to hit a home run with its first at bat. Google will be.
> 
> Personally I find it's a lot harder to hit home runs don't you?


My point is Tesla was irresponsible to release a feature that is patently unsafe insofar as it relies on humans paying attention as if watching paint dry. Humans generally can't do it.

Yes, people will die in Google cars. The promise is that fewer will die in Google cars than in normal cars. Let us celebrate that, rather than look for the impossible perfect car.

As for the sports metaphor, Tesla is playing baseball with rocket propelled balls, no batter's helmet, no catcher's mask or padding, and no gloves for the players on the field. More dangerous than regular baseball. The only choice is to hit it out of the park.

Or we could skip the meaningless metaphor altogether, as this isn't a percentage-of-attempts game. The goal is to make cars safer. After these experiences, there are very few ways to release partial feature sets that are safer. Anything that requires the human to pay attention for long periods of semi-autonomous driving will be less safe, and thus irresponsible to release.

So Google has the right idea: test, test, test. First in controlled environments, then in less controlled environs, all the way up to fully uncontrolled, normal, everyday driving. It will take time, but unlike hitting a home run, this is more like a marathon. You map the plan, set the milestones and provisions along the way, and keep moving forward to success.

Peace,
Tom


----------



## James Long

inkahauts said:


> Was it irresponsible for the car manufactures the first time an air bag killed a person who wasn't buckled into their seat properly because it wasn't explained well enough?


The first time? That depends on how predictable that death was. There was an abundance of evidence that Tesla's autopilot was being misused (to the point where they changed the software). Once people without seatbelts were killed by airbags, manufacturers took action: disabling airbags when seatbelts were not in use. An appropriate response. Did Tesla go far enough in their response to evidence of dangerous driving?



inkahauts said:


> Paint guns really? To funny. Might as well have made an argument that we need any new gun control laws to also include outlawing people from buying advanced fighter jets because they can be as dangerous as a gun...


Can be? I should think they are as dangerous as a gun. Just like a Tesla in the hands of a distracted driver. Fortunately this particular bullet didn't strike a school bus or van load of kids after the collision with the semi. The accident could have been worse.


----------



## Tom Robertson

inkahauts said:


> Tesla lost? What did they lose exactly? The first company to have a death when the car wasn't completely dependent on the driver? You think Google and everyone else won't have that happen someday? You'd be naïve to think
> 
> It's a sad thing, but we all had to know someone would die someday from an auto pilot feature on a car. Tesla was out first and therefore seemed like the company it'd happen to first and it did. Was it irresponsible for the car manufactures the first time an air bag killed a person who wasn't buckled into their seat properly because it wasn't explained well enough?
> 
> Paint guns really? To funny. Might as well have made an argument that we need any new gun control laws to also include outlawing people from buying advanced fighter jets because they can be as dangerous as a gun...
> 
> As for complacency or safety... probably both... but that begin when people started trying on cell phones and smart phones over home phones and calculators and a note pad and a pen.


Directly, Tesla lost market value. And they may have some SEC violations awaiting.

They have a perception problem with some potential owners. And a PR problem.

Not from the fact that someone died in their car, but that someone died because they released software that is perhaps negligently unsound. They can't rely on humans paying attention and the software isn't ready for fully autonomous driving.

The next potential losses will come after the NHTSA findings. They might find Tesla was irresponsible for assuming humans can stay awake with nothing to do. Certainly Google's experiences should have been a lesson and warning. And they might find Tesla liable for damages. Or let the courts decide that, should the family sue.

This isn't a game of sandlot baseball where nobody keeps score, everyone has a few beers afterward, and plays for bar honors. 

Peace,
Tom


----------



## Tom Robertson

James Long said:


> The first time? That depends on how predicable that death was. There was an abundance of evidence that Tesla's autopilot was being misused (to the point where they changed the software)....


Bingo! Negligence is continuing to do or not do in light of known dangers. Even if Tesla hadn't learned from Google, they showed they learned from their own fan videos. Yet not enough to protect drivers.

I'm not saying this accident was preventable via an awake driver or with better software. I'd like to think it would have been, yet I realize accidents will happen. I'm saying it feels negligent to release something that is less than fully autonomous and yet has known issues.

Peace,
Tom


----------



## dennisj00

First, let me say I'm in the Tesla camp because I've ordered one (several weeks until delivery), and I have driven both the S and X over 100 miles in autopilot.

From that experience, I don't ever see a fully autonomous car (Google) with no steering wheel or pedals on our roads until they have communication with each other. That will take decades to achieve, no matter the safety record.

From my experience driving Tesla's AP, it takes some confidence building, much like sitting in the passenger seat and depending on your spouse or significant other or just a driver you're not familiar with, to get used to lane centering, acceleration, and handling you wouldn't be comfortable with yourself. But once you realize it's really in control, and monitor that control, it does a great job.

Would I climb in the back seat? No. It does nag you to put your hands back on the wheel. Haven't tested if it shuts down if you don't.

I don't think it's a loss for Tesla. I think they added to the data points that they're sharing with the NTSB. There will always be accidents that may not be preventable, whoever is driving. If the system cuts the carnage by 50%, who loses?

I've gathered all my board games: Scrabble, backgammon, Monopoly. Unfortunately Monopoly won't fit in the glove box and will probably get spread over the back seat!

Looking forward to becoming an owner.


----------



## James Long

I do not consider "autodrive" to be a safety feature. To be a safety feature it has to be better than the alternative. Is autodrive safer than an attentive, alert driver? Or is it just safer than the average distracted driver?

If the car sees that it is better to speed up or slow down or change lanes why didn't the human driver make that choice? Is the car paying more attention to the road than the driver?


Not my brand, but I can't get this commercial out of my mind:
https://www.youtube.com/watch?v=eFkTEwtxX5c


----------



## billsharpe

Truly self-driving cars will only work if ALL the cars on the road are self-driving. At least at this stage of development, there are too many poor drivers around for software to handle all the mistakes those drivers make.

I turn off cruise control when traffic is heavy. In the LA area it's always off. It's only on during long trips out of town.


----------



## Tom Robertson

billsharpe said:


> Truly self-driving cars will only work if ALL the cars on the road are self-driving. At least at this stage of development, there are too many poor drivers around for software to handle all the mistakes those drivers make.
> 
> I turn off cruise control when traffic is heavy. In the LA area it's always off. It's only on during long trips out of town.


I (and Google) disagree.

Though I bet we agree that having humans behind the wheels of some cars on the road makes the whole problem much more difficult. As do bicycles, scooters/wheelchairs, etc.

That's why the computing power in these cars is pretty impressive. They have many competing heuristics to balance when looking for safe passage while traveling faster than 5 mph.

Listening to and reading the commentaries of the engineers is incredibly insightful. They run into situations no one would think of on their own. Millions of miles to go before Google is ready (and they know it). 

Peace,
Tom


----------



## Tom Robertson

One more thought from Bill Sharpe's comment. 

I also agree that some of the promise of fully autonomous cars - closer packing of cars on the road, for instance - can't really happen until nearly all the cars are fully autonomous and conversing.

Peace,
Tom


----------



## yosoyellobo

I suppose that to Google, driving at 65 might in the long run be easier than crawling along at 25. Fewer kids, animals, lawn mowers, etc.


----------



## James Long

Fewer inputs, processed faster? There's less time to decide whether the object detected ahead is a threat that needs to be avoided by stopping or veering around.

For now the slower speeds allow the human "monitors" to make sure the computers are right. Life happens fast at 65 MPH.


----------



## Tom Robertson

James Long said:


> Fewer inputs, processed faster? There's less time to decide whether the object detected ahead is a threat that needs to be avoided by stopping or veering around.
> 
> For now the slower speeds allow the human "monitors" to make sure the computers are right. Life happens fast at 65 MPH.


If one considers environments rather than speed, highway driving is a much simpler problem than stop and go city traffic. Far fewer moving parts, mostly moving in the same direction. 

Google started first with highways and had that down before moving to city streets. It had been their plan to release a semi-autonomous mode for highway driving--until they learned humans can't watch paint dry. 

Peace,
Tom


----------



## James Long

The fatality rate is much higher at 65 MPH than 25 MPH. Side impact fatality rate is fairly high at 45 MPH. Of course, fatality rates only matter if there is an accident. But a cautious approach isn't a bad thing.

A driver makes a mistake ... what are the repercussions? At 25 MPH slight injury is more likely than death. At 65 MPH death becomes more likely. Each speed has its own challenges, but the potential penalty is higher at higher speeds.

Some of the rural interstates I have driven were like watching paint dry. Cruise control helps me not drift up and down in speed (mostly up trying to get that segment of road over with). If the road gets too boring I'll change to a state route that is more interesting to drive.


----------



## inkahauts

James Long said:


> I do not consider "autodrive" to be a safety feature. To be a safety feature it has to be better than the alternative. Is autodrive safer than an attentive, alert driver? Or is it just safer than the average distracted driver?
> 
> If the car sees that it is better to speed up or slow down or change lanes why didn't the human driver make that choice? Is the car paying more attention to the road than the driver?
> 
> Not my brand, but I can't get this commercial out of my mind:


Your argument makes no sense because you assume everyone is always operating their car properly. And we have seen videos where the Tesla was able to swerve and avoid an accident. Some have said, yes, but look at how far it swerved, etc. I say, show me any human who always knows when and how far over they can swerve without hitting another vehicle, the way a car using all its sensors constantly does.

Your argument would basically say that air bags don't add to safety either, because a perfectly alert person would never get in an accident - which is of course the best alternative to having an air bag.

Autopilot is a better alternative to someone falling asleep at the wheel or getting distracted by someone spilling a coffee, or any number of other things that can happen - not to mention it has better reflexes than some of the much older people who drive...

I would like to know how they have set up the cars for dealing with things like slick and icy roads. They might be better than some people who don't know how to steer if they start hydroplaning, etc. - another possible safety feature.

Autopilot can be a big safety feature for anyone who is too tired to be driving and is dozing off... That's an obvious safety feature too, over someone just losing control.


----------



## inkahauts

Tom Robertson said:


> Bingo! Negligence is continuing to do or not do in light of known dangers. Even if Tesla hadn't learned from Google, they showed they learned from their own fan videos. Yet not enough to protect drivers.
> 
> I'm not saying this accident was preventable via an awake driver nor with better software. I'd like to think it would be, yet I realize accidents will happen. I'm saying it feels negligent to release something less than fully autonomous and yet has known issues.
> 
> Peace,
> Tom


I have never once seen Tesla say that people should use AP and not pay attention to driving, so it's the drivers who are negligent, is it not? And at what point do you say, I have done enough to protect drivers and a few are still being idiots, so take away their driver's license? Not the feature - their personal driver's license. That's what we do when someone drives 100 mph on a freeway and gets caught, isn't it? (It is, or can be, in California.) How is doing that any different (and frankly not far, far more dangerous, usually) than using AP?

Every argument you and James have made has really been: not all drivers are smart enough to use it, so they shouldn't make it available yet. I am tired of people making excuses for idiots in general, and this is just one more example. I am sorry, but the guy who died was being foolish. He proved that by showing he so often didn't pay attention to the car, which is directly against how he was told to use it. That's HIS fault, not Tesla's. It's like the motorcyclist doing a wheelie on a freeway at 65 mph. If you die, it's your fault for doing something you know you shouldn't do. Should they stop building motorcycles because people can do that with them?

Can Tesla do a few things to maybe help this situation? Sure, you can always do better, but that doesn't mean what they have done is bad or wrong. Change the name to Pilot Assist, not Autopilot... Maybe if you take your hands off the wheel for too many minutes too often over a short period of time, automatically pull the car over, park, and disengage the AP ability for the rest of the day. Punish the driver for being stupid... You could come up with more, but really, it's still all about the driver and not the car.

Have you ever used adaptive cruise control? Do you think that shouldn't be allowed either because it's not fully autonomous? I don't know that I ever want a fully autonomous car that I can't take over control from... I don't expect anyone will go straight to that format without first having the exact same software in a car that has controls for the person. And I also wouldn't be surprised to see a car without human controls first be limited to 35 mph or less.


----------



## James Long

inkahauts said:


> Your argument makes no sense because you assume everyone is always operating their car properly.


Perhaps you should present YOUR argument in favor of Tesla's arrogance instead of failing at presenting mine? You are not doing a good job of representing my views.

BTW: Are you claiming that the car is always operating properly? If so, perhaps you should read Tesla's statement where they admitted that the car did not detect and avoid the collision.

Sure humans make mistakes. One of them is assuming that Tesla's auto drive is safe.

A "safety feature" that provides a false sense of security and enables people to pay less attention to the most important thing that they are doing ... piloting a vehicle. That definition of safe is not in my dictionary.


----------



## trh

I want to see the full accident report. The truck driver failed to yield while turning left and crossed the road the Tesla was on. Obviously the Tesla driver wasn't paying attention, as he didn't brake. But could he have stopped in time had he been paying attention? Was there enough time for any vehicle or driver to stop?

I've read the truck driver was cited for failure to yield. If so, he is the reason for this accident - not the Tesla driver or Tesla.


----------



## inkahauts

James Long said:


> Perhaps you should present YOUR argument in favor of Tesla's arrogance instead of failing at presenting mine? You are not doing a good job of representing my views.
> 
> BTW: Are you claiming that the car is always operating properly? If so, perhaps you should read Tesla's statement where they admitted that the car did not detect and avoid the collision.
> 
> Sure humans make mistakes. One of them is assuming that Tesla's auto drive is safe.
> 
> A "safety feature" that provides a false sense of security and enables people to pay less attention to the most important thing that they are doing ... piloting a vehicle. That definition of safe is not in my dictionary.


Since when has a safety feature ever been touted as letting you not do your job properly? I've never seen any company say the alerts for when cars are in the lane next to you in the blind spot of a mirror means you don't need to still follow the proper procedure and look over your shoulder and out the window to verify if there is a car there. Can you point to someone ever saying that? It's the EXACT same thing. No matter how much you want to argue otherwise.

AP is not a fully automated driving system. They have stated that and told you how to properly use it...

And I find it so funny that you think I'm defending their arrogance. First, I don't think what they are doing is arrogant; maybe you should check what that word actually means.

Second, I'm saying that no matter what, the driver was NOT doing what he was supposed to be doing, so don't blame the entire thing on Tesla. I never said their system didn't fail to detect the truck - it's an issue they can learn from; I said that. But that doesn't take away the driver's responsibility in this, not in the least. You seem to think the only cause of this was Tesla. I can't see how anyone could rationally think that if they understand the concept of being responsible in how they operate cars.


----------



## inkahauts

trh said:


> I want to see the full accident report. The truck driver failed to yield while turning left and crossed the road the Tesla was on. Obviously the Tesla driver wasn't paying attention, as he didn't brake. But could he have stopped in time had he been paying attention? Was there enough time for any vehicle or driver to stop?
> 
> I've read the truck driver was cited for failure to yield. If so, he is the reason for this accident - not the Tesla driver or Tesla.


Completely agree. But many will still say that since it was on ap the car should have been able to avoid the accident no matter what. Which just isn't true.


----------



## dennisj00

I have to agree with Ink's comments. From the 100 or so miles I've driven under AP, it's done a very good job, but I will not avert my attention. It continues to remind you to put your hands on the wheel, and you certainly can't leave the seat.

To blame Tesla for these accidents would implicate any manufacturer in any accident, no matter what the idiot driver did.

And I don't agree with the 'arrogance' charge against Tesla. There's a clear warning that it is beta software and should be used with caution. Consumer Reports has called for them to rein it in; Consumer Reports should stay out of it. I doubt they've really tested it without bias.


----------



## Tom Robertson

inkahauts said:


> I have never once seen Tesla say that people should use AP and not pay attention to driving, so it's the drivers who are negligent, is it not? And at what point do you say, I have done enough to protect drivers and a few are still being idiots, so take away their driver's license? Not the feature - their personal driver's license. That's what we do when someone drives 100 mph on a freeway and gets caught, isn't it? (It is, or can be, in California.) How is doing that any different (and frankly not far, far more dangerous, usually) than using AP?
> ...


So you're saying AP is dangerous? 

This is not an all or nothing game. Both Tesla and the drivers can be negligent.



inkahauts said:


> ...
> Every argument you and James have made has really been: not all drivers are smart enough to use it, so they shouldn't make it available yet. I am tired of people making excuses for idiots in general, and this is just one more example. I am sorry, but the guy who died was being foolish. He proved that by showing he so often didn't pay attention to the car, which is directly against how he was told to use it. That's HIS fault, not Tesla's. It's like the motorcyclist doing a wheelie on a freeway at 65 mph. If you die, it's your fault for doing something you know you shouldn't do. Should they stop building motorcycles because people can do that with them?
> ....


Actually, in my argument, intelligence is not a factor. My argument is that humans are not evolved to sit and watch paint dry for extended time periods. Humans generally won't pay attention to the driving going on around them if they are not actually involved in the driving. (Heck, they don't pay attention when they are involved!) The lack of intelligence, or wisdom, is in ignoring how humans generally can't sustain the necessary attention.



inkahauts said:


> ...
> Can Tesla do a few things to maybe help this situation? Sure, you can always do better, but that doesn't mean what they have done is bad or wrong. Change the name to Pilot Assist, not Autopilot... Maybe if you take your hands off the wheel for too many minutes too often over a short period of time, automatically pull the car over, park, and disengage the AP ability for the rest of the day. Punish the driver for being stupid... You could come up with more, but really, it's still all about the driver and not the car.
> 
> Have you ever used adaptive cruise control? Do you think that shouldn't be allowed either because it's not fully autonomous? I don't know that I ever want a fully autonomous car that I can't take over control from... I don't expect anyone will go straight to that format without first having the exact same software in a car that has controls for the person. And I also wouldn't be surprised to see a car without human controls first be limited to 35 mph or less.


Cruise control, adaptive or otherwise, does reduce the amount of awareness the driver needs to maintain. Yet the driver still has to maintain enough awareness to steer, at a much higher cognitive level than keeping the appropriate speed. Now, some people will fall asleep more easily with cruise control engaged; they should not use cruise control. Yet since the general populace doesn't fall asleep from using cruise control, it is OK.

Tesla's feature relies on humans doing something most people can't do. It's akin to having fixed-position seats in the car, set for the tallest person. Sure, some people could drive the car safely - but most can't drive it safely that way. Therein is the problem: relying on something humans can't naturally do (nor are likely to develop as a skill).

Google gets it. Tesla almost gets it--they adjusted. Yet they stubbornly cling to the notion that it is reasonable to have inactive humans pay attention while the paint dries.

You also seem to ignore Google's other findings. Highways were an easy problem to solve (OK, relatively speaking); they had that working originally. Local, in-city driving is the hard one to solve, and it is taking much, much longer because there are so many more variables and moving pieces.

Peace,
Tom


----------



## Tom Robertson

dennisj00 said:


> I have to agree with Ink's comments. From the 100 or so miles I've driven under AP, it's done a very good job but I will not avert my attention. It continues to remind you to put your hands on the wheel and you certainly can't leave the seat.
> 
> To blame Tesla for these accidents would implicate any manufacturer in any accident, no matter what the idiot driver did.
> 
> And I don't agree with the 'arrogance' of Tesla. There's a clear warning that it is beta software and should be used with caution. Consumer Reports has called for them to rein it in. Consumer Reports should stay out of it. I doubt they've really tested it with no bias.


Try another couple hundred miles. Let's see if you maintain the same level of attention with AP driving as you do when driving yourself, after you've let AP handle things longer.

The arrogance on Tesla's part is assuming that humans generally can pay attention while paint dries. They can't. Google figured it out. Tesla thought they had a workaround - but they don't.

Peace,
Tom


----------



## dennisj00

Tom Robertson said:


> Try another couple hundred miles. Let's see if you maintain the same level of attention with AP driving as you do when driving yourself, after you've let AP handle things longer.
> 
> The arrogance on Tesla's part is assuming that humans generally can pay attention while paint dries. They can't. Google figured it out. Tesla thought they had a workaround - but they don't.
> 
> Peace,
> Tom


Give me a couple of weeks as an owner. . . as opposed to many opinions here that have never used AP.

As I said earlier, Google's control-less cars will never happen unless they're in a closed sandbox. Decades away.

It doesn't matter what 'safety feature' you discount. People still don't wear seat belts.


----------



## inkahauts

Tom Robertson said:


> Try another couple hundred miles. Let's see if you maintain the same level of attention with AP driving as you do when driving yourself, after you've let AP handle things longer.
> 
> The arrogance on Tesla's part is assuming that humans generally can pay attention while paint dries. They can't. Google figured it out. Tesla thought they had a workaround - but they don't.
> 
> Peace,
> Tom


I can easily say it's arrogant for people to always want to design for the lowest common denominator, like the idiot who cannot understand how to still pay attention to what the car is doing while on AP. If he's that much of an idiot, the people around him are probably still safer with the car on AP than with him in full control!

And Google hasn't figured out anything more or less than Tesla, from what I've read or heard or seen... Different approaches don't make one or the other better.

A radio host has been saying something lately that I must agree with: why is it people think that if you like one thing you must hate all others? I don't think there is anything wrong with either company's way of doing things, and they will eventually lead both companies to the same destination, but Google's IMHO is not any safer than Tesla's.

In fact, IMHO it might cause Google to believe they are safer than they are when they finally fully unleash their systems on regular drivers. But I'm hoping that doesn't happen.

But needless to say, this isn't a case where I believe anyone can make an argument that either one of these companies is doing it much better than the other.


----------



## Tom Robertson

dennisj00 said:


> Give me a couple of weeks as an owner. . . as opposed to many opinions here that have never used AP.
> 
> As I said earlier, Google's control-less cars will never happen unless they're in a closed sandbox. Decades away.
> 
> It doesn't matter what 'safety feature' you discount. People still don't wear seat belts.


You've set up a false equivalence in that last argument - that all "safety features" are the same.

And for that matter, argued for the use of one safety feature by describing how people don't use another. Some people won't use AP at all. How does that affect whether or not it is a wise feature to include in a car? 

Peace,
Tom


----------



## Tom Robertson

inkahauts said:


> I can easily say it's arrogant for people to always want to design for the lowest common denominator, like the idiot who cannot understand how to still pay attention to what the car is doing while on AP. If he's that much of an idiot, the people around him are probably still safer with the car on AP than with him in full control!
> 
> And Google hasn't figured out anything more or less than Tesla, from what I've read or heard or seen... Different approaches don't make one or the other better.
> 
> A radio host has been saying something lately that I must agree with: why is it people think that if you like one thing you must hate all others? I don't think there is anything wrong with either company's way of doing things, and they will eventually lead both companies to the same destination, but Google's IMHO is not any safer than Tesla's.
> 
> In fact, IMHO it might cause Google to believe they are safer than they are when they finally fully unleash their systems on regular drivers. But I'm hoping that doesn't happen.
> 
> But needless to say, this isn't a case where I believe anyone can make an argument that either one of these companies is doing it much better than the other.


"Lowest common denominator?" Are you trying to claim "most people" could pay attention while not having to do any of the driving? For safety's sake, how high a percentage of people would you expect as the basis for a feature that is in beta test? Especially considering we're talking about human lives on the line.

Why do you presume that I hate the other approach simply because I like the Google approach? Is it possible I did some (actually rather basic) analysis and realized that Google was spot on when they recognized that humans aren't geared for long-term attention when they are not engaged? Take away the engagement, and humans will stop paying attention.

You may be one of the few who can continue to pay attention day after day of beta testing AP. That doesn't mean most people can. What if only 50% of people can? Wouldn't that mean the feature is too unsafe for general use in a beta test?

Seems pretty arrogant to me for Tesla to toss this feature out there in beta test without some basic human factors measurements.

Peace,
Tom


----------



## James Long

inkahauts said:


> Since when has a safety feature ever been touted as letting you not do your job properly?


Apparently you have missed the Jim Gaffigan commercial where the car does all of the driving (automatic parking) while he pays no attention to its driving. And the commercial with the dad dropping off his daughter at school getting cut off by a convertible. Or the commercial with the two women ogling guys instead of watching the road. Or the one where a teenage boy does the same.

I'd say the latter examples (auto braking) are all examples of automakers selling a safety feature as a way for drivers to not be as attentive as they should be. The first example (auto park) isn't a safety feature, but it is an example of taking a complex driving skill out of the driver's hands and advertising that the driver does not even need to pay attention.

Lane control (alerting the driver that they are drifting - possibly even steering back into the lane) would be a safety feature but auto drive where the car makes lane changes is not a safety feature.

Safety features make driving safer. If you are arguing that auto drive is a safety feature because it is safer than an inattentive driver you are basically proving my point that the feature encourages inattentive drivers.


----------



## trh

One thing that AP can do is simultaneously monitor all its sensors and react. If I'm checking my mirrors while driving, it is possible something can happen during that half second I'm looking away that would impact my response time.

I'm not being inattentive, but AP is a safety feature that can help. 

It was 130 million passenger miles using AP before a fatality. The US average is a fatality every 100 million passenger miles. That is a significant improvement.


----------



## dennisj00

I'm no longer arguing opinions; my delivery has been moved to next week.

My sister just had an accident a few weeks ago that the sensors would have prevented. The car was totaled.


----------



## dennisj00

trh said:


> One thing that AP can do is simultaneously monitor all its sensors and react. If I'm checking my mirrors while driving, it is possible something can happen during that half second I'm looking away that would impact my response time.
> 
> I'm not being inattentive, but AP is a safety feature that can help.
> 
> It was 130 million passenger miles using AP before a fatality. The US average is a fatality every 100 million passenger miles. That is a significant improvement.


And that was only 1 manufacturer with 130 million passenger miles!


----------



## dennisj00

James Long said:


> Lane control (alerting the driver that they are drifting - possibly even steering back into the lane) would be a safety feature but auto drive where the car makes lane changes is not a safety feature.


Again, you have no experience with AP. While it does alert on drift and correct, it does not change lanes by itself. You have to initiate a lane change with the turn signal. If it's clear it will change lanes; if not, it doesn't.


----------



## James Long

A video posted by the now-deceased Tesla driver showed his Tesla moving over to the shoulder to avoid an accident. The driver was engrossed in an audio book (and who knows what else off camera) and did not notice a truck attempting to merge into the right-hand lane. Isn't that an automated lane change?


----------



## dennisj00

I would call it more of collision avoidance. We don't know if it returned to the original lane when the obstruction was gone.


----------



## James Long

dennisj00 said:


> I would call it more of collision avoidance. We don't know if it returned to the original lane when the obstruction was gone.


The driver resumed manual control of the vehicle.


----------



## dennisj00

So we don't know if it would have returned to the original lane.

Tesla released today that the Pennsylvania car in the accident disengaged AP 25 seconds before the crash because the driver didn't have his hands on the wheel. I suppose you'll condemn AP because it disengaged - when the driver was obviously in error?


----------



## James Long

My rebuke of Tesla is that the feature was able to be engaged.

I believe they should have withdrawn the feature completely when they discovered people abusing the feature.


----------



## trh

dennisj00 said:


> So we don't know if it would have returned to the original lane.
> 
> Tesla released today that the Pennsylvania car in the accident disengaged AP 25 seconds before the crash because the driver didn't have his hands on the wheel. I suppose you'll condemn AP because it disengaged - when the driver was obviously in error?


I'll be interested in your experiences with AP. I've read that with the fatal accident, the car was going very fast (around 100 mph) - passing a driver who said she was doing 85. But I've also read that when AP is engaged, it is limited to going only 5 mph over the posted speed limit. The truck driver (who is a very biased witness as far as I'm concerned) also said the car was going very fast.


----------



## trh

James Long said:


> My rebuke of Tesla is that the feature was able to be engaged.
> 
> I believe they should have withdrawn the feature completely when they discovered people abusing the feature.


You also make it sound like listening to an audio book was a distraction. Are you advocating disabling all radios in cars?


----------



## dennisj00

James Long said:


> My rebuke of Tesla is that the feature was able to be engaged.
> 
> I believe they should have withdrawn the feature completely when they discovered people abusing the feature.


With this opinion, we'd still be driving horse-drawn buggies.

Again, remove seat belts, some people don't use them.


----------



## James Long

trh said:


> You also make it sound like listening to an audio book was a distraction. Are you advocating disabling all radios in cars?


Why must all comments become arguments of the extremes?

In my opinion, the driver was bored. His car was driving for him and he didn't need to pay attention, so he sought out other options: an audio book, caught on tape in the earlier incident. (That incident would have easily been prevented by an engaged driver - there was not an overwhelming amount of traffic, and the only reason the car had to take evasive action was that the driver wasn't paying attention.) A DVD player was found in the fatal wreck.

Being bored while driving is a sign that one needs to be more engaged, not more distracted.

As long as one's attention remains on the road and is not distracted by the radio, it does no harm. But if one is distracted while driving, it is time to turn off everything extraneous and think about what one is doing - driving.



dennisj00 said:


> With this opinion, we'd still be driving horse-drawn buggies.
> 
> Again, remove seat belts, some people don't use them.


Great ... more insults. Just when I thought the thread was rational again.
Oh well. When you're ready for rational discussion I'll be here.


----------



## dennisj00

trh said:


> I'll be interested in your experiences with AP. I've read that with the fatal accident, the car was going very fast (around 100 mph) - passing a driver who said she was doing 85. But I've also read that when AP is engaged, it is limited to going only 5 mph over the posted speed limit. The truck driver (who is a very biased witness as far as I'm concerned) also said the car was going very fast.


Eyewitnesses are very bad with their estimates.

You can set the AP or the cruise control to any level over the posted limit. There is a default that you can set to +5 or whatever over the posted. I think AP is limited to 90 max. But it also reminds you to keep your hands on the wheel and will stop if you don't (today's announcement from Tesla on the Penn. accident).

Trust me, if I'm doing 90, I'm watching! This is a silly thread / argument. While it was one fatality, and unfortunate, how many people died that same day in auto accidents? How many miles were driven that day by AP?


----------



## James Long

dennisj00 said:


> How many miles were driven that day by AP?


One too many.

BTW: Where was the "Penn." accident? The fatal crash I have been discussing occurred May 7 when Joshua Brown of Canton, Ohio, hit the side of a tractor trailer on U.S. Route-27 in Williston, Fla.

EDIT: Oh, the next accident!

Driver Albert Scaglione had reportedly told police that he activated the automated system before the July 1 rollover accident on the turnpike in Bedford County.

A Tesla representative told CNN Money that the autopilot system disengaged because the driver's hands weren't on the wheel. The driver was instructed to retake the wheel and did so 11 seconds before the crash, according to the company.

"Over 10 seconds and approximately 300m later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier, and rolled the vehicle," Tesla told CNN.

http://www.philly.com/philly/blogs/real-time/Tesla-Autopilot-was-off-in-Pa-Turnpike-crash.html
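For what it's worth, Tesla's own numbers ("over 10 seconds and approximately 300m") imply the car was still at roughly highway speed while under manual control. A quick back-of-the-envelope check (the 300 m and 10 s figures are Tesla's; the unit conversion is standard):

```python
# Implied speed from Tesla's statement: ~300 m covered in ~10 s
# of manual steering before the Pennsylvania crash.
distance_m = 300.0
time_s = 10.0

speed_mps = distance_m / time_s           # 30 m/s
speed_mph = speed_mps * 3600 / 1609.344   # convert m/s to mph

print(f"~{speed_mph:.0f} mph")  # roughly 67 mph
```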


----------



## dennisj00

Honestly, James, I don't get it. You admittedly call the driver bored and say he didn't need to pay attention. You completely excuse his responsibility as a driver and blame it on AP.

I'm a private pilot and flew for a while with an AutoPilot but I didn't ignore what was going on. Nor would I have blamed it on Cessna if I had crashed.

Cell phones and texting are a much bigger problem while driving.


----------



## dennisj00

James Long said:


> One too many.
> 
> BTW: Where was the "Penn." accident? The fatal crash occurred May 7 when Joshua Brown of Canton, Ohio, hit the side of a tractor trailer on U.S. Route-27 in Williston, Fla.
> 
> EDIT: Oh, the next fatal accident!
> http://www.latimes.com/business/autos/la-fi-hy-tesla-autopilot-20160715-snap-story.html


It wasn't fatal.

And I remind you of the report that in 1.3 million miles under AP, airbag deployments were down 50%.

Find another manufacturer with that record.


----------



## James Long

dennisj00 said:


> Honestly, James, I don't get it. You admittedly call the driver bored and say he didn't need to pay attention. You completely excuse his responsibility as a driver and blame it on AP.


That is the problem with Tesla's arrogant approach. Tesla drivers on auto pilot do not need to pay attention. They just need to have enough contact with the car that the car is fooled into detecting "attention".

Disengaging 10-11 seconds before an accident sounds like a good way for Tesla to shirk responsibility. The system was off! It had been off for 10-11 seconds! It isn't our fault!

The Google studies previously reported by Tom highlight the dangers involved. In its current form, Tesla's high-speed autodrive is an unsafe feature.



dennisj00 said:


> It wasn't fatal.


The way you inserted it into the thread I thought it was. We were discussing a fatal accident in Florida. I thought you were talking about that accident when you mentioned the Penn. accident.

I corrected my post with a better reference.


----------



## dennisj00

James Long said:


> That is the problem with Tesla's arrogant approach. *Tesla drivers on auto pilot do not need to pay attention.* They just need to have enough contact with the car that the car is fooled into detecting "attention".
> 
> Disengaging 10-11 seconds before an accident sounds like a good way for Tesla to shirk responsibility. The system was off! It had been off for 10-11 seconds! It isn't our fault!
> 
> The Google studies previously reported by Tom highlight the dangers involved. In its current form, autodrive is an unsafe feature.


Where do you conclude that 'Tesla drivers on auto pilot do not need to pay attention'? Maybe we need every car to alert the driver that if you start this car you could have an accident!! That would save more lives than have been lost in Tesla accidents.

There are YouTube videos of idiots abusing anything you can imagine. So we ban everything?

My advice to you - don't drive your car.


----------



## trh

James Long said:


> In its current form, Tesla's high-speed autodrive is an unsafe feature.


But statistically, with 130 million miles and 1 fatality, the Tesla AP is 30% safer than other cars on US roads.
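A rough sketch of where that figure comes from, using the 130 million AP miles cited here and the "one US fatality per 94 million vehicle miles" average Tesla quoted in its own statement (one fatality is far too small a sample to be statistically meaningful, but the arithmetic is straightforward):

```python
# Fatalities per mile: Tesla AP vs. the US average Tesla cited.
ap_miles_per_fatality = 130e6   # 1 fatality in ~130M Autopilot miles
us_miles_per_fatality = 94e6    # Tesla-quoted US average, all vehicles

ap_rate = 1 / ap_miles_per_fatality
us_rate = 1 / us_miles_per_fatality

reduction = 1 - ap_rate / us_rate
print(f"~{reduction:.0%} lower fatality rate")  # about 28%, i.e. "roughly 30% safer"
```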


----------



## James Long

dennisj00 said:


> Where do you conclude that 'Tesla drivers on auto pilot do not need to pay attention'?


They have given that power to their customers and then act surprised when they use it.
Perhaps Tesla drivers are not mature enough to have auto drive?

Some are ... and I am sure you will be the perfect driver staying alert and in control at all times. Just like you are today. But please be honest enough to let us know if you fall victim to the complacency publicly demonstrated by other users of auto drive.


----------



## dennisj00

James Long said:


> They have given that power to their customers and then act surprised when they use it.
> Perhaps Tesla drivers are not mature enough to have auto drive?
> 
> Some are ... and I am sure you will be the perfect driver staying alert and in control at all times. Just like you are today. But please be honest enough to let us know if you fall victim to the complacency publicly demonstrated by other users of auto drive.


All I can say is generalizations are a very bad thing. We have one driver who lost his life, so that brands every driver and the manufacturer?

My wife and I will put some miles on later this week - we'll keep you posted.


----------



## James Long

dennisj00 said:


> All I can say is generalizations are a very bad thing. We have one driver that lost his life so that brands everyone and the manufacturer?


Unfortunately Mr Brown was not the only one to post videos of his Tesla driving. He is the only one to pay with his life (so far, that we know of) and no non-driver has been killed in a Tesla auto drive connected accident (so far, that we know of). The problem with Tesla auto drive is not limited to one or two drivers.

As I have posted before, Tesla responded to the reports of their drivers "misbehaving". Was their response good enough? I suppose we will see as further accidents are reported.


----------



## James Long

Another accident (July 9th) - Driver's fault for not disengaging Auto Pilot after leaving an interstate:


> The owner, who would only identify himself by his last name -- Pang -- said he's not yet sure whether the accident was the car's fault or his fault. He said he's eager to talk to Tesla and learn why the car swerved off a narrow Montana road.
> 
> Pang was heading from Seattle to Yellowstone National Park when he crashed on a two-lane highway near Cardwell, at 12:30 a.m. Saturday, said Montana State Trooper Jade Shope. "It's a winding road going through a canyon, with no shoulder," Shope told CNNMoney.
> 
> Neither Pang nor his passenger were injured in the accident, but it was serious enough that the car lost its front passenger side wheel. Pang told Shope he was driving between 55 and 60 mph on a road with a 55 mph speed limit. He told CNNMoney he had just gotten off of I-90 and was driving for a couple of minutes on the narrow road right before the accident. The car veered to the right and hit a series of wooden stakes on the side of the road. Both Pang and Tesla confirmed that the car was in Autopilot mode, and that he did not have his hands on the wheel.





> In a statement, Tesla said otherwise. "As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel," said Tesla. "He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway."


[source]

I thought that auto pilot was disabled when not on a divided roadway?

This accident reminds me of the Pennsylvania accident where the transition from car driving to human driving was unsuccessful.


----------



## James Long

*How Tesla and Elon Musk Exaggerated Safety Claims About Autopilot and Cars*
_The autonomous program isn't meant for most types of driving, and the automaker compares its new luxury vehicles to older, cheaper cars._

After years of hype about its autonomous driving system, the facts are coming out about how safe Tesla and Autopilot really are.

Elon Musk's company admits three of its vehicles have crashed while Autopilot was engaged, including one fatal accident in which Joshua Brown's vehicle ran directly into a semi truck. In the wake of Brown's death, Musk claimed Autopilot would save thousands of lives if it was deployed universally today. That bold claim is based on insufficient data and flawed comparisons between Tesla vehicles and all other cars that stack the deck in favor of the company.
...
In the wake of Brown's fatal crash, Tesla's sensor supplier Mobileye clarified that its current technology is not designed to prevent a crash with laterally moving traffic like the turning semi truck Brown's Model S struck. This week, Tesla revealed another Autopilot accident that saw a Model X swerve into wooden stakes going 55 mph in a canyon road.
...
Rather than waiting for LIDAR costs to come down or building in a complex driver alertness monitoring system, Tesla has chosen to blame its faithful beta testers for any problems that pop up in testing. One Tesla owner describes this Catch-22, after being told that a crash was her fault because she turned off Autopilot by hitting the brakes: "So if you don't brake, it's your fault because you weren't paying attention," she told The Wall Street Journal. "And if you do brake, it's your fault because you were driving."

http://www.thedailybeast.com/articles/2016/07/14/why-tesla-s-cars-and-autopilot-aren-t-as-safe-as-elon-musk-claims.html

There are further details about how Tesla's claims are exaggerated in the article.


----------



## dennisj00

The Daily Beast! I'm glad you believe everything you read on the internet.

I'm going out to buy a couple of horses and a buggy.


----------



## trh

One part of the article seems to contradict other articles I've read. The Daily Beast article said:



> Brown clearly believed that Autopilot was "autonomous" and described it as such in the description of a video that Musk shared on Twitter.


Yet the NY Times says this about Brown's understanding of AP:



> Mr. Brown was particularly interested in testing the limits of the Autopilot function, documenting how the vehicle would react in blind spots, going around curves and other more challenging situations.
> 
> "This section in here is going to be very, very difficult for the car to handle," he said in one video, posted in October, as his vehicle rounded a curve. "We're filming this just so you can see scenarios where the car does not do well."
> 
> Mark Vernon, a high school classmate who recalled tinkering with electronics in shop class together, said that his friend showed off the self-driving feature on a recent visit at Mr. Brown's home.
> 
> "He knew the hill that it would give up on, because it couldn't see far enough," Mr. Vernon said. "He knew all the limitations that it would find and he really knew how it was supposed to work."


Source: http://www.nytimes.com/2016/07/02/business/joshua-brown-technology-enthusiast-tested-the-limits-of-his-tesla.html?_r=0


----------



## Tom Robertson

James Long said:


> My rebuke of Tesla is that the feature was able to be engaged.
> 
> I believe they should have withdrawn the feature completely when they discovered people abusing the feature.


Exactly.

The feature relies on human capabilities that are not natural for many, if not most, people--staying as attentive with AP driving as they would if they were driving.

Peace,
Tom


----------



## Tom Robertson

dennisj00 said:


> With this opinion, we'd still be driving horse-drawn buggies.
> 
> Again, remove seat belts, some people don't use them.


Your logic is backward. You might have better traction arguing that a very few people get harmed because they wore seatbelts...

Yet even so, the question is really about human nature. AP relies on humans doing something beyond their normal capability. Such a feature would normally be disabled in any other endeavor.

Peace,
Tom


----------



## Tom Robertson

dennisj00 said:


> The Daily Beast! I'm glad you believe everything you read on the internet.
> 
> I'm going out to buy a couple of horses and a buggy.


So your whole argument is the source is to be questioned?

Ok, let's go to the horse's mouth itself: http://www.streetinsider.com/Corporate+News/Mobileye+(MBLY)+Issues+Statement+on+Fatal+Tesla+(TSLA)+Model+S+Autopilot+Crash/11793789.html



> _"We have read the account of what happened in this case. Today's collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020."_


The manufacturer of the sensor makes it clear--that technology can't see the events that led to Mr. Brown's death. (Thanks to The Daily Beast for citing their source.)

Peace,
Tom


----------



## trh

Mobileye updated that statement.



> Mobileye's Galves later clarified that his original statement wasn't meant to comment on "the capability of the overall system that Tesla has designed," but rather just Mobileye's own capabilities.


There are 12 different systems integrated into AP; one of which is Mobileye.


----------



## James Long

trh said:


> There are 12 different systems integrated into AP; one of which is Mobileye.


Are any of the 12 "Lateral Turn Across Path (LTAP) detection"?


----------



## dennisj00

Tom Robertson said:


> Exactly.
> 
> The feature relies on human capabilities that are not natural for many, if not most, people--staying as attentive with AP driving as they would if they were driving.
> 
> Peace,
> Tom


So, based on the folly of a few, keep the technology advances from everyone?

I don't understand your 'human capabilities that are not natural for many to most people'. Riding a bicycle is not natural to humans, but we put our kids on bikes to learn pretty much on their own. And many people continue to be killed every year riding bicycles.

Flying is not natural to humans, but with a couple of days of ground school and 40 hours of flight time, you too can risk your life (and others') in a plane. Hundreds of people are killed each year depending on the crew up front and an autopilot. And those crews have hours of training required every year.

Was the development of the airplane stopped with the Wright brothers' first casualty? And hundreds of people continue to be killed every year.

Driving is not natural to humans. Yet we put our teenagers in a few hours of class, a few days behind the wheel and turn them loose on the roads. Some states require some additional things like daylight hours only or several months with an adult present but many don't. And many hundreds are killed and kill others on the roads every year.

And don't forget cellphones while driving . . . more technology that should have been abandoned, right?

And don't get me started on guns and gun owners. . .

It's been a busy week and I was amazed to see no negative posts in 3 days! But I did notice that Mercedes is advertising their auto driving - and they even use the word 'autonomous' - in the 2017 E-Class sedans. So you guys will have another target to rail against.

I've always been on the cutting edge with technology and gadgets and with the right training and practice, no blood or injuries. (Edit: I have fallen on my bike a few times with a little blood - and most were my fault.)


----------



## yosoyellobo

Henry Ford would not have been able to get his tin Lizzie on the road with today's mentalities.


P.S. I remember telling him so at the time.


----------



## Tom Robertson

dennisj00 said:


> So, based on the folly of a few, keep the technology advances from everyone?


Nope. That is not at all what I'm saying. (Nor is it "the folly of a few", unless you count Tesla as the few.)



dennisj00 said:


> I don't understand your 'human capabilities that are not natural for many to most people'. Riding a bicycle is not natural to humans, but we put our kids on bikes to learn pretty much on their own. And many people continue to be killed every year riding bicycles.
> 
> Flying is not natural to humans, but with a couple of days of ground school and 40 hours of flight time, you too can risk your life (and others') in a plane. Hundreds of people are killed each year depending on the crew up front and an autopilot. And those crews have hours of training required every year.
> 
> Was the development of the airplane stopped with the Wright brothers' first casualty? And hundreds of people continue to be killed every year.
> 
> Driving is not natural to humans. Yet we put our teenagers in a few hours of class, a few days behind the wheel and turn them loose on the roads. Some states require some additional things like daylight hours only or several months with an adult present but many don't. And many hundreds are killed and kill others on the roads every year.


Clearly I haven't explained my position to where you understand it.

Driving today, flying, bicycles, etc. are all examples of technologies that adapted themselves to human capabilities. No longer do cars have fixed seats and only hand controls--drivers (and pilots) adjust the seat to their capabilities as a human--arm length, height, etc. Some cars let you adjust the steering wheel to fit the driver. Some cars let you adjust the driver's seat up and down, as well as tilt, forward/back, etc.

As the driver you are not expected to do things beyond your human capability--they've added power steering, power brakes, etc. You don't have to be a muscleman to drive a car. You don't have to be right or left handed--anyone can drive.

Driving itself is not natural, yet it is within the realm of capabilities of the bulk of humans.

My point is "paying close attention" when someone or something else is driving is not within the capabilities of most humans. Most people can't sit and watch paint dry without zoning out. Perhaps you can watch attentively for long periods of time; I'm sure some people can. But my belief is most can't.

Even if I'm wrong on the most/many/lots scale, I contend there are many enough that can't pay attention (far more than a folly of few), that any safety system that relies on normal people to stay awake is itself a folly. Look at how many people can't pay attention when they are the one driving. Do you think more of them will be able to pay attention when they aren't driving?

I think you also have a wrong impression of where I stand on the technology as a promise. I'm very actively in favor of the autonomous driving technology. The bulk of my posts are highly in favor of what will come.

What I do not want are wide scale beta tests of preliminary technology. Can you imagine the Wright Brothers plane in the hands of all families? No; it was not ready.

Likewise semi-autonomous driving is not ready for beta testing for the many. Absolutely, keep developing it. Absolutely, make it able to be ready for most people. But don't put it in the hands of people until it is ready.

Peace,
Tom


----------



## dennisj00

Tom Robertson said:


> Nope. That is not at all what I'm saying. (Nor is it "the folly of a few", unless you count Tesla as the few.)
> 
> Clearly I haven't explained my position to where you understand it.
> 
> Driving today, flying, bicycles, etc. are all examples of technologies that adapted themselves to human capabilities. No longer do cars have fixed seats and only hand controls--drivers (and pilots) adjust the seat to their capabilities as a human--arm length, height, etc. Some cars let you adjust the steering wheel to fit the driver. Some cars let you adjust the driver's seat up and down, as well as tilt, forward/back, etc.
> 
> As the driver you are not expected to do things beyond your human capability--they've added power steering, power brakes, etc. You don't have to be a muscleman to drive a car. You don't have to be right or left handed--anyone can drive.
> 
> Driving itself is not natural, yet it is within the realm of capabilities of the bulk of humans.
> 
> My point is "paying close attention" when someone or something else is driving is not within the capabilities of most humans. Most people can't sit and watch paint dry without zoning out. Perhaps you can watch attentively for long periods of time; I'm sure some people can. But my belief is most can't.
> 
> Even if I'm wrong on the most/many/lots scale, I contend there are many enough that can't pay attention (far more than a folly of few), that any safety system that relies on normal people to stay awake is itself a folly. Look at how many people can't pay attention when they are the one driving. Do you think more of them will be able to pay attention when they aren't driving?
> 
> I think you also have a wrong impression of where I stand on the technology as a promise. I'm very actively in favor of the autonomous driving technology. The bulk of my posts are highly in favor of what will come.
> 
> What I do not want are wide scale beta tests of preliminary technology. Can you imagine the Wright Brothers plane in the hands of all families? No; it was not ready.
> 
> Likewise semi-autonomous driving is not ready for beta testing for the many. Absolutely, keep developing it. Absolutely, make it able to be ready for most people. But don't put it in the hands of people until it is ready.
> 
> Peace,
> Tom


This pains me but . . . if I can adjust my seat, get more comfortable, have power steering, it's a problem and causes me to pay less attention to my driving?

I've used cruise control since 1972 as a third party add on and I've never rear-ended a car because I haven't paid attention.

So 'paying close attention' becomes secondary? Maybe in your world but maybe your drivers need more training.

If you're not beta testing it, why worry about it? You have a better chance of slipping in the shower and dying.


----------



## Tom Robertson

yosoyellobo said:


> Henry Ford would not have been able to get his tin Lizzie on the road with today's mentalities.
> 
> Ps I remember telling him so at that time.


Tin Lizzies couldn't do 100 mph until they put in some basic safety features... 

Besides, if you include me in that list of "today's mentalities", I think my overall posts, typically highly in favor of a well developed autonomous car speak for themselves.

Peace,
Tom


----------



## Tom Robertson

dennisj00 said:


> This pains me but . . . if I can adjust my seat, get more comfortable, have power steering, it's a problem and causes me to pay less attention to my driving?
> 
> I've used cruise control since 1972 as a third party add on and I've never rear-ended a car because I haven't paid attention.
> 
> So 'paying close attention' becomes secondary? Maybe in your world but maybe your drivers need more training.
> 
> If you're not beta testing it, why worry about it? You have a better chance of slipping in the shower and dying.


There is a big, big difference between comfort and safety. My wife might be able to drive a car set for my son. But she can't do it safely. She would be sliding down in the seat to reach the controls.

Each of us has a level of ability when it comes to paying attention. It is not an all or nothing thing, it is a scale including many shades of gray between black and white.

Maybe you personally have the ability to pay enough attention to a semi-autonomous car after 10,000 miles, hundreds of trips. I suspect there are people like that. I can't. I want to read, watch TV, sleep, something. I can't watch paint dry; I can't pay attention when someone else is driving. I'm not a safety system for my car. It is a safety system for me.

My belief, based on how poorly people drive today when they are forced to engage, is that given the opportunity to disengage because of a semi-autonomous driver, they won't pay attention. Google proved that in their experiments. Why can't you learn from their experiences?

As for "if I'm not beta testing it..." Isn't everyone on the road with Teslas part of the overall beta test? If they have an accident, is the driver the only one affected? Of course not. Tesla is negligent in their beta testing of something that isn't close to ready.

Peace,
Tom


----------



## dennisj00

Tom Robertson said:


> There is a big, big difference between comfort and safety. My wife might be able to drive a car set for my son. But she can't do it safely. She would be sliding down in the seat to reach the controls.
> 
> Each of us has a level of ability when it comes to paying attention. It is not an all or nothing thing, it is a scale including many shades of gray between black and white.
> 
> Maybe you personally have the ability to pay enough attention to a semi-autonomous car after 10,000 miles, hundreds of trips. I suspect there are people like that. I can't. I want to read, watch TV, sleep, something. I can't watch paint dry; I can't pay attention when someone else is driving. I'm not a safety system for my car. It is a safety system for me.
> 
> My belief, based on how poorly people drive today when they are forced to engage, is that given the opportunity to disengage because of a semi-autonomous driver, they won't pay attention. Google proved that in their experiments. Why can't you learn from their experiences?
> 
> As for "if I'm not beta testing it..." Isn't everyone on the road with Teslas part of the overall beta test? If they have an accident, is the driver the only one affected? Of course not. Tesla is negligent in their beta testing of something that isn't close to ready.
> 
> Peace,
> Tom


Tom,

There have been many cars in the last 10 years that set the seat and controls to the individual driver. Teslas also do this. It's called driver profile.

I don't know why you have such a low expectation of drivers in general. You should be worried about DUIs and cell-phone distraction. It's a much bigger percentage.


----------



## Tom Robertson

dennisj00 said:


> Tom,
> 
> There have been many cars in the last 10 years that set the seat and controls to the individual driver. Teslas also do this. It's called driver profile.
> 
> I don't know why you have such a low expectation of drivers in general. You should be worried about DUIs and cell-phone distraction. It's a much bigger percentage.


Of course cars can adjust to the driver. That is called adapting to the human capabilities for reach, height, etc. And is an example in a discussion.

Likewise, I don't see how you think people who are already distracted by phones, movies, etc. are going to be attentive when they don't even have to drive. Surely you don't think they will be more attentive.

Let's see how attentive you are after a couple thousand miles of AP.

Peace,
Tom


----------



## James Long

dennisj00 said:


> So 'paying close attention' becomes secondary? Maybe in your world but maybe your drivers need more training.


I have not checked to see if there has been another reported accident. Most of the ones discussed in this thread have a common point of failure - the transition from auto drive to manual control. That transition does not exist on cars without auto drive.

Drivers can lose attention in a manual drive car ... but those moments are always dangerous. Think of them as putting your hand on a hot burner on the stove. It is always a bad idea. But in an auto drive car a false sense of security is created. You can lose attention without suffering the consequences - most of the time. You put your hand on the burner and you don't get burned ... and after a while the human learns that the car will save them. Until it doesn't.



dennisj00 said:


> If you're not beta testing it, why worry about it? You have a better chance of slipping in the shower and dying.


So far nobody outside of a Tesla has been killed in an autodrive crash. When that unfortunate event eventually happens the discussion won't be about beta testers risking their own lives.


----------



## billsharpe

I try to avoid beta testing computer programs because of the possibility of the computer crashing.

Personally I will avoid beta testing "autonomous" cars for the same reason.


----------



## trh

James Long said:


> So far nobody outside of a Tesla has been killed in an autodrive crash. When that unfortunate event eventually happens the discussion won't be about beta testers risking their own lives.


And so far the only person killed driving a Tesla in AP died because a truck driver failed to yield.


----------



## inkahauts

James Long said:


> My rebuke of Tesla is that the feature was able to be engaged.
> 
> I believe they should have withdrawn the feature completely when they discovered people abusing the feature.


Ok well they should rip out all corvette engines and replace them with Honda Civic engines too then. That's the exact same argument: stating what should happen because of what one person may have done, when the vast majority probably don't.

A few people abusing a system is not a reason to yank it if it can help with safety overall.


----------



## inkahauts

Tom Robertson said:


> Exactly.
> 
> The feature relies on human capabilities that are not natural for many, if not most, people--staying as attentive with AP driving as they would if they were driving.
> 
> Peace,
> Tom


Driving a car in general isn't natural at all. It's a learned skill. Same goes for Tesla Autopilot.


----------



## inkahauts

Tom Robertson said:


> Your logic is backward. You might have better traction arguing that a very few people get harmed because they wore seatbelts...
> 
> Yet even so, the question is really about human nature. AP relies on humans doing something beyond their normal capability. Such a feature would normally be disabled in any other endeavor.
> 
> Peace,
> Tom


I have a grandmother who absolutely refuses to wear a lap seat belt in the middle seat of a car. She says she's heard of way too many people who have gotten very severe injuries from them in an accident. She says she hates the regular ones too because of all the shoulder injuries they cause.

So evidently there are plenty of studies according to her.

Truth be told, none of our arguments can be fully certain and backed by any real facts until these cars have been on the road for ten-plus years and we can see some real longevity statistics.

The difference is some are afraid it's too early and others aren't.

I think it's time. I also think they should maybe do a little marketing rewording and teach buyers a little more, simply because of the optics from the naysayers more than anything. I haven't bought one so I don't know what they do now. But someone here is about to get one, so maybe he can give us a detailed explanation of what he's told when he picks up his car.


----------



## inkahauts

James Long said:


> I have not checked to see if there has been another reported accident. Most of the ones discussed in this thread have a common point of failure - the transition from auto drive to manual control. That transition does not exist on cars without auto drive.
> 
> Drivers can lose attention in a manual drive car ... but those moments are always dangerous. Think of them as putting your hand on a hot burner on the stove. It is always a bad idea. But in an auto drive car a false sense of security is created. You can lose attention without suffering the consequences - most of the time. You put your hand on the burner and you don't get burned ... and after a while the human learns that the car will save them. Until it doesn't.
> 
> So far nobody outside of a Tesla has been killed in an autodrive crash. When that unfortunate event eventually happens the discussion won't be about beta testers risking their own lives.


The concept of transition from autodrive to human drive has been around for ages. It's always been there for cruise control.

I'd like to hear how the transition can work in a Tesla.


----------



## James Long

trh said:


> And so far the only person killed driving a Tesla in AP was because a truck driver failed to yield.


Call NHTSA and tell them to cancel their inquiry. I doubt that they will since they care about ALL the factors involved in the incident.



inkahauts said:


> Ok well they should rip out all corvette engines and replace them with Honda Civic engines too then.


It wouldn't be the Internet without people introducing an extreme unrelated argument. The discussion of overpowered cars would be good for another thread. It is just a strawman distraction in this thread.



inkahauts said:


> A few people abusing a system is not a reason to yank it if it can help with safety overall.


Auto drive is not a safety feature unless the driver is distracted and would crash without the feature. Anyone who calls it a safety feature is agreeing that the feature encourages people that do not pay attention to their driving to (over time) pay less attention.



inkahauts said:


> The concept of a transition from auto drive to human drive has been around for ages. It's always been there for crisis control.


Ages? You make it sound like autodrive has been around for a long time. It hasn't. The closest we have had is cruise control where a speed is set and cancelled by the driver with the driver controlling when and where activation and deactivation takes place. Newer cars with collision avoidance and automatic braking may brake and take the vehicle out of cruise control ... but even those features are relatively new.

What "autodrive" are you claiming has been around for ages? How many years?



inkahauts said:


> I'd like to hear how the transition can work in a tesla.


Read the accident reports in this thread ... that is how the transition CAN and has occurred. Tesla claims that the cars warned the drivers but the drivers do not recall a warning. When the Tesla gave up driving the drivers were not ready - the cars veered off the road. Whether that veering was done by the Tesla under control or a natural drift because nobody was driving is a matter of opinion. Tesla claims that autodrive was off at the moments of impact - disavowing responsibility. (Even though autodrive was on seconds before the impacts and certainly played a part in how the car was being driven.)

The shell game described by a Tesla owner:


> One Tesla owner describes this Catch-22, after being told that a crash was her fault because she turned off Autopilot by hitting the brakes: "So if you don't brake, it's your fault because you weren't paying attention," she told The Wall Street Journal. "And if you do brake, it's your fault because you were driving."


----------



## Tom Robertson

inkahauts said:


> Ok well they should rip out all corvette engines and replace them with Honda Civic engines too then. That's the exact same argument. Stating what should happen because of what one person may have done but the vast majority probably don't.
> 
> A few people abusing a system is not a reason to yank it if it can help with safety overall.


Your logic is backward. Driving recklessly with a Corvette's power is a pro-active process. You are controlling the action to abuse the rules.

AP is a passive condition. You can activate it and go to sleep. There is no need for conscious, active decision to drive recklessly by using AP.

Peace,
Tom


----------



## Tom Robertson

inkahauts said:


> Driving a car in general isn't natural at all. It's a learned skill. Same goes for Tesla Auto Pilot.


There is another difference here. Driving isn't natural but the car is engineered to within human capacities. The seat can be set so your arms reach the steering wheel. The mechanics are arranged so a normal person using normal strength can turn the wheel.

What I am saying is the safety of AP relies on something that is not within normal human capability. To be attentive while watching paint dry. Thus the feature is reckless and negligently designed.

How few people pay attention now, when they are actively engaged? Do you think they will be more attentive when they aren't so actively engaged in the driving?

Peace,
Tom


----------



## inkahauts

James Long said:


> Call NHTSA and tell them to cancel their inqu
> 
> Auto drive is not a safety feature unless the driver is distracted and would crash without the feature. Anyone who calls it a safety feature is agreeing that the feature encourages people that do not pay attention to their driving to (over time) pay less attention.
> 
> What "autodrive" are you claiming has been around for ages? How many years?


How about the guy that has a heart attack? Would you consider it a safety feature then? Or how about, and this is a huge one, making sure the car leaves enough distance between vehicles to stop quickly if needed, and its ability to hit the brakes or swerve in reaction much faster than an average human to avoid a sudden problem that a human doesn't see coming?

And I'm very much talking about cruise control. That's in the exact same realm as auto pilot. The latest dynamic cruise controls are much more safety-oriented than the original cruise control too.


----------



## inkahauts

Tom Robertson said:


> Your logic is backward. Driving recklessly with a Corvette's power is a pro-active process. You are controlling the action to abuse the rules.
> 
> AP is a passive condition. You can activate it and go to sleep. There is no need for conscious, active decision to drive recklessly by using AP.
> 
> Peace,
> Tom


Choosing to not pay attention is not proactively doing something you aren't supposed to do? Since when?


----------



## Tom Robertson

inkahauts said:


> Choosing to not pay attention is not proactively doing something you aren't supposed to do? Since when?


"Choosing to not pay attention?" How about falling asleep. That isn't choosing, that is natural. 

The proactive is to pay attention. The passive is to stop paying attention. That is the difference between driving a Corvette (active) vs. letting the Corvette sit (passive).

Peace,
Tom


----------



## Tom Robertson

inkahauts said:


> How about the guy that has a heart attack? Would you consider it a safety feature then? Or how about, and this is a huge one, making sure the car leaves enough distance between vehicles to stop quickly if needed, and its ability to hit the brakes or swerve in reaction much faster than an average human to avoid a sudden problem that a human doesn't see coming?
> 
> And I'm very much talking about cruise control. That's in the exact same realm as auto pilot. The latest dynamic cruise controls are much more safety-oriented than the original cruise control too.


When was the last time cruise control alerted you to take over? 

Peace,
Tom


----------



## inkahauts

Tom Robertson said:


> There is another difference here. Driving isn't natural but the car is engineered to within human capacities. The seat can be set so your arms reach the steering wheel. The mechanics are arranged so a normal person using normal strength can turn the wheel.
> 
> What I am saying is the safety of AP relies on something that is not within normal human capability. To be attentive while watching paint dry. Thus the feature is reckless and negligently designed.
> 
> How few people pay attention now, when they are actively engaged? Do you think they will be more attentive when they aren't so actively engaged in the driving?
> 
> Peace,
> Tom


I really don't get how paying attention to what the car is doing while on auto pilot is like watching paint dry. You have never been in a car with a backseat driver, I guess. Someone who isn't even driving but warns you about everything on the road. I just don't buy into this paint-drying argument. Constant movement in front of you along the road is not paint drying.

Actually, on an aside... but still a bit on topic, this is pretty funny. My cousin's kids found it because they say she is her husband's little helper... she agrees too...


----------



## inkahauts

Tom Robertson said:


> When was the last time cruise control alerted you to take over?
> 
> Peace,
> Tom


Last year. I was in a car with dynamic cruise control. It disengaged because a semi pulled out in front of me from the right lane while driving down a freeway. It couldn't guarantee spacing. It's only happened the once, but I was paying attention as I'm supposed to, even though the car is driving itself speed-wise. Wasn't an issue for me. It did what I felt it should have.


----------



## inkahauts

Tom Robertson said:


> "Choosing to not pay attention?" How about falling asleep. That isn't choosing, that is natural.
> 
> The proactive is to pay attention. The passive is to stop paying attention. That is the difference between driving a corvette (active) vs. letting the corvette sit (passive.)
> 
> Peace,
> Tom


If you are that tired you are supposed to pull over and stop. No different than if your car isn't self-driving. You know if you are too tired to drive without auto pilot. That shouldn't change with it on either, since you are told you must still pay attention. Again, that's the driver choosing to be stupid and drive when they shouldn't be on the road at all, and they know it. Sleepiness doesn't just come out of nowhere.


----------



## dennisj00

Tom Robertson said:


> Your logic is backward. Driving recklessly with a Corvette's power is a pro-active process. You are controlling the action to abuse the rules.
> 
> AP is a passive condition. *You can activate it and go to sleep*. There is no need for conscious, active decision to drive recklessly by using AP.
> 
> Peace,
> Tom


Again, NO experience with AP. I invite you guys to go and take a test drive, it's free. There are many warnings / alerts to keep your hands on the wheel and probably keep you more alert than any conventional cruise control.

Try it before you publish your opinions. Opinions are cheap.


----------



## James Long

inkahauts said:


> How about the guy that has a heart attack? Would you consider it a safety feature then?


Have you been reading? A "correctly operating" Tesla would detect that the driver was no longer controlling the vehicle (no hands detected on the steering wheel), give a warning to the driver and then stop driving - leaving the car with no driver, human or automatic, traveling at whatever direction and speed it had until it crashes.

If the vehicle was programmed to activate hazard lights and make a safe lane change to the shoulder then stop without causing an accident perhaps one could call it a safety feature. But that is not what the Tesla is programmed to do. The Tesla gives up and lets the car crash. Perhaps a few seconds later than if auto pilot were off, but a crash is a crash. Is that your idea of safe?



inkahauts said:


> And I'm very much talking about cruise control. That's in the exact same realm as auto pilot. The latest dynamic cruise controls are much more safety featured than the original cruise control too..


Cruise control is not auto pilot. One must still maintain lane control oneself. Without Tesla style auto pilot the best one can hope for is working lane drift detection. Speed controls that prevent following too close and slow the vehicle to the proper following distance are relatively new.

The more driving one takes out of the hands of the driver the less the driver has to pay attention to the road.


----------



## Rich

dennisj00 said:


> _*Again, NO experience with AP.*_ I invite you guys to go and take a test drive, it's free. There are many warnings / alerts to keep your hands on the wheel and probably keep you more alert than any conventional cruise control.
> 
> Try it before you publish your opinions. Opinions are cheap.


This is getting like the 4K debates. Not having a 4K set and arguing about the positive and negative aspects of the 4Ks has become a frequent annoyance, much like your experiences with your new Tesla and this thread. I understand your frustration. Your experiences with the Tesla will be valued. Keep on truckin'.

Rich


----------



## Rich

Tom Robertson said:


> "Choosing to not pay attention?" _*How about falling asleep. That isn't choosing, that is natural.*_
> 
> The proactive is to pay attention. The passive is to stop paying attention. That is the difference between driving a corvette (active) vs. letting the corvette sit (passive.)
> 
> Peace,
> Tom


Hmm. That's rather unsettling...

Rich


----------



## dennisj00

Interesting comments from Clark Howard, the usually frugal financial (?) advisor from Atlanta. . . http://www.clark.com/clarks-take-tesla-autopilot

He has over a year experience with AP and like me, very interested in participating in the development.

I've had my Model S for 26 hours and WOW, what a car. Another 50 miles or so on cruise and AP. I still say it's like getting used to someone in the driver's seat as a passenger. I never liked being in the passenger seat with most any driver, but after a while I get used to their technique.

It's absolutely the safest car I've ever owned (or the bank has owned!)


----------



## James Long

dennisj00 said:


> It's absolutely the safest car I've ever owned (or the bank has owned!)


I do not know what you have owned before ... but even without autodrive cars are getting safer each year. (The driver death rate for all 2011-2010-2009 model year vehicles is 28 per million registered vehicle years. The driver death rate for all 2008-2007-2006 model year vehicles is 48 per million registered vehicle years. [source])


----------



## James Long

The preliminary report for the fatal Florida crash:
View attachment HWY16FH018-Preliminary-Report.pdf

http://www.ntsb.gov/investigations/AccidentReports/Pages/HWY16FH018-preliminary.aspx

"Tesla system performance data downloaded from the car indicated that vehicle speed just prior to impact was 74 mph. System performance data also revealed that the driver was operating the car using the advanced driver assistance features Traffic-Aware Cruise Control and Autosteer lane keeping assistance. The car was also equipped with automatic emergency braking that is designed to automatically apply the brakes to reduce the severity of or assist in avoiding frontal collisions."

The highway was posted 65 MPH.

"All aspects of the crash remain under investigation. The Florida Highway Patrol and Tesla Motors are parties to the ongoing investigation."

GoogleMaps link for the intersection. It looks like there is a hill crest off to the west from where the Tesla approached.
https://www.google.com/maps/place/US-27+ALT,+Williston,+FL+32696/@29.4105962,-82.5397624,3a,60y,304h,90t/data=!3m6!1e1!3m4!1s4dT1SiBOgr7L9DhfmfHADw!2e0!7i13312!8i6656!4m5!3m4!1s0x88e925b43927a579:0x938a757ee1ceaf59!8m2!3d29.4055975!4d-82.5219704


----------



## trh

Thanks for posting that. Now we at least have the speed at impact.


----------



## dennisj00

James Long said:


> I do not know what you have owned before ... but even without autodrive cars are getting safer each year. (The driver death rate for all 2011-2010-2009 model year vehicles is 28 per million registered vehicle years. The driver death rate for all 2008-2007-2006 model year vehicles is 48 per million registered vehicle years. [source])


So your point is?? Currently on 130 million miles - 1 death? And it's 2016 not 2008-2006.


----------



## James Long

dennisj00 said:


> So your point is?? Currently on 130 million miles - 1 death? And it's 2016 not 2008-2006.


The point is (overall) a 2011 car is safer than a 2008 car. I stated my point in my post ... cars get safer every year. (Perhaps knee jerk reactions are easier than reading the posts you respond to?)

Numbers for newer model years will be released when they are statistically significant. The numbers Tesla has released are not yet statistically significant. (The numbers from 2011 were based on 62.9 million registered vehicle years. The numbers from 2008 were based on 65 million registered vehicle years.)
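The "not yet statistically significant" point can be illustrated with a quick back-of-the-envelope sketch. The ~1.1 deaths per 100 million vehicle miles US baseline is an approximate outside figure, and the Poisson interval bounds are standard tabulated values, not anything from this thread:

```python
# Observed: 1 fatality in ~130 million Autopilot miles (Tesla's figure).
ap_miles = 130e6
ap_deaths = 1

rate_per_100m = ap_deaths / ap_miles * 1e8  # deaths per 100M miles

# Exact 95% Poisson confidence bounds for an observed count of 1
# (standard tabulated values: 0.0253 to 5.572 expected events).
ci_low = 0.0253 / ap_miles * 1e8
ci_high = 5.572 / ap_miles * 1e8

us_avg = 1.1  # approx. US deaths per 100M vehicle miles, all cars

print(f"Autopilot rate: {rate_per_100m:.2f} per 100M miles")
print(f"95% CI: {ci_low:.3f} to {ci_high:.2f} per 100M miles")
print("Baseline falls inside CI:", ci_low < us_avg < ci_high)
```

With a single event the interval easily spans the national average, so the 130-million-mile figure alone can't show Autopilot is safer or less safe than average driving.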


----------



## dennisj00

Post again when you can compare 130 MILLION miles under AP.


----------



## James Long

dennisj00 said:


> Post again when you can compare 130 MILLION miles under AP.


I will post as needed. I expect one of my posts will be about Tesla's next fatal accident and it won't be long until the fatality is not in the Tesla.

Drive safe ... and don't fall in to the trap Clark Howard reported:
"In his personal experience, Clark admits that it's easy to become lackadaisical when using the Autopilot feature because it works so well."
http://www.clark.com/clarks-take-tesla-autopilot

Unfortunately Mr Howard is not just "taking his chances" with his own health and safety. He is on public roads.


----------



## Tom Robertson

dennisj00 said:


> Post again when you can compare 130 MILLION miles under AP.


How about this for a comparison: https://forums.tesla.com/forum/forums/autopilot-safety-and-epidemiological-studies

We need well formed data, not cherry picked marketeer data. And then the data needs to be compared correctly against the proper cohorts. 130 million highway miles, in perfect good driving conditions vs. a similar dataset of other cars--highway miles, perfect good driving conditions, etc.

Peace,
Tom

Edit: decided I over stated the driving conditions. While AP won't be used in poor driving conditions, they needn't be perfect.


----------



## James Long

Tom Robertson said:


> ... AP won't be used in poor driving conditions ...


I wouldn't bet on that. I have seen YouTube.


----------



## Tom Robertson

James Long said:


> I wouldn't bet on that. I have seen YouTube.


True, though I understand AP shuts itself down. 

Peace,
Tom


----------



## dennisj00

Eight days and a few hours, 400+ miles and it's a wonderful car! Certainly the safest car available in the US.

(My opinion so you can post your opinion).

In those 400 miles, they've been mostly under normal driving and the TACC (Traffic Aware Cruise Control) and a few under AP.

Getting used to the car in general, as with any different car (that's mine instead of someone else's or a rental), I've been very careful. But the TACC is incredible. You have to be moving more than 18 mph to engage it, but it follows the traffic ahead - particularly stop and go - very well. Very useful in stop-and-go congestion.

It follows at the spacing you've set - 1 to 7 car lengths - and at the posted speed or the speed you've selected, but reduced if necessary to follow, and brakes and stops behind the car in front very nicely and starts up when the chain moves.
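The follow-at-set-spacing behavior described above can be sketched as a simple proportional controller. The gain, units, and function name here are invented for illustration; Tesla's actual TACC control law is not public:

```python
def tacc_target_speed(set_speed, lead_speed, gap, desired_gap, kp=0.4):
    """Return a target speed (mph) that holds the set speed on open road
    but matches the lead car while closing toward desired_gap (meters).

    kp is a made-up proportional gain, not Tesla's.
    """
    if lead_speed is None:  # no car ahead: just hold the set speed
        return set_speed
    # Match the lead car, nudged by how far the gap is from desired.
    target = lead_speed + kp * (gap - desired_gap)
    return max(0.0, min(set_speed, target))

# Plenty of gap behind a slower car: cruise at the set speed.
print(tacc_target_speed(70, 60, 100, 30))  # 70.0
# Too close to a slower car: slow below the lead's speed to open the gap.
print(tacc_target_speed(70, 60, 20, 30))   # 56.0
# Stopped car close ahead: command a full stop.
print(tacc_target_speed(70, 0, 5, 30))     # 0.0
```

The clamp to `[0, set_speed]` is what produces the "reduced if necessary to follow ... comes to a complete stop" behavior in the post.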

AP does a great job for the miles I've used it. I'll wait until some longer trips to review it.

It's the most fun car I've driven and that's included 911s, Boxsters and Miatas, some I've owned and others not. Stopping at the SuperChargers (free) is also a social event!

So far, I've spent about $7.00 charging at home for the 400+ miles.


----------



## Tom Robertson

This doesn't sound good, Chris Urmson (and 2 others) are leaving Google: http://www.recode.net/2016/8/5/12388496/tech-lead-of-google-s-self-driving-car-project-is-leaving

It could be as simple as the transition from R&D project to actually building cars being too big a culture shift; perhaps it's not bad news at all.

And it is not uncommon for execs to leave to start up a similar project.

But it sure doesn't sound good at this time.

Peace,
Tom


----------



## inkahauts

Actually I see it as a positive for several reasons. For one it means you have to bring in some new blood and sometimes new blood can fix or update things faster than someone who's been staring at the same thing for years.

Secondly I bet that at least one of these if not all three of them end up starting a new company doing similar things. To me the more companies that start building automated cars the better.

That will drive everyone to work harder at it and to be one of the first ones out there with a solid product.

I wonder if we'd see the BMW electric car if it wasn't for Tesla. And all the other high-end and low-end ones coming and announced over the next five years. Once you have a groundswell you get more pressure to excel and more people trying to.

It does seem a little odd they have lost so many people so quickly but I have to ponder what the underlying reasons could be for that.


----------



## Tom Robertson

inkahauts said:


> Actually I see it as a positive for several reasons. For one it means you have to bring in some new blood and sometimes new blood can fix or update things faster than someone who's been staring at the same thing for years.
> 
> Secondly I bet that at least one of these if not all three of them end up starting a new company doing similar things. To me the more companies that start building automated cars the better.
> 
> That will drive everyone to work harder at it and to be one of the first ones out there with a solid product.
> 
> I wonder if we'd see the BMW electric car if it wasn't for tesla. And all the other high end ones and low end ones coming and announced over the next five years. Once you have a ground swell you get more pressure to excel and more people trying to.
> 
> It does seem a little odd they have lost so many people so quickly but I have to ponder what the underlying reasons could be for that.


Yeah, I do see the huge potential gain for the industry overall. A couple of the execs leaving Google have announced they are sticking with self-driving vehicles in their new startup. I should have more clearly limited my concerns to the Google aspects, as the overall industry is still exciting. 

And even within Google, the "problems" could be positive growing pains as they shift from R&D to manufacturing. Growing pains can be very good. 

Peace,
Tom


----------



## dennisj00

Almost at 1000 miles in almost 3 weeks. Took our first 'road trip' this week just to check out the range predictions and actual mileage.

Left the Charlotte SuperCharger with 190 miles of range and a prediction of 50 miles reserve when we arrived at the Asheville SuperCharger. I was amazed that no matter if I was doing 60, 70 or 80 mph, it still tracked very accurately. As I changed routes the reserve went to 60, and we actually arrived at the Asheville SC with 63 miles in reserve. 128 miles at 278 Wh/mile.

That's 4.3 CENTS per mile if I were paying for it. But it's FREE!

Returned after charging for 42 minutes - walking through the outlet mall during charging - on I-40 back home . . . 122 miles at 259 Wh/mile! There's about an 8-mile downhill on Black Mountain where I saw the range increase by 6 miles!

Other than lunch, cost of the trip, $0.00.

The TACC is phenomenal, and reacts much better than most drivers. Whether at 70 mph or stop and go driving, it keeps the distance ahead that you set and comes to a complete stop behind the car ahead and starts back as necessary. It's really good in stop and go driving. I drove for many miles without touching either pedal.

While I did 30 or so miles in both legs with AP on, I like the TACC better with me steering. The AP tends to favor the middle of the road while I like to favor one side or the other, particularly when there's a car or truck on either side. It would be nice to be able to fine tune it to the right or left of the lane, however overall, it does a great job. With more software tweaks, it can only get better.

While I've been an EV driver for almost 5 years with the Leafs, there was always a bigger drain on the mileage when above 50 miles per hour. That hasn't been the case with the Model S even with miles up to 80 mph!! I've been amazed at how accurate the range / tracking has been.

In the first 1000 miles, I've charged at home for about $12.00.
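The quoted per-mile figure checks out arithmetically. Here's a minimal sketch; the ~$0.155/kWh utility rate is an assumption inferred from the 4.3-cent figure, not something stated in the post:

```python
def cost_per_mile(wh_per_mile, usd_per_kwh):
    """Energy cost per mile in dollars: Wh/mile -> kWh/mile -> dollars."""
    return wh_per_mile / 1000.0 * usd_per_kwh

# 278 Wh/mile at an assumed ~$0.155/kWh residential rate
print(f"{cost_per_mile(278, 0.155) * 100:.1f} cents/mile")  # 4.3 cents/mile
```

At 259 Wh/mile (the return leg) the same assumed rate gives about 4.0 cents per mile; any Supercharger miles are free to the driver, which is why the trip total stays near zero.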


----------



## inkahauts

Sounds pretty good so far to me!


----------



## trh

When you charge at a SuperCharger, who is paying for it? Tesla?


----------



## dennisj00

trh said:


> When you charge at a SuperCharger, who is paying for it? Tesla?


Yes. Some are adding solar panels.

And the $12 I mention isn't discounted by my solar panels.


----------



## inkahauts

trh said:


> When you charge at a SuperCharger, who is paying for it? Tesla?


Yeah, SuperChargers are free to Model X and Model S owners. Evidently the Model 3 will charge per use, or a one-time extra fee for lifetime access.

I've never heard whether the Roadster can use them.


----------



## Drucifer

A smart vehicle will work faultlessly on roads designed for it.

With current tech, getting a smart vehicle to work without any errors on normal roads is an impossibility.

That is a fact of life.

The industry needs to do a lot more research on how we can make today's roads smart-vehicle friendly.


----------



## James Long

Lost in translation? Perhaps. But there was another Tesla autodrive related accident, this one in China.


> Zhen blames Tesla's Beijing store for the accident, telling the media that he was told by the store's staff that the car is "self-driving." Zhen indicated that he had been using Autopilot for more than a month and, at the time, had a dashcam running that recorded the crash.
> 
> "The impression they give everyone is that this is self-driving; this isn't assisted driving," the driver told reporters from Reuters.
> 
> *Tesla's website in China advertises the Model S using the term "zidong jiashi," which translates to "self-driving,"* Reuters notes, though the automaker disputes that it promotes its cars and the Autopilot feature in such a manner.


Emphasis added.
http://autoweek.com/article/technology/driver-china-autopilot-crash-blames-teslas-self-driving-pitch

The driver admits to being distracted ... looking down at his cell phone or navigation and looking up every few seconds. A comfort level with autopilot that I am concerned will become more common.

Here is a video of the story including the crash ...





The driver claimed that during the demonstration test drive the sales person took their hands off of the steering wheel and took their feet off of the accelerator and brake. So perhaps the sales team needs to learn how to drive the cars they are selling.


> Interviews with four other unconnected Tesla drivers in Beijing, Shanghai and Guangzhou also indicated the message conveyed by front-line sales staff did not match up with Tesla's more clear-cut statements that the system is not "self-driving" but an advanced driver assistance system (ADAS).


http://venturebeat.com/2016/08/10/tesla-model-s-in-autopilot-mode-crashes-in-china/

Apologists for Tesla will blame the driver ... but is there not a collision warning system? Shouldn't the Tesla have come to a stop behind the disabled vehicle instead of attempting to share the lane?


----------



## inkahauts

Does tesla own the stores in china like they do here?


----------



## trh

James Long said:


> Apologists for Tesla will blame the driver ... but is there not a collision warning system? Shouldn't the Tesla have come to a stop behind the disabled vehicle instead of attempting to share the lane?


Maybe the car thought it was a road sign?


----------



## inkahauts

Hey something screwed up. I'm sure they'll figure out what. But unless they report it who knows why. 

It's funny, as I recall Google had a lane sharing issues as well that resulted in an accident. I wonder if this kind of thing is a bigger issue to figure out than we would think. I'd like to know what their findings are.


----------



## dennisj00

And we certainly don't know the whole story. The kid could have grabbed the wheel in panic and side-swiped the parked car. Until the data is reviewed, we don't know the whole story. A driver's first reaction is 'it's not my fault'. Although he did admit he wasn't paying attention so it's HIS fault! (whether he admitted it or not).

Reminds me of a story one night on national news of a kid that got killed on a 4 wheeler and the mother's quote was 'I can't believe that salesman would sell me something that would kill my child!"

Nobody takes responsibility for their own actions anymore.


----------



## dennisj00

More info. . . when you activate AP, it warns you to keep your hands on the wheel and *be ready to take over at any time*. I know it does nag you if your hands aren't on the wheel, and from the manual (I haven't tried it) it will decelerate, put the hazard lights on and stop if you 1) unlatch the driver's seat belt or leave the seat, or 2) don't put your hands on the wheel after warnings.
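The escalation described in that manual excerpt can be sketched as a tiny decision function. The return labels and the three-warning threshold are invented for illustration; the actual firmware behavior is only as described above:

```python
def autopilot_action(seatbelt_latched, hands_on_wheel, warnings_ignored):
    """Decide what an AP-like system does, per the described escalation.

    warnings_ignored counts hands-on-wheel nags the driver has ignored;
    the threshold of 3 is a made-up number, not Tesla's.
    """
    if not seatbelt_latched:
        # Driver unlatched the belt or left the seat:
        # decelerate, hazard lights on, come to a stop.
        return "decelerate_and_stop"
    if not hands_on_wheel:
        if warnings_ignored >= 3:
            return "decelerate_and_stop"
        return "nag_driver"
    return "continue_autosteer"

print(autopilot_action(True, True, 0))    # continue_autosteer
print(autopilot_action(True, False, 1))   # nag_driver
print(autopilot_action(False, True, 0))   # decelerate_and_stop
```

The key design point in the description is that ignoring the system never leaves the car driverless at speed: every failure path ends in a controlled stop.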

Don't forget, although AP might be in control, you still have control of a car speeding along at 70 mph, and if you decide to take the wrong corrective action -- it happens every day, multiple times, in any car (without AP) -- it's not the car's fault. Leave the smartphone alone.

As I've reported first hand in the first 1000 miles, both TACC and AP have their positive features for reducing potential accidents. While I haven't had any close calls, I have had cars pull in or change lanes and the car has reacted before I would normally have.

Go take a test drive. It's an awesome car.


----------



## inkahauts

Here is a question. What if he was paying attention to other things and then moved his arms suddenly and hit the steering wheel by accident? Would that then make the car swerve and possibly hit the car that was parked? Can that happen? Just more ammo for paying attention still.

Their website, though, seems to need some correcting. The problem is, with cultural and grammar differences, who knows how that may have gotten messed up?


----------



## dennisj00

I'd almost think that's what happened. When you're using AP, your hands are on the wheel but sorta sliding. If you counter the steering, it cuts off - that's one way to disengage it.

It tends to steer more than I do while staying in the middle of the lane. And I'd like it to favor more to the left while passing a truck on the right, or vice-versa. While it does a good job, there are situations where you need to be there. So far, I don't agree that getting comfortable with it is setting you up for an accident. While there are YouTube videos and other reports otherwise, that's no fault of Tesla.

I'm not sure we can make any technology idiot-proof. Look at the idiots walking into the fountains in the mall while concentrating on their phones.


----------



## James Long

inkahauts said:


> It's funny, as I recall Google had a lane sharing issues as well that resulted in an accident. I wonder if this kind of thing is a bigger issue to figure out than we would think. I'd like to know what their findings are.


You are correct. Google reported an incident with a lane sharing issue (linked earlier in this thread). It was a wider lane where two vehicles could have fit between the lines. That accident is an important learning experience for Google.



dennisj00 said:


> Nobody takes responsibility for their own actions anymore.


At least not until they are sued out of existence. Perhaps that is part of the problem. Accepting responsibility would open Tesla up to liability. Shirking responsibility means they can always blame the drivers. Some people are too stupid to own a Tesla. 

Tesla is relying on the driver to be better than their AI. Tesla is relying on the driver to take over the instant that their AI fails. When one says the car reacted faster than they would have, isn't that a warning about how much they are actually relying on the car rather than on their own attention to the surroundings?

I am reminded of the quote from the previous article: "So if you don't brake, it's your fault because you weren't paying attention. And if you do brake, it's your fault because you were driving."


----------



## dennisj00

We need a "Great Spin" button! And remind me where the ignore button is.


----------



## Rich

dennisj00 said:


> And we certainly don't know the whole story. The kid could have grabbed the wheel in panic and side-swiped the parked car. Until the data is reviewed, we don't know the whole story. A driver's first reaction is 'it's not my fault.' Although he did admit he wasn't paying attention, so it's HIS fault (whether he admits it or not).
> 
> Reminds me of a story one night on national news of a kid that got killed on a 4-wheeler, and the mother's quote was 'I can't believe that salesman would sell me something that would kill my child!'
> 
> Nobody takes responsibility for their own actions anymore.


People do some nutty things. I sold an expensive scooter to a friend of mine, an oral surgeon. He put it in his garage, and it was stolen while he was out of state. I got a call from his wife; she wanted the money back. A seemingly rational woman in a time of stress, just lashing out. Nutty.

Rich


----------



## Rich

dennisj00 said:


> I'd almost think that's what happened. When you're using AP, your hands are on the wheel but sorta sliding. If you counter the steering, it cuts off - that's one way to disengage it.
> 
> It tends to steer more than I do while staying in the middle of the lane, and I'd like it to favor the left side while passing a truck on the right, or vice versa. While it does a good job, there are situations where you need to be there. So far, I don't agree that getting comfortable with it is setting you up for an accident. While there are YouTube videos and other reports saying otherwise, that's no fault of Tesla's.
> 
> I'm not sure we can make any technology idiot-proof. _*Look at the idiots walking into the fountains in the mall while concentrating on their phones.*_


Don't you think that cars should have a device that locks out phones while the car is in motion? I see cars doing odd things very frequently and phones are usually in use by the drivers.

Rich


----------



## dennisj00

Rich said:


> Don't you think that cars should have a device that locks out phones while the car is in motion? I see cars doing odd things very frequently and phones are usually in use by the drivers.
> 
> Rich


I certainly think the more texting-oriented/visual apps (Facebook, etc.) should be locked out. I don't have any problem with phone calls or playing music via Bluetooth.

In both the Tesla and Leafs (and most any recent car with a decent audio system), I can call or answer by voice control (or buttons on the steering wheel) or "Play some Billy Joel". In both cars, some of the touch screen console functions are greyed out when moving.

And it's not just in cars. It's most distressing to have dinner with someone who has their face glued to the Facebook screen rather than participating in the group conversation.


----------



## Rich

dennisj00 said:


> I certainly think the more texting-oriented/visual apps (Facebook, etc.) should be locked out. I don't have any problem with phone calls or playing music via Bluetooth.
> 
> In both the Tesla and Leafs (and most any recent car with a decent audio system), I can call or answer by voice control (or buttons on the steering wheel) or "Play some Billy Joel". In both cars, some of the touch screen console functions are greyed out when moving.
> 
> And it's not just in cars. It's most distressing to have dinner with someone who has their face glued to the Facebook screen rather than participating in the group conversation.


I'm used to the OnStar phone in my car, but I do find it distracting while driving. My other car has Bluetooth, and what you have to go thru to answer a call is very distracting. I dunno what the answer is to the cell phones in cars. I can see that blocking phones is a bit draconian, especially with passengers who might want to make or answer a call. I use an interstate highway a lot, and what I see is a bit too much.

Rich


----------



## dennisj00

Rich said:


> I'm used to the OnStar phone in my car, but I do find it distracting while driving. My other car has Bluetooth, and what you have to go thru to answer a call is very distracting. I dunno what the answer is to the cell phones in cars. I can see that blocking phones is a bit draconian, especially with passengers who might want to make or answer a call. I use an interstate highway a lot, and what I see is a bit too much.
> 
> Rich


I'm only speaking from the driver standpoint and there's certainly no good answer for passengers yet.

However, the Tesla can detect that the 'remote' is in the driver's area, so NFC may have some answers.

Example: my wife had her remote in the passenger seat. I put the car in park and get out to grab something from the office. The car shuts down as I walk away (we do have it set to lock and turn off on walk-away).

As I return to the car, it presents the door handles and is on and ready by the time I'm in the car. So it knows who's driving and which remote to honor.

It wouldn't hurt my feelings to block any of the text/UI apps for everybody (Facebook, etc.), but I'm not that into social media. There would probably be rioting in the streets, though!


----------



## James Long

Rich said:


> I dunno what the answer is to the cell phones in cars, ...


Training. I have been driving with a cell phone for over 18 years. I was responsible for answering the phone as soon as possible or returning pages within minutes. Just like any other potentially distracting device, I learned how to operate the car as a primary function and operate the phone as a secondary function.

I wish there was some sort of endorsement one could get on their license that allowed cell phone use ... but the tide of public opinion has turned and everyone is punished because of those who made mistakes. Now I have to remember what the laws are for the state I am in before touching the phone and the great state of Indiana has made looking at pagers illegal as part of their texting law (but Facebook and email were not illegal under the last wording I saw).

Perhaps that is what we need for Tesla's autopilot. A government certification that the driver understands the risks and accepts the responsibility for the car ... just to make sure they didn't misunderstand the marketing hype.


----------



## James Long

dennisj00 said:


> It wouldn't hurt my feelings to block any of the Text / UI apps for everybody (facebook,etc) but I'm not that in to social media. But there would probably be rioting in the streets!


If the car itself was a Faraday cage that would be the customer's choice. There would probably be someone trying to cheat the system by holding their phone by the window to get maximum signal.

But if the "blocking" was an active signal then the manufacturer would need to answer to the FCC. Active blockers are illegal.


----------



## dennisj00

James Long said:


> just to make sure they didn't misunderstand the marketing hype.


Can't ignore this one!! What marketing hype?? Tesla is spending no $$ on advertising - it's people like you and the media who are parroting websites and YouTube videos about how great or how bad it is.

Again, NO first hand experience.


----------



## Nick

> ... just to make sure they didn't misunderstand the marketing hype.


Or worse, believed it.

I am reminded of the story of the new motor home owner who got up out of the driver's seat to make coffee because the RV salesman had described the cruise control feature as the 'auto-pilot' ... I'm sure most of us know how that ended.


----------



## Drucifer

*Uber Is Betting We'll See Driverless 18-Wheelers Before Taxis*

At least one of its self-driving trucks is on the highways around the Bay Area at all hours.

by Tom Simonite

In a battered warehouse in San Francisco, Uber is working on what it thinks will be a shortcut in the race to make money from vehicles that drive themselves. A fleet of six modified white Volvo truck cabs operates out of a brick building in the SoMa district popular with technology startups. Around the clock, at least one of the vehicles is steering itself around Bay Area highways.

. . . . .

READ MORE


----------



## James Long

*Pittsburgh, your Self-Driving Uber is arriving now*






A year and a half ago, Uber set up an Advanced Technologies Center (ATC) in Pittsburgh. Its mission: to make self-driving Ubers a reality. Today, we're excited to announce that the world's first Self-Driving Ubers are now on the road in the Steel City.

We're inviting our most loyal Pittsburgh customers to experience the future first. If a Self-Driving Uber is available, we'll send it along with a safety driver up front to make sure the ride goes smoothly. Otherwise it's uberX as usual.

https://newsroom.uber.com/pittsburgh-self-driving-uber/


----------



## Stewart Vernon

Uber is doing an interesting thing... first, they set up this system with lots of people using their own cars to drive people around, getting customers used to the Uber service... and then they're looking into building their own fleet of driverless cars. IF that takes off, you can bet all the human Uber drivers will be dropped from the service as each area gets upgraded to driverless cars. Suddenly, what appeared to be a growing job market goes to zero overnight... and then they set their sights on driving taxis out of business next.


----------



## phrelin

While we've certainly seen some things like electronic door locks and ignition hacks, we now have *Chinese company hacks Tesla car remotely*. Everyone likes a challenge....


----------



## inkahauts

Ok so who's read the new federal regulations on autonomous cars?


----------



## James Long

inkahauts said:


> Ok so who's read the new federal regulations on autonomous cars?


"The new guidelines on Monday, which stopped short of official regulations, targeted four main areas. The Department of Transportation announced a 15-point safety standard for the design and development of autonomous vehicles; called for states to come up with uniform policies applying to driverless cars; clarified how current regulations can be applied to driverless cars; and opened the door for new regulations on the technology."

http://www.nytimes.com/2016/09/20/technology/self-driving-cars-guidelines.html?_r=0


----------



## inkahauts

I believe it's 168 pages long... that's why I asked if anyone has read it all.


----------



## James Long

116 PDF Pages - Enjoy!

https://www.transportation.gov/AV/federal-automated-vehicles-policy-september-2016


----------



## James Long

*Tesla Upgrades Autopilot in Cars on the Road*

The company has described the Autopilot update as intended to avoid the sort of errors that contributed to a Tesla owner's fatal crash in May on a Florida highway. The most evident change might be visual and audible reminders that drivers should keep their hands on the steering wheel.

It is an implicit scale-back of the company's original promotion of Autopilot, introduced last October, as essentially a self-driving car system. At the time, Elon Musk, Tesla's chief executive, said Autopilot was "probably better than a person right now."

. . .

The Autopilot update coincides with new policy guidelines for self-driving vehicles that the federal government issued this week. That policy, still subject to public review, gives the auto industry remarkable leeway in developing technology for fully autonomous vehicles. The main condition is that anything introduced for public use must be safe.

http://www.nytimes.com/2016/09/24/business/tesla-upgrades-autopilot-in-cars-on-the-road.html


----------



## phrelin

According to a number of articles available online, Uber is trying out its self-driving Volvos in San Francisco, and Google has put its car division into a separate subsidiary corporation. This _SF Chronicle_ article covers both: "Uber's robot taxis come to SF as Google revamps."


----------

