
This cannot be good for progress in autonomous cars

Surprised nobody posted this yet (did I miss it?). Tesla is being investigated over a fatal crash in which a Tesla driver, apparently watching a Harry Potter movie, collided with a semi truck. This was later followed by videos of various people in Teslas in bad situations (a sleeping driver, a terrified granny) in autonomous mode. The point is not whether any of these people were being stupid. This is bad PR and, independent of any technical case the industry makes, is likely to lead to the NHTSA and others further slow-rolling progress toward autonomous vehicles. Politicians don't care about technical arguments; they care about public perception.

Sleeping driver, terrified granny among Tesla Autopilot users on YouTube | Reuters
 
Do you remember when cruise control came out, and the joke about the guy in the RV who crashed after putting on the cruise control and then going to the kitchen to make a sandwich? I never believed it was only a joke, and this supports my belief.
 
Definitely. Not good for the industry.

And this almost seems half-intentional from Tesla, or at the very least reckless. Why reckless? Because Google has said it tested this and found it unsafe to trust humans to correct a self-driving car, and the fact that the Tesla Autopilot camera sometimes doesn't see white, and the radar sometimes mistakes things for road signs, should have prompted caution in any serious safety-critical engineer.
 
I read on the Tesla site that this was the first fatality using Autopilot in 130 million miles, while in the USA, across all vehicles, there is a fatality every 94 million miles.

Yes, it was tragic that this Autopilot user died, and my condolences to the family.
 
Roger has an interesting perspective:

No Turning Back on Autonomous Driving

Personally I can't wait for all cars to be electric (quiet/clean) and autonomous. Northern California becomes a less safe place to drive/walk/bicycle every day, and I don't see that changing unless technology "assists" us. I just wish the state government would (1) enforce the traffic laws we have and (2) "assist" with technology adoption. Let's face it, car companies are finally being forced into this by Google and Tesla. The politicians should step up and push even harder.
 
Daniel,

Tesla's declaration is a bit misleading: those 130 million miles were driven with a human observing and ready to take control, and sometimes even taking control (we don't know how often), so you cannot compare it to regular driving. Maybe to driving under the eye of a driving instructor (although that's not an exact comparison either).

It's also very worrisome that Tesla plays these PR games with something as life-critical as this.
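A back-of-the-envelope check makes this concrete. Using only the mileage figures quoted in this thread (1 fatality in 130 million Autopilot miles versus a US average of 1 per 94 million miles), and assuming fatalities arrive as a Poisson process, a single observed fatality simply cannot distinguish Autopilot from the ordinary fleet:

```python
import math

# Figures quoted in the thread (Tesla's own comparison):
autopilot_miles = 130e6   # miles driven on Autopilot, 1 fatality observed
us_rate = 1 / 94e6        # US average: 1 fatality per 94 million miles

# If Autopilot were exactly as dangerous as the US average, the number of
# fatalities over 130M miles would follow a Poisson distribution with mean:
expected = autopilot_miles * us_rate          # ~1.38 expected fatalities

# Probability of observing at most 1 fatality under that assumption:
# P(X <= 1) = e^(-lambda) * (1 + lambda)
p_at_most_one = math.exp(-expected) * (1 + expected)

print(f"expected fatalities at US-average rate: {expected:.2f}")
print(f"P(<= 1 fatality | US-average rate):     {p_at_most_one:.2f}")  # ~0.60
```

With roughly a 60% chance of seeing at most one fatality even at the ordinary US rate, the sample is far too small to support a safety claim in either direction.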
 
Two things:

(1) The Tesla is a semi-autonomous car, not an autonomous car, meaning that the driver must be in control at all times.

(2) You need to look more carefully at the driver and the circumstances of the accident. Here is one of the better articles I found:

http://www.nytimes.com/2016/07/02/b...nthusiast-tested-the-limits-of-his-tesla.html

Tesla has a "black box" on board so I'm sure there will be a thorough investigation but my guess is that the two drivers will get 100% of the blame on this one.
 
But... how many teenagers drive Teslas? Even for mature age drivers one would expect Tesla drivers to be above average drivers.
 
I agree, except I read that this Tesla driver had 8 speeding tickets in 6 years. My four children did not have six traffic tickets amongst them, much less six speeding tickets. Two of them are still on my car insurance and have had zero tickets (ages 20 and 22). I did not buy my children high-performance vehicles either, so there is that.
 
This is truly amazing:

Highly anticipated Tesla Autopilot 8.0 release to roll out in July

Autopilot 8.0 is expected to download in the next few weeks to approximately 80,000 Tesla vehicles. Autopilot 8.0 will add some new features and strengthen others. The new software is considered a major update, but it will still be a hands-on-the-wheel assisted driving program that requires drivers to be able to take over control of the vehicle on a moment’s notice.
Read more: http://www.digitaltrends.com/cars/tesla-model-s-autopilot-expected-update/#ixzz4DUqFBrXX
My next car may in fact be a Tesla. Sorry Porsche.
 
You know what they say about making things fool-proof?

It's really hard because fools are so clever.

But seriously, anytime people are moving faster than standing still, there will be accidents. It was only a matter of time. Interestingly, Tesla stock did not budge when this happened. One question I have not seen addressed is: how many times has an autopilot feature saved someone from an accident? Humans are notoriously bad at tasks like driving that require constant attention. I'd wager that even with this one death, the balance sheet probably leans to the positive side for self-driving technology.

Footnote: he probably would not have died had he hit a car or bus. Instead he hit something with high underside clearance, preventing the nose of the car from acting as an energy absorber. It sounds like the windshield and roof were sheared off. Teslas have protected occupants in some insane collisions where the safety design features came into play.
 
The Tesla circumstances were exactly the problem Chris Rowen of Cadence was talking about in object recognition:
https://www.semiwiki.com/forum/content/5914-10-signs-neural-net-based-adas-road.html

A truck with a white fifth-wheel trailer was crossing the road perpendicularly. Tesla's vision system failed to pick out the trailer against the "brightly lit sky" and the car drove right under it. The brakes were never applied. Tesla claims the driver "didn't see it either", extrapolating that from the claim that the driver took no avoidance action and that a Harry Potter DVD was still playing when the car was found.

A Tragic Loss | Tesla Motors

A complete failure in sensor fusion: the vision system has priority, and no secondary sensor (radar or lidar) was factored in. Tesla's argument that the trailer's underside was (almost, but unfortunately not quite) above the height of the car is weak. It's a systems engineering problem in several dimensions, as is usually the case in a disaster, where factors typically cascade.
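To make the design point concrete, here is a toy sketch (not Tesla's actual architecture; the sensor structure and threshold are invented for illustration) contrasting a vision-priority decision rule with one that cross-checks a secondary sensor:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    obstacle_detected: bool
    confidence: float  # 0.0 .. 1.0

def vision_priority_fusion(vision: SensorReading, radar: SensorReading) -> bool:
    """Brake decision where the camera overrides everything (the failure
    mode described above): a confident 'all clear' from vision wins,
    even when radar disagrees."""
    return vision.obstacle_detected

def cross_checked_fusion(vision: SensorReading, radar: SensorReading,
                         radar_threshold: float = 0.7) -> bool:
    """Brake if EITHER sensor is sufficiently confident an obstacle exists.
    Disagreement between sensors is itself a reason for caution."""
    return vision.obstacle_detected or (
        radar.obstacle_detected and radar.confidence >= radar_threshold)

# The crash scenario: the camera washes out against a bright sky,
# while the radar returns a strong reflection (possibly misfiled as a sign).
vision = SensorReading(obstacle_detected=False, confidence=0.9)
radar = SensorReading(obstacle_detected=True, confidence=0.8)

print(vision_priority_fusion(vision, radar))  # False -> no braking
print(cross_checked_fusion(vision, radar))    # True  -> brake
```

The point is architectural: giving any single sensor veto power turns every blind spot of that sensor into a blind spot of the whole system.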

I do agree that most Tesla drivers would be considered above average demographically speaking, and when Autopilot is on most drivers are still attentive, so the reduced overall incident rates are not surprising. However, there was another non-fatal incident last week:

Southfield art gallery owner survives Tesla crash

We have a bright future for assist systems, but I'll stick to my position that there is a dim one for full-time autonomous systems in consumer vehicles.
 
Don, on the one hand you criticize Tesla's sensor system design, with which I fully agree.

On the other, you believe fully automated cars have a dim future. Why?
 
I agree with Don about the dim future for fully automated cars. In a hybrid environment where the majority of cars are driven manually, I don't see fully autonomous happening. It really is a jungle out there, with motorcycles splitting lanes and people distracted beyond belief. In Asia the scooter problem is epidemic. How do you handle swarms of lawless scooters autonomously?

If we all had fully autonomous vehicles I would say no problem at all; the technology is already happening. But how long will it take to get everyone into a fully automated car? 20 years? Semi-autonomous, however, is a given, and even if cars are fully autonomous the lawyers will probably still insist we keep our hands on the wheel.
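The "20 years?" guess can be sanity-checked with a toy fleet-turnover model. Assuming (purely hypothetically) an average vehicle life of about 15 years, and that every new car sold from today onward is fully autonomous:

```python
# Toy fleet-turnover model: each year, a fixed fraction of the existing
# fleet is replaced, and every replacement is fully autonomous.
replacement_rate = 1 / 15   # hypothetical: ~15-year average vehicle life
target_share = 0.90         # autonomous fraction of the fleet we wait for

years = 0
share = 0.0
while share < target_share:
    years += 1
    # After n years, the surviving manual fraction is (1 - rate)^n
    share = 1 - (1 - replacement_rate) ** years

print(f"~{years} years until {share:.0%} of the fleet is autonomous")
```

Even under that wildly optimistic every-new-car-is-autonomous assumption, 90% fleet penetration takes over three decades, which is why a mixed manual/autonomous environment is the realistic planning scenario.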
 
I suppose if we run the clock forward about three generations, to flying cars as my son-in-law suggested, I'll be proven wrong. However, in today's tires-on-pavement infrastructure, with our legal and regulatory system being what it is, I see an endless stream of litigation ahead for fully autonomous consumer vehicles every time paint scrapes, or worse. Toyota has already said they aren't playing.

I do see commercial vehicles - trucks, cabs, delivery, anything where a driver can be certified and monitored - as a probable target for highly autonomous, SAE level 4 (out of 5). The argument that insurance companies will want fully-autonomous for consumers will fall apart as soon as consumers realize their rates are in the hands of how good someone's system design is or isn't. The argument about driver education as a requisite for consumer licensing is a joke.

That said, I see Tesla's lawyers about to make a play here that watching Harry Potter with Autopilot on invalidates the driver-car contract, shifting the blame. Carmakers have always had an out for driver ineptitude, and they won't want a precedent here that shifts liability back to them. Tesla will have a fix plan, issue a software update, and all that, and we'll wait until the next unrecognizable scenario presents itself.
 
Here's an interesting question - which is more likely to come first - autonomous cars or wide-scale mass transit? Both are problematic, for different reasons, but mass transit has a lot of advantages in efficiency and cost. Maybe complemented by autonomous taxi services to cover last mile trips?
 
Elon responds to faulty Fortune article:

"When Fortune contacted Tesla for comment on this story during the July 4th holiday, Fortune never asked any of these questions and instead just made assumptions. Tesla asked Fortune to give it a day to confirm these facts before it rushed its story to print. They declined and instead ran a misleading article."

Misfortune | Tesla Motors

I really like this guy.
 
(Back from a break; hope you all missed me :))

As this was posted more than a week ago, I have in the meantime not seen the uproar in the press about this first fatal accident that I originally expected. To me this indicates that acceptance of semi-autonomous cars by non-technical people and journalists is already higher than normally assumed. Or was it just that the European soccer championship was considered more important over here?

I agree that having fully autonomous cars that can drive everywhere in all circumstances will take a long time. On the other hand, car manufacturers like BMW and Porsche are working on self-driving cars that allow hands-off driving in more controlled circumstances, like highway driving. I do expect that to happen in the next decade, including the legal framework.
 
From a legal standpoint, the drivers will surely be held responsible.
However, from an ethical standpoint, Tesla is far from clean. Calling their system "auto-pilot" is at best misleading.
Personally, I think it is much worse than just misleading. This is a driver assistance system, not an auto-pilot.
Tesla must totally rework their marketing message about their system if they are sincere about ethics. It's not enough to have legally correct fine prints to cover their backside. They must be totally transparent about what the system can and cannot do and what the drivers must and must not do. The technology is good enough and the roadmap is convincing enough for them to be truthful.
 
But of course, Staf :cool: (missed you, that is). I lost interest in the soccer when England were beaten by Iceland. On top of Brexit. Unhappy times for the English...
 