What GM Can Learn from Tesla
by Roger C. Lanctot on 01-28-2018 at 7:00 am

General Motors has had wireless connections to its cars for more than 21 years, thanks to Project Beacon, better known as OnStar, now operated as Global Connected Consumer Experience. OnStar has likely saved hundreds of lives, if not thousands, by summoning emergency responders to the scenes of crashes where airbags deployed.

To this day, the rest of the automotive industry remains of two minds regarding OnStar’s “automatic crash notification” functionality. Some luxury car makers have replicated and implemented the feature. Ford Motor Company offers a smartphone-based equivalent dubbed 911 Assist. But the feature never became the checkoff item it was intended to be, and U.S. government regulators never saw fit to require it – unlike Europe, where a local equivalent called eCall will become standard equipment on new type-approved vehicles later this year.

Even Tesla Motors chose not to implement an OnStar-like function in its cars. Every Tesla vehicle is built with an embedded telecommunications connection, but if you crash in a Tesla and are unconscious, you will have to depend on the kindness of strangers to alert emergency responders.

Arguably, Tesla vehicles were not designed with the prospect of a crash in mind. Yet Teslas continue to crash, including one famous fatal collision in Florida in 2016. The thinking at Tesla may be that it is better to avoid crashes altogether than to focus on what to do after one occurs.

Now, both Tesla and GM, caught up in the struggle to deliver an automated driving proposition, have experienced nearly simultaneous collisions potentially tied to their automated driving systems. In GM’s case, a motorcyclist has filed a lawsuit after a low-speed mishap involving a Cruise Automation test vehicle. For Tesla, a Model S with Autopilot engaged collided with a firetruck parked on the side of the road.

– GM Faces Lawsuit after Crash between Motorcyclist and Self-Driving Chevy Bolt – theverge.com

– Tesla Crash with Autopilot Triggers Safety Board Interest – Bloomberg.com

Tesla and GM have faced similar safety challenges in the past. In Tesla’s case, the company used vehicle data for its own forensic purposes to determine 1) that the Florida crash was entirely the fault of a driver misusing the Autopilot feature and 2) that Tesla vehicles equipped with Autopilot generate fewer insurance claims.

In GM’s case, the company is still wrestling with the aftermath of the ignition switch crisis, which has led to multiple individual and class-action lawsuits, a $900M settlement with the U.S. Department of Justice, and multiple Capitol Hill hearings to respond to Congressional questions as to how the ignition switch vulnerability had been overlooked for as long as it had. The possibility of GM exonerating itself Tesla-style with telltale vehicle data was never on the docket.

But knowledgeable industry observers, and even some incredulous Congressional representatives, were asking themselves (if GM was not) how the company could have failed to detect the failing ignition switches from the data collected from OnStar-equipped vehicles in the wild, to say nothing of the inferences that might have been drawn from diagnostic scans at dealerships. For GM, it was the internal e-mail trail that proved determinative.

Maybe in this latest incident in California GM will finally be able to steal that page from the Tesla playbook and show, from vehicle data, that the Chevy Bolt was blameless in the low-speed collision with the motorcyclist. It would be an essential turning point for GM and the global automotive industry.

GM’s Cruise Automation unit, which is running tests in California, is in a contest with Waymo for self-driving car leadership. While Waymo has put up gaudy (i.e., impressively low) disengagement figures, it has been focused on operating in the suburbs. Cruise has been subjecting itself to the acid test of operating in an urban environment – San Francisco, to be specific – with all of its conflicting modes of transportation, pedestrian and otherwise, and some unique topography and street restrictions.

It’s no surprise that Cruise has been experiencing way-more (wink) collisions than Waymo ever did. But that’s no excuse, and GM/Cruise needs to clear up responsibility for the crash and make any necessary corrections before this otherwise minor incident is whipped up by safety advocates into a full-blown emergency-brake moment for automated driving.

Why is it so important that we stay the course on automated driving? Because! Because current estimates are that approximately 37,000 people were killed on U.S. highways in 2017, the second consecutive annual increase. That is more than 100 fatalities a day, and it ignores the even more terrifying figures for injuries, to say nothing of the economic impact.

Automated driving and all of the advances leading up to it hold the promise of actually putting a dent in those figures and mitigating the economic impact and personal tragedy simultaneously. The minor incidents characteristic of current real-world testing of automated driving technology pale in comparison to the massive carnage being inflicted daily by non-autonomous driving activities. Where is the outrage?

We do need to learn from these experiences, though, so let the investigations of both incidents begin. With some luck, GM will learn from Tesla and use this event as an opportunity to educate regulators and the general public as to the extent of its responsibility and the nature of the crash. As for Tesla, if history is any guide, it will forage through its data trove and deliver up an exculpatory assessment of the events leading to the Model S’s collision with the firetruck. (Maybe Musk will work his Jedi magic and have us believing that the crash never happened in the first place!)

GM is attempting to school Tesla and Alphabet in how to deliver a self-driving car to market by 2019. Before GM schools any competitor, though, it needs to learn a few lessons of its own about putting vehicle data to work. Tesla and Alphabet are leaders in this respect, and GM has some catching up to do.

As for avoiding collisions with motorcycles, GM and other car makers (and smartphone makers) ought to take a closer look at Ridar Systems. Ridar offers an app capable of alerting drivers to the presence of a nearby motorcyclist. A Lyft driver tooling me around Las Vegas two weeks ago might have benefited from such an application in the near-miss we shared. There are solutions around for common everyday driving challenges – if one just looks closely enough. As for GM and Tesla, the answers to autonomous driving are in the data.
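For readers curious about the mechanics, here is a minimal, hypothetical sketch of how a smartphone-based proximity alert of this kind could work in principle, assuming both the car’s and the motorcycle’s positions are shared through some location service. The function names, the 150-meter alert radius, and the coordinates are invented for illustration and say nothing about how Ridar actually implements its app.

```python
# Hypothetical sketch of a motorcycle proximity alert (not Ridar's actual design).
# Idea: warn the driver when any reported motorcycle position falls within a
# chosen radius of the driver's own GPS position.
import math

ALERT_RADIUS_M = 150.0  # assumed warning distance in meters (illustrative)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def motorcycles_nearby(driver_pos, motorcycle_positions, radius_m=ALERT_RADIUS_M):
    """Return the motorcycle positions within radius_m of the driver."""
    lat, lon = driver_pos
    return [
        (mlat, mlon)
        for mlat, mlon in motorcycle_positions
        if haversine_m(lat, lon, mlat, mlon) <= radius_m
    ]

if __name__ == "__main__":
    driver = (37.7749, -122.4194)        # San Francisco (example coordinates)
    riders = [(37.7758, -122.4194),      # roughly 100 m away: triggers an alert
              (37.7930, -122.4194)]      # roughly 2 km away: no alert
    for pos in motorcycles_nearby(driver, riders):
        print(f"Motorcycle nearby at {pos} - alert the driver")
```

A production system would also have to account for heading, speed, GPS error, and network latency, which is where the real engineering lies.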
