The Reverend Elon Musk, CEO of Tesla Motors, held forth to his flock on yesterday’s earnings call, describing at length his efforts to lead the company out of production hell. The lengthy session highlighted the challenges facing the company, which posted its greatest quarterly loss ever, and was emblematic of the high-flying technology entrepreneur making a big bet against long odds.
Unique to Tesla, though, is the commitment it seeks from both investors and the drivers of its cars. In fact, Musk went so far as to insist that the media and analysts must commit more fully to the company’s vision for achieving mass electrified, shareable vehicle autonomy – or risk being branded idiots, or worse.
He has no patience for the scary headlines that tend to follow the relatively infrequent fatal crashes of Tesla vehicles. He confirmed on the call that the amount of recorded automated driving in autopilot-equipped Tesla vehicles tends to decline after such reports, only to recover later. In Musk’s eyes, these discouraging downturns in autopilot usage only further delay the arrival of fully autonomous operation – i.e. the promised land.
Musk also takes issue with the inclination of the press to blame-shame Tesla for pre-emptively releasing data implicating drivers in their own autopilot misadventures, fatal or otherwise. Musk did not mention the growing number of crashes (after all, there are more Teslas on the road), fatal and otherwise, that appear to be the result of autopilot shortcomings.
This is where the leap of faith is most difficult to take. Musk and Tesla were on solid ground two years ago (can it be that long?) following the fatal Florida crash. Tesla took multiple steps in response to that event, including:
- Parted ways with camera system supplier Mobileye
- Laid blame for the crash on the misuse of autopilot (on a non-limited-access highway)
- Updated autopilot software and geo-fenced its usability
- Conducted a study of vehicle data – sharing the results with the National Highway Traffic Safety Administration – demonstrating that vehicle crashes were substantially reduced in Teslas equipped with autopilot
Ultimately, the scope of Tesla’s geo-fencing expanded until it was nearly unlimited, thereby completing the transition away from Mobileye. Still, the system remains ill-suited to secondary roads and side streets with intersections.
Intermittent crashes continue, including a fatal crash in California, once again blamed on driver inattention, along with non-fatal collisions with parked vehicles on highways. One crash stands out from the rest, though, having occurred shortly before the fatal Florida crash.
The crash occurred in China, and Tesla was reportedly unable to retrieve the vehicle’s data logs due to the severity of the impact. As a result, the fatal China crash, between a Tesla and a truck parked in the high-speed lane of a highway, remains unresolved, along with pending legal action from the family of the driver.
Because the China crash occurred before Tesla modified its automated driving software, it is difficult to say whether any findings from that crash will be relevant to understanding how autopilot, as currently configured, functions today. More importantly, despite the availability of video from the vehicle, Tesla asserts that it is not possible to conclude that autopilot was operating at the time of the crash.
Infrequent though they may be, Tesla crashes continue to occur. Operating a Tesla in autopilot mode – and, in fact, investing in Tesla – requires a leap of faith. You have to believe in Tesla and Musk’s vision of autonomous operation.
Tesla has not been the only autonomous vehicle operator to experience a fatality. Uber suffered the same fate, though its fatality was a pedestrian, not a passenger or driver. In the Uber crash in Tempe, Ariz., a number of factors were blamed, including driver inattention and a malfunctioning or inactive automatic braking system. Some astute observers also noted that a thermal sensor might have aided detection of the pedestrian on the dark road.
Tesla famously makes use of ultrasonic, radar, and camera sensors, supported by Nvidia processors that analyze and fuse the incoming information to enable automated driving. Also famously, Musk disparages the use of Lidar sensors, which some believe might have prevented both the Florida and China fatal crashes.
An Israeli startup, BrightWay Vision, asserts that its image-gating technology for enhancing camera-based sensor inputs might also have prevented the China crash. The gating technology might have been able to overcome what is thought to have been radar interference caused by the picket-fence-like highway barrier on the left side of the vehicle.
The bottom line: To buy a Tesla or its stock, or to operate a Tesla in autopilot mode, is to take a leap of faith. It is a leap of faith in the gigafactory strategy, the fast-charging network, SpaceX, increases in battery power density, reductions in cost, increases in production, and the pursuit of profitability. It is a leap of faith in a CEO who sleeps on the factory floor and occasionally indulges in Tweet storms.
Mainly, it is a leap of faith that the best and most ethical decisions are being made regarding vehicle autonomy. If Musk were CEO of any other car company, he would long ago have been crucified by investors, regulators, and customers for the handful of fatalities that have occurred in Tesla vehicles. Musk spent a fair amount of time on Wednesday’s call apologizing for his rants and insults from the previous earnings call. What Musk really needs to do is thank his customers – especially those who have adopted autopilot – for their leap of faith.