2019 will be remembered as the year the automotive industry decided to right-size its autonomous vehicle ambitions. Multiple automakers tempered their vaunted claims of delivering fully autonomous cars within a few years, and Daimler Chairman of the Board Ola Källenius declared in December that the pursuit of autonomous “robotaxis” was proving more challenging than originally thought, so the company was shifting its focus toward autonomous trucks.
Källenius added to his AV skepticism earlier this month when he said Daimler would further prioritize electric vehicle development over autonomous cars in view of urgent European and global regulatory requirements. But autonomous vehicle thought leadership at Daimler originated with Christoph von Hugo. Speaking at the Paris Auto Show in 2016, von Hugo, then head of active safety for Daimler, sought to put AV ethical concerns to rest when he averred that autonomous driving systems would, first and foremost, opt to protect passengers and drivers over bystanders.
In this context it is interesting to note that 2019 actually ended with two fatal crashes of cars built by Tesla Motors – one outside Terre Haute, Indiana, and one in Gardena, California – both of which may have been using Tesla’s semi-autonomous Autopilot function. There were several unique aspects to the Gardena crash that are likely to change the conversation around the semi-automated driving enabled by Autopilot.
Among the unique aspects of the Gardena crash were the following:
- Tesla CEO Elon Musk chose not to comment after the crash.
- The two people killed in Gardena were occupants of another vehicle that was hit by the Tesla.
- This was the first occasion of two fatal Tesla crashes in a single day.
After previous crashes that took the lives of Tesla drivers, Musk was quick to confirm, via Tesla’s remote access to vehicle operational data, that Autopilot was in use, and then to implicate the drivers’ misuse or abuse of the function. He has said nothing in regard to either the Gardena crash or the crash in Indiana.
The Gardena crash occurred at a traffic light at the junction where Route 91 becomes Artesia Boulevard. It injured the driver of the Tesla and a passenger in the car, while killing the driver and a passenger of a Honda Civic. According to reports from the crash scene, the Tesla ran a red light and crashed into the Honda.
In the Indiana crash, the Tesla crashed into a parked firetruck; the driver survived, but the passenger in the Tesla was killed. The National Transportation Safety Board (NTSB) and the National Highway Traffic Safety Administration (NHTSA) both indicated publicly at the time of the two crashes that they would be investigating.
Musk’s deafening silence is telling: he has learned to keep his mouth and his Twitter account quiet while the NTSB and NHTSA investigate fatal crashes. There are multiple potential causal scenarios in both fatal crashes.
It is quite possible that Autopilot was not engaged in either crash, in which case the likely culprit will be driver inattention or distraction. Or it may be that Autopilot was engaged, in which case, in Indiana, the system failed to identify a fire engine parked in a travel lane with its flashing lights on, and, in Gardena, the Tesla failed to recognize the transition from Route 91 to Artesia Boulevard – a transition marked by an intersection with a traffic light.
The impact of the Gardena crash is likely to be felt later in 2020. If the existing pattern holds, the NTSB and NHTSA investigations will each require nearly a year to complete, so their findings, which are likely to alter Tesla’s operations, will arrive on a long, delayed fuse.
What has changed this time for Tesla is that a Tesla vehicle is responsible for the deaths of other road users. In the past, Tesla vehicles operating on Autopilot had failed Daimler’s key rule of autonomous technology: first, protect the driver. In Gardena, the Tesla did indeed protect its driver while taking the lives of the occupants of another vehicle. In effect, the Tesla appears to have adhered to the Daimler AV principle with disastrous results.
Researchers have sometimes compared the behavior and driving characteristics of distracted drivers to those of drunk drivers. The comparison is apt: a drunk might argue that his or her behavior is benign, at least up until he or she decides to drive a car.
Tesla Motors’ Autopilot, too, could be considered benign, up until it is asked to perform in inappropriate circumstances and without the supervision of a human driver. In the Gardena case, the Tesla, if it is determined to have been operating on Autopilot, appears to have failed to recognize:
- The transition from highway to surface streets;
- The existence of a traffic light;
- The fact that the light was actually red;
- And the presence of another vehicle in the intersection.
The two fatalities in the Honda completely change the conversation regarding Autopilot and will give rise to the question of federal intervention. After the fatal crash of a Tesla in Mountain View, California, two years ago, the NTSB’s investigation, only recently concluded, delivered a set of recommendations to NHTSA, SAE International, the Occupational Safety and Health Administration, manufacturers of portable electronic devices (Apple, Google, HTC, Lenovo, LG, Motorola, Nokia, Samsung, and Sony), Apple, Tesla Motors, the Consumer Technology Association, and the California State Transportation Authority.
Those recommendations in their entirety can be found here: https://www.ntsb.gov/news/events/Documents/2020-HWY18FH011-BMG-abstract.pdf
It appears that the NTSB either lacks the authority, or has chosen not to assert it, to intervene in Tesla’s operations. It has issued recommendations and, in the latest report, reiterated some recommendations that Tesla has thus far ignored. The fundamentally unique nature of the latest crash has raised the stakes for NHTSA, the NTSB, and Tesla Motors, which, by some estimates, now has more than 700,000 Autopilot-equipped vehicles on the road.
Musk has long asserted, as he did during the latest NTSB investigation, that Autopilot remains a beta product, still in development and subject to ongoing refinement. Without an immediate and affirmative effort to respond to the NTSB’s recommendations, Tesla can no longer expect the kind of wrist slap it received from the NTSB earlier this year following the Mountain View investigation. The NTSB, NHTSA, and the public cannot countenance routine fatal Tesla crashes, especially now that we know it isn’t just Tesla drivers who are at risk.