The news of a pedestrian fatality in Tempe, Ariz., resulting from the operation of an Uber autonomous vehicle has set off alarm bells throughout the AV development community. As always in such circumstances, there will be a simultaneous rush to judgment, with calls for the immediate termination of all such testing, alongside calls for calm while investigators do their work. For the time being, Uber’s testing has stopped.
The highly likely outcome is a finding in favor of Uber, after the pattern set by Tesla Motors and followed recently by Cruise Automation: the parties responsible for the robot driver will attribute responsibility for the fatality to the human, either the “driver” in the Uber or the pedestrian. This is actually a pattern established more than a century ago by the makers of human-driven cars, one that allowed car makers to almost completely ignore safety enhancements until well into the second half of the 20th century, by which time millions of humans (drivers, passengers and pedestrians) had been killed by vehicles.
Humans have long been the obstacle to automation, whether chewed up and maimed by ravenous factory equipment or simply savaged by trains and other moving vehicles. Human beings have been, in the words of “Die Hard”’s John McClane, “the fly in the ointment, the monkey in the wrench.”
Humans have served as the speed bumps along the path to the very progress intended to enrich human existence. In the case of the Uber fatality, though, an essential element has changed. The machines are now able to testify in their own defense, while the humans are left defenseless.
This new pattern started with the fatal crash of a Tesla in Florida, now two years past. Tesla not only used the vehicle data to show that the Model S in question was being misused by the human (a fact made manifest by the location of the crash), but also used that data, along with data from thousands of other Teslas, to assert that vehicles equipped with Autopilot generated fewer crashes and claims.
Cruise Automation has been quick to follow up reports of multiple collisions with non-automated vehicles in the San Francisco area by pointing out the blamelessness of its own automated cars. The Cruise vehicles invariably had stopped short to avoid collisions and were, in turn, rear-ended by other vehicles.
Cruise is unique in having experienced at least one incident of mild human-inflicted violence against its cars, suggesting a visceral reaction among some humans to the existence of self-driving vehicles. Similar efforts to interfere with or impede self-driving cars have occurred elsewhere.
Open hostility to self-driving cars, though, has largely been limited to more formal resistance from safety advocates who oppose the SELF DRIVE Act currently before the U.S. Senate. The bill has stalled despite widespread support among car makers and previous passage in the House of Representatives.
At the Geneva Motor Show, I ran into a senior auto industry executive who expressed strong support for the bill, only to later run into a senior executive of self-driving car maker Waymo on the show floor who indicated indifference to the legislation, even though Waymo itself is a member, along with Ford and Volvo, of the Self-Driving Coalition for Safer Streets, which supports the bill. The bill provides self-driving vehicles with exemptions from Federal Motor Vehicle Safety Standards that call for safety equipment such as steering wheels and brake pedals.
The legislation may be less important to Waymo because the company is positioning itself as a solution provider, not a car maker. Waymo is providing a transportation service. If anyone is interested in the operation of its vehicles, the company’s guidance is: “Read our safety report.”
Waymo’s safety report describes the operation of its vehicles, and that description is seen as adequate to fulfill the requirements of regulators and lawmakers. There is no need for an exemption, nor is there an abdication of responsibility or liability.
Thankfully, Waymo has neither killed nor injured any drivers or pedestrians, suggesting that the company’s operational safety vision has, so far, been a sound one. More saliently, this record comes in the context of Waymo’s long-held and oft-stated intention of delivering cars without steering wheels or brake pedals. Waymo’s is not an incremental approach to automation. It is all or nothing.
Waymo’s approach is not exactly an expression of hostility toward human drivers; it is simply the founding philosophy of its program. Uber, on the other hand, arguably has a history of hostility toward, or at least ill treatment of, its drivers, which puts a different paint job on the manner in which the public will process news of an Uber-inflicted pedestrian fatality.
It is appropriate that Uber suspend its testing to determine precisely what failed during the human-monitored self-driving process and caused a pedestrian fatality. But the vehicle will surely defend itself, and Uber AVs will be back on the road within days or weeks. The humans (driver and pedestrian), meanwhile, will be left to plead their case, or at least the driver will, to a skeptical jury of their peers: transportation investigators. (Uber successfully exonerated itself after a previous crash involving one of its Volvos, also in Tempe, Ariz. Too bad Uber and Waymo can’t collaborate, in spite of being neighbors.)
Should all such self-driving car testing be suspended as a result of the Uber failure? Probably not. But the incident highlights the challenge of proving a negative, i.e., proving that crashes or injuries were avoided because of the presence of safety systems or a self-driving robot. Has Waymo simply been lucky?
Autonomous driving development is now universally seen by legislators and regulators as a source of technological progress and leadership as well as a creator of jobs and a sponge for investment capital. This automated economic engine itself is on autopilot and is not likely to be slowed by the humans any time soon, even if the humans throw themselves in the path of progress. The “roads must roll,” to steal a phrase from Robert Heinlein’s short story of the same title.
It is only a matter of time before self-driving cars begin clamoring digitally for the complete removal of human drivers from the roadways in the name of efficiency and safety. Is it too late for the human drivers (and pedestrians) to rise and resist? Yippee ki yay! Old rules die hard.
– Self-Driving Car Kills Pedestrian in Arizona, Where Robots Roam – NYTimes.com
– Uber’s Self-Driving Car Showed No Signs of Slowing before Fatal Crash, Police Say – TheVerge.com