In “Willy Wonka and the Chocolate Factory” (1971), Gene Wilder plays a vaguely misanthropic Willy Wonka who leads the young winners of his golden ticket contest on a tour of the seven deadly sins within his candy factory and labs. (Who can forget Augustus Gloop?) At one point, Mike Teavee, a television-obsessed pre-teen, is so enamored of Wonka’s experimental Wonkavision that he insists on being transmitted, despite a half-hearted warning from Wonka himself.
Mike is thrilled when the device “transmits” him into a faux TV set, but when he steps out it is clear to everyone but him, to his mother’s shock and horror, that he has been shrunk, likely irreversibly. Mike’s glee is undimmed. Wonka gives a shrug.
Something similar appears to be playing out at Tesla Motors as the company rolls out Autopilot 2.5 and owners take more liberties than ever with the system. Ever since the Model S became the first production vehicle with advanced Level 2 automated driving capabilities, such as lane changing and passing, people have been taking liberties with Autopilot-equipped Teslas, with fatal results for at least one driver.
Tesla’s equivalent of Gene Wilder’s half-hearted warning is the admonition that the driver must keep his or her hands on the wheel at all times and pay attention to the road ahead. It’s no surprise that drivers continue to ignore these warnings (suggestions?).
The results can be interesting, like the drunk Tesla driver who claimed he wasn’t actually driving, or the heart attack victim who claimed Autopilot got him to the hospital. The latest episode of the Tesla follies is the driver who put his feet out the window during an “Inside Edition” interview and was subsequently pulled over by a police officer. The driver received a ticket for going too slowly (25 miles per hour) in a 65-mile-per-hour zone – but the ticket was later dismissed. It seems the traffic code may need a rewrite to cope with semi-autonomy.
The real news, though, is the Autopilot 2.5 update. Tesla has been playing catch-up since the fatal crash in Florida two years ago, after which Mobileye (supplier of the camera system in the original Autopilot) parted company with the automaker.
Forced to rely on its own in-house algorithms, Tesla quickly down-shifted with a software update (downgrade?) and instituted a new geofenced version of Autopilot that worked only in certain driving environments and at certain speeds. Over time the geofence expanded and the speed restrictions were relaxed; with the release of 2.5, Tesla may finally have matched or surpassed the Mobileye-enabled performance of the original Autopilot.
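For a feel of what that kind of gated rollout implies, here is a minimal sketch of geofence-plus-speed gating logic; the road types and speed cap are invented placeholders for illustration, not anything from Tesla’s actual software:

```python
# Minimal sketch of geofenced feature gating, as described above.
# All names and thresholds are hypothetical illustrations, not Tesla's code.

APPROVED_ROAD_TYPES = {"divided_highway"}   # hypothetical whitelist of environments
MAX_ENGAGE_SPEED_MPH = 45                   # hypothetical cap, relaxed over time

def autopilot_available(road_type: str, speed_mph: float) -> bool:
    """Allow engagement only inside the geofence and under the speed cap."""
    return road_type in APPROVED_ROAD_TYPES and speed_mph <= MAX_ENGAGE_SPEED_MPH

# Example: permitted on a divided highway at 40 mph, refused on a city street.
print(autopilot_available("divided_highway", 40.0))  # True
print(autopilot_available("city_street", 40.0))      # False
```

Relaxing the restrictions over time then amounts to widening the whitelist and raising the cap, update by update.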
With Musk’s claimed plan to deliver full autonomy via Autopilot, this may be good news or bad. Is the Model S (or X or 3) really ready for, or capable of, full autonomy? And what exactly is full autonomy? Can a Tesla perform like a Waymo? Probably not for a while.
The concern is that Tesla throws the driving candy out to the sinners and more or less looks the other way (Stop. Don’t. Come back.) as the misbehavior unfolds. Try to pull Tesla-like shenanigans in a Cadillac with Super Cruise and the car shuts the feature down.
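The difference is one of enforcement logic. A rough sketch of the escalate-then-lock-out pattern the Cadillac approach embodies might look like the following; the stages and timings are assumptions for illustration, not GM’s actual parameters:

```python
# Hypothetical sketch of real-time driver-attention enforcement:
# escalate warnings while the driver is inattentive, then lock the feature out.
# Stage names and timing thresholds are illustrative assumptions, not GM's code.

ESCALATION = ["visual_alert", "audible_alert", "feature_lockout"]

def enforcement_step(seconds_inattentive: float) -> str:
    """Map continuous inattention time to an escalating response."""
    if seconds_inattentive < 4:
        return "none"
    elif seconds_inattentive < 8:
        return ESCALATION[0]
    elif seconds_inattentive < 12:
        return ESCALATION[1]
    return ESCALATION[2]  # warnings exhausted: disable assistance for the drive

# Example: responses grow sterner as inattention persists.
for t in (2, 5, 10, 15):
    print(t, enforcement_step(t))
```

The point of the contrast: a warning in the owner’s manual is a suggestion; a lockout in the control loop is a consequence.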
There’s got to be more to corporate responsibility – enforced in real time in the vehicle a la Cadillac – than a CEO Pied Piper crooning Wonka-like “Come with me…”
The issue is highlighted in a review of the vehicle videos released by the Tempe, Ariz., police in connection with the fatal collision between an Uber autonomous test vehicle and a pedestrian walking a bicycle. The driver was looking down, distracted, but the vehicle’s sensors ought to have detected the pedestrian despite the darkness. (Guess who Uber is going to blame.)
This is yet another case where the safety driver can and likely will be blamed – but one would also be correct in holding Uber responsible for the failure of the system. The video evidence points to a failure of the system’s hardware or software; putting such a system on public roads for testing suggests a certain Wonka-like indifference. Automated-vehicle-friendly states like Arizona ought to consider sanctions for such failures to encourage a more responsible approach from testers. Without consequences there will be no progress.
https://tinyurl.com/yalzhgpc – What happened when driver put his Tesla on Autopilot? – Inside Edition
https://tinyurl.com/ybzab85o – Uber driver looks down for seconds before fatal crash – Ars Technica