The crash of Ethiopian Airlines Flight 302 may have a negative impact on the development of autonomous vehicle technology. The Federal Aviation Administration (FAA) is now forced to reconsider the “self-certification” process used for the Boeing 737 Max 8 airplane involved.
Self-driving car developers have been seeking the same self-certification for their own systems. The FAA’s failure suggests that self-certification may not be good enough. The U.S. Department of Transportation’s National Highway Traffic Safety Administration is taking public comments regarding a regulatory framework for autonomous vehicles, which are already cruising U.S. highways. The challenge for air and land travel governed by software is the same: understanding and regulating the algorithms and code inside black boxes.
Regulating surface transportation is complicated by the role of local regulators in the U.S. – the 50 states. It is further complicated by the fact that many of these states view automated driving technology as the killer app capable of easing congestion, reducing vehicle emissions and enhancing mobility for disadvantaged or disabled populations.
But the killer app might itself become a killer. With each new fatality attributed to an autonomous vehicle come investigations to determine what sort of algorithmic failure led to the crash. Thus far, from Florida to Arizona, the source of the software shortcoming seems to have been successfully located – but that may not always be the case.
In the case of Ethiopian Airlines 302 the story appears to be even worse: multiple reports suggest that a software update intended to correct the failure experienced by Lion Air Flight 610 was either still in the works or ready to be implemented, but did not reach the fleet in time. Another layer derives from the self-certification process itself, where, according to reports in the Washington Post, Boeing employees were more or less deputized to act as FAA representatives.
Further still, reports have emerged that pilots and airline representatives being shown the new system in the Boeing 737 Max 8 identified multiple areas requiring further training and preparation.
For cars it is not a case of training drivers; it is a case of training machines to drive. We go from the forensic “black box” of the airline industry to the emerging A.I. black box of the self-driving car industry. The question is whether we are inclined to put our “faith” in the A.I. of self-driving car developers or in regulators. We already know the limitations of the regulators.
Placing our faith in the A.I. black box of the self-driving car reminded me of the A.I. challenges currently facing the health care industry. In the words of Dr. Eric Topol, cardiologist and founder and director of the Scripps Research Translational Institute, quoted in the New York Times last week:
“There’s no shortage of deep liabilities for A.I. in health care. The liabilities include breaches of privacy and security, hacking, the lack of explainability of most A.I. algorithms, the potential to worsen inequities, the embedded bias and ethical quandaries.”
Self-certification in the airline industry – necessitated no doubt by expenses and staffing limitations at the FAA – will now come under renewed scrutiny. Will self-certification be good enough for self-driving cars?
The incompetence and failure at Boeing and the FAA raise unavoidable questions regarding the regulation of transportation. The latest fatal Tesla crash (with a semi-trailer), just a few weeks ago, and these two 737 Max 8 crashes are testing the tolerance of transportation users.
–U.S. safety agencies to investigate fatal Tesla crash in Florida – CNBC
The debate calls to mind a presentation I gave last week at a security conference put on by the Metropolitan Police in the U.K. I concluded by noting the likelihood that regulators will require the ability to control autonomous vehicles remotely; in other words, regulators will not allow autonomous vehicles on the road without a provision to control them remotely.
Not surprisingly, some of the law enforcement members in the audience wanted to discuss the topic in further detail. The story of vehicle remote control is both old and new.
General Motors and Hyundai Motor America offer remote vehicle slow-down functions as part of the stolen vehicle tracking and recovery solution in their telematics offerings for passenger vehicles. Brazil attempted to mandate vehicle immobilizer technology several years ago, but abandoned the effort over privacy and security concerns. Finland’s regulatory authority requires a driver for a certified autonomous vehicle, but the driver need not be IN the vehicle.
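None of these carmakers or regulators publish their protocols, but the basic shape of a remote slow-down function can be sketched. The sketch below is purely hypothetical: the class names, the token-based authorization check and the gradual speed-ramp logic are all assumptions for illustration, not any manufacturer’s actual telematics API.

```python
from dataclasses import dataclass


@dataclass
class SlowDownCommand:
    """Hypothetical command issued by an authorized telematics backend."""
    token: str               # credential (real systems would use PKI, not a string)
    target_speed_kph: float  # speed to ramp down to


class Vehicle:
    """Hypothetical vehicle-side handler for a remote slow-down request."""

    AUTHORIZED_TOKENS = {"stolen-vehicle-recovery"}  # placeholder credential store
    RAMP_STEP_KPH = 10.0  # decelerate gradually, never an abrupt stop

    def __init__(self, speed_kph: float):
        self.speed_kph = speed_kph

    def handle(self, cmd: SlowDownCommand) -> bool:
        # Reject commands that cannot prove authorization: the trust and
        # security problem that sank Brazil's immobilizer mandate.
        if cmd.token not in self.AUTHORIZED_TOKENS:
            return False
        # Ramp down in steps rather than braking to the target at once.
        while self.speed_kph - self.RAMP_STEP_KPH > cmd.target_speed_kph:
            self.speed_kph -= self.RAMP_STEP_KPH
        self.speed_kph = max(cmd.target_speed_kph, 0.0)
        return True


car = Vehicle(speed_kph=100.0)
accepted = car.handle(SlowDownCommand(token="stolen-vehicle-recovery",
                                      target_speed_kph=20.0))
```

The interesting design choices are exactly the ones regulators would fight over: who holds the credentials, and whether a command can ever bring a vehicle to a full stop rather than a safe crawl.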
Remote control is the main differentiator between air and surface transportation, and surface transportation is far more deadly than flying. For now, the airline industry and its regulatory authorities have determined that the risks of remote control for airplanes outweigh the rewards. For cars, it increasingly looks like remote control will be essential.
Even with remote control, though, the challenge of certification and regulation remains. In the U.S., states are opting for less regulation, not more. An audience at the Future Networked Car Symposium at the Geneva Motor Show voted by a show of hands slightly in favor of more regulation – perhaps reflecting the presence of executives from multiple European regulatory authorities.
Ten years from now we will somehow arrive at the nirvana of autonomous vehicle technology, saving lives, reducing congestion and emissions, and eliminating parking garages entirely. There are going to be bumps along the way. Fasten your seat belt.