Most of us would agree that safety is important in transportation, and most of us know that in automotive electronics this means ISO 26262 compliance. But, except for the experts, the details don’t make for an especially gripping read. I thought it would be interesting to get behind the process to better understand the motivation, evolution and application of the standard, particularly as it applies to EDA and software for embedded systems. During DAC I had a chance to talk with Rob Bates, Chief Safety Officer at Mentor, who has a better background in this area than most of us.
I should add that before moving to Mentor, Rob was at Wind River for many years, most recently responsible for the core of VxWorks, including its security and safety aspects, so he has a broad perspective. He started our discussion by noting that the auto industry has been concerned with electrical and ultimately electronic safety for many years, but that work in this direction had moved forward with little structure until the complexity of these systems grew so high that the need for a standard became unavoidable.
Automakers looked first at IEC 61508, which had gained traction in factory automation, where it established the state of the art for safety in those systems. Automakers felt this wasn’t quite what they needed, so they collaboratively developed their own standard, ISO 26262, first published in 2011. This set a new state of the art for automotive system safety processes, very quickly demanded by OEMs from their Tier 1s, by Tier 1s from component suppliers, and so on down the line.
Rob said that 26262 compliance naturally first impacted Mentor in the embedded software part of their business, because that software is used in final systems and is therefore intimately involved in the safety of those systems. Because Mentor has provided embedded software solutions for quite some time, they have arguably been building expertise in this domain for longer than other suppliers in the EDA space.
An obvious question is how this impacts EDA and other software tools. Rob said that the standard’s view on tools is to ask whether a failure in the tool can inject a failure into the device. Interestingly, this doesn’t just apply to tools that create or model design data. It applies just as much to MS Word, for example; if a failure in that tool causes you to lose the last edit in a significant document, that falls under the scope of the standard just as much as an error in a simulation tool. The question then is whether you can mitigate or catch such failures. A design review that validates the design data and documentation against expectations meets TCL1 (tool confidence level 1). According to Rob, 80% of EDA tools fall into this category; in contrast, synthesis and test tools require a higher confidence level.
A common question from silicon product teams is why EDA companies are not required to step up to more responsibility under 26262. I’m going to cheat a little here and steal from a follow-on Mentor discussion on 26262 in which Rob was a panelist and this topic came up. The answer, according to Rob, is simple: the standard does not allow any provider in the chain to assign responsibility for their compliance to their (sub-)providers. A chip-maker, for example, is solely responsible for their own compliance in building, testing, etc. the component they provide, just as a Tier 1 is solely responsible for compliance in the systems they provide to an OEM. What an EDA provider can do is help the component provider demonstrate compliance in their use of the tools, through documentation and active support in programs like Mentor Safe.
In a similar vein (and back to my one-on-one with Rob), he touched on what safety really means at each level. He noted, for example, that you can’t really say an OS is “safe”. The only place safety has a concrete meaning is in the final product – the car. What you can say about the OS is that it does what its specification says it will do, that documentation and support are provided to help designers stay away from known problems, and that it provides features where needed to help those designers build a “safe” system.
Rob also touched briefly on safety with respect to machine learning and autonomous systems. Oceans of (digital) ink have been spilled on this topic, mostly from big-picture perspectives (e.g. encoding morality). Down at the more mundane level of 26262 compliance, Rob concedes that it is still not clear how best to prove safety in these systems. Duplication of the (ML) logic may be one possibility. Rob felt that today this sort of approach could meet ASIL B expectations but would not yet rise to the ASIL D level required for full automotive safety.
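To make the duplication idea a little more concrete, here is a minimal, purely illustrative sketch (not from the article, with hypothetical function names and thresholds) of one common reading of “duplication of logic”: two independently implemented channels compute the same decision, and a comparator only accepts a result when they agree, falling back to a defined safe state otherwise. Whether a pattern like this is sufficient for a given ASIL target is exactly the open question Rob describes.

```python
# Illustrative dual-channel sketch. All names, thresholds and the fallback
# behavior here are hypothetical, for explanation only.

def channel_a(sensor_input):
    # Primary decision logic, e.g. the post-processed output of an ML model.
    return "brake" if sensor_input["obstacle_distance_m"] < 10.0 else "cruise"

def channel_b(sensor_input):
    # Independently developed (ideally diverse) implementation of the same decision.
    return "brake" if sensor_input["obstacle_distance_m"] < 10.0 else "cruise"

def safe_decision(sensor_input):
    a = channel_a(sensor_input)
    b = channel_b(sensor_input)
    if a == b:
        return a
    # On disagreement, force a defined safe state rather than guessing.
    return "brake"

print(safe_decision({"obstacle_distance_m": 7.5}))   # channels agree -> "brake"
print(safe_decision({"obstacle_distance_m": 50.0}))  # channels agree -> "cruise"
```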
As for where 26262 is headed, Rob believes we will see more standardization and more ways of looking at failure analysis, based on accumulated experience (analysis of crashes and other failures), just as has evolved over time in the airline industry. He also believes there will be a need for more interoperability and understanding across the supply chain, from the OEM, to 3rd parties like TÜV, to Tier 1s, component suppliers and tool suppliers. By this he means, as one example, that an organization like TÜV will need to understand more about microprocessor design as well as the auto application, where today this cross-functional expertise is mostly localized in the OEMs and Tier 1s. Might this drive a trend towards vertical consolidation? Perhaps Siemens’ acquisition of Mentor could be read as a partial step in this direction?