Paul Cunningham (Verification CVP/GM at Cadence) initiated our monthly Innovation in Verification blog to hunt for novel ideas in verification, breaking past the usual steady, necessary but undramatic pace of incremental advances. I attended a couple of sessions from DVCon Europe recently, and was encouraged to hear a couple of talks with a similar mindset. The opening keynote was delivered by Moshe Zalcberg, CEO of Veriest, on what ideas hardware design and verification might borrow from software. Moshe covered a lot of territory – open source design and tooling, use of Python, Agile, more effective use of data and AI. I’d like to look here at the topic of Agile/DevOps for hardware. Particularly since Vicki Mitchell, an engineering VP at Arm, followed with a later keynote on how she is applying these today at Arm.
Waterfall versus Agile
I’d better start with a little explanation, following Moshe’s talk. Consider traditional waterfall development, the approach most of us use in design and verification today. From requirements gathering to design, implementation and verification, and ultimately to delivery, in that sequence. Here, the product is not really usable until near the end. Agile methods aim to improve on waterfall approaches through continuous delivery of value, delivering working code frequently, and maintaining a constant pace of delivery. Developers build code in short cycles called sprints, so working increments are available at multiple points through the complete cycle. Shift-left is a compressed waterfall. In contrast, Agile breaks up the goal by code features and aims to complete a group of features as well as possible in each sprint drop. If it’s a testbench for example, each sprint should deliver a working testbench (or family of testbenches) for some set of features.
How Agile helps
These practices have already become common in software development. Software team leaders assert that an agile flow provides higher quality results because developers have to fully test what they build in the current sprint. It also provides better schedule predictability and difficult problems are surfaced more quickly for resolution. Obviously you have to embed testing tightly in development in this approach – unit testing, coding standards, static analysis and so on. There are tons of unit testing frameworks in the software world to help automate this task.
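To make the "embed testing tightly in development" point concrete, here is a minimal sketch of the kind of unit test that gates a sprint's check-in. The `parity` reference model and the test names are hypothetical, invented for illustration; any of the many unit testing frameworks mentioned above (this sketch uses Python's built-in `unittest`) would serve the same role.

```python
import unittest

def parity(bits):
    """Hypothetical reference model for a parity-checker feature
    delivered in the current sprint."""
    return sum(bits) % 2

class TestParity(unittest.TestCase):
    # In an Agile flow these tests run automatically on every
    # check-in: the feature is not "done" until they pass.
    def test_even_number_of_ones(self):
        self.assertEqual(parity([1, 1, 0, 0]), 0)

    def test_odd_number_of_ones(self):
        self.assertEqual(parity([1, 0, 0]), 1)

if __name__ == "__main__":
    unittest.main()
```

The point is less the test itself than where it sits in the flow: it runs with every sprint increment, not at the end of the project.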
Application in Arm
In hardware design and especially verification, Moshe admits we are in the early days of adoption of such practices. However it is starting to happen, notably in the systems group at Arm. Vicki Mitchell, VP of central engineering for that group, gave a second keynote on how they’re using DevOps in that role, also in supporting customers through reference design system verification for example. She brings to this role a lot of background in running software engineering organizations at other companies such as Intel.
Vicki mentioned, incidentally, a key motivation for Agile approaches – lack of clear requirements and user feedback. Customers are figuring out on-the-fly what they need, and competitors aren’t standing still. That creates much more churn in development and a greater need for agility. Which leads, in Arm’s eyes, to a need to make agility actionable. Vicki talked particularly about DevOps rather than Agile. These two processes look very similar to a simpleton like me. My takeaway is that DevOps has an in-house focus (development + operations, i.e. build, regression, delivery, etc.). It aims for very quick feedback and it has a big focus on automation.
Arm DevOps automation
What I found particularly interesting in Vicki’s talk was Arm’s implementation and learnings for continuous integration. Their gatekeeper flow runs integration tests as you check in a change. They use change-set checks to determine which tests should be run, and will up-vote or down-vote your submission on each test. Test sets bloat fairly quickly in this automation, so they apply machine learning to periodically cull them down to an optimized set. They’ve also developed tools to automate building integration tests. She summed up by noting that they’ve been able to improve scheduling and provide more frequent deliveries to stakeholders. The CPU team at Arm (Austin I think) is now piloting a similar program.
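The change-set check idea can be sketched simply. This is not Arm's implementation, just a toy illustration of the principle: each test declares which design areas it covers, a check-in runs only the tests whose areas overlap the changed files, and the submission is up-voted only if every selected test passes. The test names, directory mapping, and function names are all invented for the example.

```python
# Hypothetical mapping from integration tests to the design areas
# (directory prefixes) they exercise.
TEST_COVERAGE = {
    "smoke_interconnect": {"noc/", "fabric/"},
    "smoke_memctrl":      {"memctrl/"},
    "full_system_boot":   {"noc/", "memctrl/", "cpu/"},
}

def select_tests(changed_files):
    """Change-set check: pick only tests whose covered areas
    overlap the files touched by this check-in."""
    selected = []
    for test, areas in TEST_COVERAGE.items():
        if any(f.startswith(a) for f in changed_files for a in areas):
            selected.append(test)
    return selected

def vote(results):
    """Gatekeeper decision: up-vote the submission only if every
    selected test passed (results maps test name -> pass/fail)."""
    return "up-vote" if all(results.values()) else "down-vote"

# A change touching only the memory controller skips the
# interconnect smoke test entirely.
tests = select_tests(["memctrl/arbiter.sv"])
print(tests)                                   # ['smoke_memctrl', 'full_system_boot']
print(vote({t: True for t in tests}))          # up-vote
```

The periodic ML-based culling Vicki described would then prune `TEST_COVERAGE` itself, dropping tests that rarely add signal for the change sets actually seen.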
Interesting insights. You can learn a lot more from the talks themselves. These are still available as recordings from DVCon Europe, through November 23rd. This is the Moshe keynote, and this is the Vicki keynote.