One of the sessions I enjoyed at the Synopsys Verification Day 2021 was a presentation on applying formal to a couple of non-traditional problem domains. I like talks of this kind because formal can sometimes be boxed into a limited set of applications, under-exploiting the potential of the technology. Intel have built a centralized team of formal experts who seem to be quite aggressive in exploring new ways to leverage their expertise and tools. The first of these talks was on using formal to root-cause problems found post-silicon and the second was on validating datapath implementations.
Applying formal property verification to post-silicon
Anshul Jain gave this talk and opened by acknowledging that there is skepticism about applying formal to post-silicon debug; he agrees this isn’t easy. He does a nice job of walking through the challenges and explaining a recipe Intel have developed over many years to apply formal most effectively in that context. I won’t attempt to explain the details other than to note a few. Formal can handle blocks, not a full chip, so they brainstorm with designers to identify likely block candidates. They use cover properties to work their way towards pre-conditions for the failure. They aim to constrain judiciously, pinning simulation state in areas that are not critical to the failure and leaving the sensitive area under-constrained. All good basic engineering judgment.
Eventually they converge on a root cause. Importantly, they use this reactive analysis to enhance proactive property checks for next-generation designs. He cited one example in which they found 8 post-silicon bugs over the last year, which they spun into new pre-silicon checks that caught 6 bugs in the next-generation design. Pretty good ROI!
Datapath formal verification
Datapath functions are notoriously resistant to formal methods; however, Synopsys seems to have solved that problem with their datapath validation (DPV) technology, which is based on equivalence checking against a reference rather than property checking. Disha Puri talked about Intel’s on-the-ground experience of what works best for their needs.
The default use model is to compare an RTL implementation against a C/C++ reference. This works but suffers when the reference contains software optimizations, added for virtual-modeling needs or other reasons unrelated to the ultimate implementation. Then there are mapping choices: between which interfaces do you want to check equivalence? Add to that optimizations in the synthesis step. Getting to closure on a check can take significant effort, which may need to be repeated on minor changes in the source. This is still worthwhile for an initial pass and signoff, but burdensome for iterative development.
Instead they tried RTL-to-RTL equivalence checks, using legacy RTL as a reference, still through DPV; I’m guessing conventional equivalence checking would be hopeless on datapaths. They used this flow on media designs and a graphics unit, building regression suites in one generation which they applied to the next generation, finding 90+ bugs in a matter of weeks.
Equivalence checking without an obvious reference
Disha also talked about methods to apply DPV when they don’t have a reference model; for extended math functions, for example, they don’t have effective C++ references. A need of this kind arises when a big new block is added to a datapath element. Apparently it is possible to trigger bounded model checking in DPV, and they use this feature to apply a simple property check. She said this had a simple setup, quickly checked a lot of opcodes and found 15 bugs. Property checking can still have value in datapath verification!
Very nice couple of talks. You can learn more from the recorded session, listed about 2/3 of the way through Day 1.