Synopsys announced Verification Compiler a couple of months ago and dropped hints about their static and formal verification. They hadn't announced much in that area for a couple of years, and it turns out the reason was that they decided the technology they had, some internally developed and some acquired, wasn't a good basis for going forward: they needed to rebuild everything from the ground up. Compared to when that technology was developed, designs now have advanced power management, hundreds of clocks instead of a few, complex protocols and complex interconnect. At DAC earlier this month they announced the new products.
They use the front end of VCS, so anything that can be loaded into VCS can be loaded into the static and formal tools. That is not to say that the formal tools, in particular, can prove a whole SoC correct; that is unlikely to ever happen. But they can check, for example, all the connectivity or all the clock-domain-crossing (CDC) signals on a whole chip, taking all the reconvergence into account.
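To make the connectivity idea concrete, here is a minimal sketch in Python of what a full-chip connectivity check conceptually does. Everything here is invented for illustration (the netlist representation, the pin names, the trace_driver helper); the real tools work on the compiled VCS design database, not a dictionary.

```python
# Conceptual sketch of a connectivity check: verify that each required
# top-level connection actually exists in the design, tracing back
# through transparent buffers. All names and structures are made up.

# Netlist as a map from each driven pin to the pin that drives it.
netlist = {
    "cpu/irq_in":   "noc/buf1/out",
    "noc/buf1/out": "noc/buf1/in",   # buffer is transparent
    "noc/buf1/in":  "intc/irq_out",
}

# Spec: required connections (destination pin -> expected root driver).
spec = {"cpu/irq_in": "intc/irq_out"}

def trace_driver(pin, netlist):
    """Follow drivers back through buffers to the root source pin."""
    seen = set()
    while pin in netlist and pin not in seen:
        seen.add(pin)
        pin = netlist[pin]
    return pin

for dest, expected_src in spec.items():
    actual = trace_driver(dest, netlist)
    status = "OK" if actual == expected_src else f"VIOLATION (driven by {actual})"
    print(f"{dest} <- {expected_src}: {status}")
```

The full-chip part is what makes this hard in practice: the trace has to cross every level of hierarchy, which is why reusing the VCS front end, with its full-chip elaboration, matters.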
Performance is way up on the old tools: a 4X improvement on low-power checks, 60X on formal checks and 180X on sequential checks. That's a lot.
On low-power static checking they have UPF checks, architectural checks, functional and structural checks, and power-ground checks. They support all the latest low-power design techniques and align closely with the implementation flows. One of the big issues with static checking is that a single error can cascade into many more, so reports have a terrible signal-to-noise ratio: a hundred real errors can produce 20,000 violations, and it is hard to find the ones that actually need to be fixed.
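As an illustration of why that noise problem is attacked by root-cause analysis, here is a toy sketch (not Synopsys' algorithm; the violation records and field names are made up) of collapsing downstream violations onto the source errors they trace back to:

```python
# Illustrative sketch: thousands of downstream violations often trace
# back to a handful of source errors, so reporting by root cause cuts
# the noise dramatically. Records and fields invented for the example.
from collections import defaultdict

# Each violation carries the net it was flagged on and the source net
# the checker traced it back to.
violations = [
    {"net": "u1/pwr_ok",   "root": "pd_top/iso_en"},
    {"net": "u2/pwr_ok",   "root": "pd_top/iso_en"},
    {"net": "u3/ret_save", "root": "pd_mem/ret_ctl"},
]

by_root = defaultdict(list)
for v in violations:
    by_root[v["root"]].append(v["net"])

# One actionable report line per real error, instead of one per symptom.
for root, nets in by_root.items():
    print(f"root cause {root}: {len(nets)} downstream violations")
```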
The CDC checking works at the full-chip level, so it can find deep reconvergence bugs. It uses the same setup scripts as Design Compiler (DC), which makes adoption straightforward. It recognizes all sorts of synchronizer implementations on clock boundaries: extra flip-flops, FIFOs, muxes, handshakes and more.
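Here is a toy sketch of recognizing the simplest of those structures, a two-flop synchronizer. The flop table and the crossing_is_synchronized helper are invented for the example; a real CDC tool works on the elaborated netlist and handles many more synchronizer styles.

```python
# Toy two-flop synchronizer recognition on a clock-domain crossing.
# Each flop records its clock domain and the flop (if any) it feeds.
flops = {
    "tx_reg": {"clk": "clk_a", "feeds": "sync1"},
    "sync1":  {"clk": "clk_b", "feeds": "sync2"},
    "sync2":  {"clk": "clk_b", "feeds": None},
}

def crossing_is_synchronized(src, flops):
    """Safe here means: the first destination-domain flop feeds directly
    into a second flop clocked by the same destination clock."""
    first = flops[src]["feeds"]
    if first is None or flops[first]["clk"] == flops[src]["clk"]:
        return False  # not a crossing, or nothing synchronizing it
    second = flops[first]["feeds"]
    return second is not None and flops[second]["clk"] == flops[first]["clk"]

print(crossing_is_synchronized("tx_reg", flops))  # True: tx_reg -> sync1 -> sync2
```

Reconvergence is the harder case: two signals each safely synchronized on their own, but recombined downstream in a way that can still glitch, which is exactly why checking at the full-chip level matters.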
On the formal side they have rebuilt the formal engines from scratch for the toughest challenges. Formal is a strange technology: if one approach can prove something, it doesn't matter that others cannot, so having different engines under the hood makes the whole tool more powerful. Smart users apply the same logic across vendors and often run several formal tools in parallel. When the formal tools find an issue, they produce a waveform that causes the assertion to fail, and this is fully integrated with the Verdi debug environment that Synopsys acquired with SpringSoft, making it much easier to track down and fix the root cause.
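The multi-engine idea is essentially portfolio solving: run orthogonal engines on the same property and keep whichever result comes back first. A minimal sketch, with made-up engine stubs standing in for real BDD/SAT provers:

```python
# Portfolio solving sketch: several engines attack the same property
# concurrently; one conclusive answer is enough. Engine names and
# behavior are invented; a real tool orchestrates its engines internally.
import concurrent.futures
import time

def bdd_engine(prop):
    time.sleep(0.3)                    # pretend this engine struggles
    return ("bdd", "inconclusive")

def sat_engine(prop):
    time.sleep(0.1)                    # pretend this engine proves it fast
    return ("sat", "proven")

def prove(prop, engines):
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(engine, prop) for engine in engines]
        for fut in concurrent.futures.as_completed(futures):
            engine_name, result = fut.result()
            if result != "inconclusive":
                return engine_name, result  # one engine succeeding is enough
    return None, "inconclusive"

print(prove("req |-> ##[1:3] ack", [bdd_engine, sat_engine]))
# -> ('sat', 'proven'): the property holds even though the BDD engine gave up
```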
So: new technology, several times faster, much higher capacity and easier to use. And all tied into the standard interface of Verification Compiler.