Big SoC designs typically break existing EDA tools and old methodologies, which then give rise to new EDA tools and methodologies out of necessity. Such is the case with the daunting task of verification planning and management where terabytes of data have simply swamped older EDA tools, making them unpleasant and ineffective to use.
Last week I spoke by phone with John Brennan of Cadence to learn about their decision to develop a totally new EDA tool for SoC verification planning and management. This is a product area familiar to Cadence users through the 10-year history of the Incisive Enterprise Manager (IEM) tool. The new tool is called Incisive vManager, and it was designed to handle the biggest SoC verification tasks by using:
- A client-server approach
- Sophisticated verification management
- A scalable, database-driven technique
With the old approach you had to comb through reams of data to see if you could optimize your verification; with the new approach all team members collaborate throughout the design and verification process, and everyone has easy access to the progress data. Cadence cites about a 2X improvement in functional verification effort once a team is fully trained.
Reducing verification time and improving coverage are a big deal: according to 2013 data from IBS, the typical SoC at 40nm carried a $38M verification cost, and for projects at 20nm that cost can reach $100M.
A project manager really wants to know a few things:
- What is my schedule until design verification (DV) is complete?
- What are my costs to reach tape out?
- Are there any functional bugs that will cause a re-spin or recall?
With vManager you can improve schedule predictability, verification productivity, and design quality. Here's a closer look at how that happens. With all of your functional verification metrics visible in one GUI, you can work on the most critical failures first.
If a block in your design has a low test grade, the manager can shift verification resources to catch up.
Failure analysis determines which failures are the same and identifies the most critical ones, eliminating redundant debug cycles.
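The idea behind this kind of failure bucketing can be shown in miniature. The sketch below is a hypothetical illustration, not vManager's actual mechanism: it normalizes regression failure messages by stripping run-specific details (timestamps, addresses, seeds), groups identical signatures, and ranks the buckets by frequency so only one representative per unique failure needs debugging. The message formats and the `signature`/`triage` helpers are assumptions for the example.

```python
import re
from collections import Counter

def signature(log_line: str) -> str:
    """Normalize a failure message into a bucket signature by
    masking run-specific details (hex addresses, then all digits)."""
    s = re.sub(r"0x[0-9a-fA-F]+", "ADDR", log_line)
    s = re.sub(r"\d+", "N", s)
    return s.strip()

def triage(failures: list[str]) -> list[tuple[str, int]]:
    """Group failures by signature and rank buckets by frequency,
    so redundant debug cycles on duplicate failures are avoided."""
    counts = Counter(signature(f) for f in failures)
    return counts.most_common()

# Three raw failures, but only two unique root-cause buckets
failures = [
    "ERROR @ 1200ns: bus timeout at 0x3F00",
    "ERROR @ 3400ns: bus timeout at 0x7A10",
    "ERROR @ 880ns: CRC mismatch, seed 42",
]
ranked = triage(failures)
# ranked[0] is the bus-timeout bucket with 2 occurrences
```

In a real flow, the signatures would come from parsed simulation logs and the ranking would feed directly into which failures get engineer attention first.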
Finally, when optimizing your verification plan there is real benefit in finding precisely where your coverage holes are, so that you become aware quickly and can take early action.
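Coverage-hole hunting reduces to a simple question: which bins in the merged coverage database fall short of their goal? This is a minimal sketch of that idea, with hypothetical bin names and hit counts standing in for real merged regression data:

```python
def coverage_holes(bins: dict[str, int], goal: int = 1) -> list[str]:
    """Return the coverage bins whose hit count is below the goal --
    the 'holes' to target with new tests or constraint tweaks."""
    return sorted(name for name, hits in bins.items() if hits < goal)

# Hypothetical merged coverage results from a regression run
bins = {
    "fifo.full": 12,
    "fifo.empty": 30,
    "fifo.overflow": 0,   # never exercised
    "bus.burst_len_16": 0,
}
holes = coverage_holes(bins)
# holes lists the two bins that were never hit
```

Surfacing that list early, rather than after weeks of regressions, is what lets a team take action before the schedule slips.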
Verification productivity improves by:
- Up to 30% using reporting automation and closure automation
- About 25% better compute farm utilization with MDV (Metric Driven Verification) versus directed testing
- Up to 10X improvement in bug discovery with MDV versus directed testing
- A 60% reduction in verification time with MDV
The vManager tool flow brings several components together into a single environment.
If the vManager approach looks interesting, you can learn more by attending a two-day workshop, followed by your own evaluation.
In summary, you should consider this new-generation tool if your current tools are limiting the number of runs and coverage nodes your projects require.
If you already use Cadence tools for functional simulation, formal verification, and hardware acceleration, then give vManager a look. The new tool has been used by ST and others over the past year, so it sounds field-tested to me. If you visit DVCon in March, consider attending a Cadence tutorial or a customer paper presentation. Should you be tempted to write your own functional verification management environment, expect to spend about 50 man-years of development effort and roughly 2 million lines of code to catch up with vManager.