Variation Analysis
by Paul McLellan on 07-18-2011 at 1:33 pm

I like to say that “you can’t ignore the physics any more” to point out that we now have to worry about lots of physical effects that we never used to need to consider. But “you can’t ignore the statistics any more” would be another good slogan. In the design world we like to pretend that the world is pass/fail, but manufacturing is actually a statistical process and isn’t pass/fail at all. One area that gets worse with each process generation is process variation, and it is now breaking the designer’s genteel pass/fail model.
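
To make the pass/fail-versus-statistics point concrete, here is a minimal sketch, with made-up numbers and assuming a simple Gaussian model of a single parameter, of how a spec limit turns into a parametric yield rather than a binary pass or fail:

```python
import math

# Hypothetical illustration: a path delay is not simply "pass" or "fail",
# it is a distribution. Assume the delay is Gaussian with the (made-up)
# mean and sigma below, and the timing spec is 1.0 ns.
mean_ns = 0.90    # assumed mean path delay
sigma_ns = 0.05   # assumed standard deviation due to process variation
spec_ns = 1.00    # timing spec the part has to meet

# Fraction of manufactured parts that meet the spec: the Gaussian CDF
# evaluated at the spec limit.
z = (spec_ns - mean_ns) / sigma_ns
yield_fraction = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(f"Parametric yield at a {spec_ns} ns spec: {yield_fraction:.1%}")
```

With these assumed numbers the spec sits two standard deviations above the mean, so roughly 97.7% of parts pass; tighten the spec or widen the variation and that fraction drops, which is exactly the statistical behavior a pass/fail mindset hides.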

For those of you interested in variation, there is an interesting research note from Gary Smith EDA. One of the biggest takeaways is that, of course, you are interested in variation if you are designing ICs in a modern process node, say 65nm or below. In a recent survey of design engineering management, 37% identified variation-aware design as important at 90nm, a figure that rises all the way to 95-100% at 28nm and 22nm. If you are not worrying about variation now, you probably should be and certainly will be; 65nm seems to be the tipping point.

Today, only about a quarter of design organizations already have variation-aware tools deployed, with another quarter planning to deploy this year. The only alternative to using variation-aware tools is to guard-band everything for worst-possible-case behavior. The problem is that at the most advanced process nodes there isn’t really any way to do this; the worst-case variation is just too large. The basic problem is that for some parameter the typical (mean) performance advances nicely from node to node, but the worst-case performance doesn’t advance nearly so much, since the increased variation means that a point some number of standard deviations from the mean hardly moves (and can actually get worse). Inadequate handling of variation shows up as worse performance on some metric, forces respins when the first design doesn’t work, or, when problems are detected late in the design cycle, leads to tapeout delays.
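
As a rough sketch of that effect, with made-up numbers and assuming a Gaussian-distributed parameter where higher is better, compare how the mean and the 3-sigma worst case move between two hypothetical nodes:

```python
# Hedged illustration with made-up numbers: the mean of some performance
# parameter improves at the new node, but the variation (sigma) grows,
# so the 3-sigma worst-case corner barely improves at all.
nodes = {
    # node name: (mean, sigma) -- assumed values, not real foundry data
    "old node": (1.00, 0.05),
    "new node": (1.25, 0.13),
}

for name, (mu, sigma) in nodes.items():
    worst_case = mu - 3.0 * sigma   # 3-sigma low corner
    print(f"{name}: mean = {mu:.2f}, 3-sigma worst case = {worst_case:.2f}")

# old node: mean = 1.00, 3-sigma worst case = 0.85
# new node: mean = 1.25, 3-sigma worst case = 0.86
# The mean improves by 25% but the worst case is essentially flat, which is
# why guard-banding to worst case gives away most of the new node's benefit.
```

That is the squeeze that pushes designers toward variation-aware tools instead of ever-larger guard bands.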

All the main foundries have released reference flows that incorporate variation analysis tools, primarily from Solido Design Automation.

Solido is the current leader in supplying tools to address variation. The tools are primarily used by people designing at the transistor level: analog and RF designers, standard-cell designers, memory designers and so on. STARC in Japan recently did a case study in which the Solido variation tools exceeded STARC’s performance specifications across process-corner and local-mismatch conditions. Solido is also in the TSMC 28nm AMS 2.0 reference flow and has been silicon validated.

Gary Smith’s full report is here.
Solido’s website is here.
TSMC AMS 2.0 Wiki is here.
