
Hardware Assisted Verification
by Paul McLellan on 06-10-2013 at 9:00 pm

On the Tuesday of DAC I moderated a panel session, Hardware Assisted Verification in 10 Years: More Need, More Speed. Although this topic obviously could include FPGA-based prototyping, in fact we spent pretty much the whole time talking about emulation. Gary Smith, on Sunday night, actually set things up by pointing out that emulation is, in his view, the heart of how EDA is going to take over more and more of embedded software development. And Wally, at DVCon (I think), pointed out that emulation is now the cheapest verification on a cycles-per-dollar basis compared to plain old simulation. Of course, with Synopsys's acquisition of EVE, all three major EDA vendors now have an emulation solution. The panel was organized by Frank Schirrmeister of Cadence.

The panelists were:

  • Dave Bural, of Texas Instruments in Dallas
  • Alex Starr of AMD in Boston
  • Mehran Ramezani of Broadcom, standing in at the last minute for Vahid Ordoubadian who had an urgent meeting and couldn’t make it.

Since I was moderating the panel and keeping things moving, I couldn't take detailed notes. So this is more of a stream of what I remember than any attempt to be comprehensive.

The general opinion of everyone was that the capacity of emulation was keeping up with what they needed. While everyone would always love more, they were managing with what they had. Since the underlying technology of emulation is FPGAs, and since FPGAs improve on the same Moore's Law trajectory as the designs being emulated, this will probably continue. However, all the panelists felt that performance was not improving as fast as they needed. Unfortunately, nobody was foreseeing any miracle breakthrough, just gradual improvement.

Emulation has clearly gone mainstream. Not that long ago, emulation was very expensive and very hard to use, taking weeks if not months to bring up a design satisfactorily. As a result it was used almost entirely by the most advanced designs with the biggest design budgets, on a dedicated project basis. Now, all the companies represented on the panel had emulation farms which were shared among many designs. Although emulation can be used to simply “run vectors”, the primary use was to enable early software and firmware development.

One challenge is that chip designers don't really understand embedded software development, and software developers don't understand chip design. Clearly they use different tools in their day-to-day environments. But emulation-based software development straddles this divide, and there is a need for engineers with a better understanding of both sides of the coin.

I asked how they and their teams went about justifying the investment in emulation. Although it is a lot cheaper than it used to be, it is still not cheap. Generally, fear of a bug leaking out into the field was a big driver, especially for the initial investment in emulation when it was getting added to the standard design flow. Later on, once established, incremental investment was easier to justify since it scaled with the size and number of designs just like any other EDA tool.