The Evolution of Emulation
by Bernard Murphy on 06-13-2016 at 7:00 am

Mentor hosted a panel on emulation in their booth at DAC this year. One thing I really liked about this panel is that it didn’t include anyone from Mentor. Not that I have anything against Mentor employees, who are a fine bunch of people, those I know at least, but I find panels most interesting when the discussion is purely among customers. Lauro Rizzatti moderated, which is a bit of a cheat since he consults for Mentor, but moderators don’t do much of the talking, so I’ll count it as a customer panel.

Lauro opened with a quick history of emulation, starting with in-circuit emulation (ICE) mode, later moving to more general application in simulation acceleration, which then evolved into transaction-based emulation, followed by virtualization and then increasingly adding application areas like power modeling, network modeling and more. What he wanted to do was explore how this broad range of usage is evolving among three of Mentor’s customers: Rick Leatherman (Director of Developer Tools at Imagination Technologies), Guy Hutchison (VP of Hardware Engineering at Cavium) and Alex Starr (Fellow at AMD) – left to right above, with Lauro rightmost.

Alex said that AMD has been using emulation for many, many years. They started in ICE mode but have evolved to transaction-based and hybrid models, both at the IP and system levels. He added that software-driven verification increasingly demands the use of emulation. Guy said that Cavium uses emulation all the way through the design cycle and uses it purely in virtual mode; ICE mode is not practical since they don’t feel there is any way to generate realistic traffic from hardware. Rick said Imagination/MIPS has been using emulation for many years, starting in ICE mode and now moving to transaction-based use.

Alex added that they still do a lot of simulation – both full-chip and IP. They do more emulation work at the platform level, as part of the never-ending effort to shift left. Software and firmware teams have been using emulation for a long time for this reason, and are increasingly using it in hybrid mode. Guy said Cavium uses emulation only for full-chip verification, which they break into three phases: performance verification; software bring-up and validation; and post-silicon validation (back-porting silicon problems into emulation for debug). For Rick, bringing up software as fast as possible is very important. While most of us view Imagination as an IP company, increasingly they are providing more complete systems with software stacks for IoT markets, where validation of the system together with its software, and power modeling, become essential.

On ICE versus virtual modeling, Alex felt these complement each other and hybrid modes continue to be relevant. He cited a hard disk device as an example of a component still best modeled in ICE. But he and others agreed that virtual mode fixes a lot of problems – reliability, debug, remote access, easy sharing of resources and saving and replaying state (for debug). From Guy’s perspective, only virtual mode is practical – again they don’t feel it is possible in their application to model realistic traffic through hardware.

Lauro then asked about the use of emulation in application domains – power and DFT testing, for example. Alex said they run both DFT and Design for Debug verification in emulation and have done so for some time. Power analysis is becoming increasingly important, and the intersection between power and DFT – looking for peak power spikes in test mode – is a good example of an area where emulation shines. Both Guy and Rick added that they are using emulation for power analysis.

Where does emulation not help? Everyone agreed that analog/RF modeling is out of scope today. For example, verifying memory-training software against hardware models for DDR is something for which AMD has had to build internal solutions. Of course, if you can extract digital models from analog blocks, some cases might be amenable to emulation, but hand-creating models for emulation just moves the problem to validating the accuracy of those models.

Overall, an encouraging reality check on where emulation stands, where it’s headed and where there is still work to be done. Virtual is gaining ground fast, ICE still has its place and analog is still not part of the solution. You can read more on Mentor’s view of the evolution of usage models in emulation HERE.
