Don’t Stand Between The Anonymous Bug and Tape-Out (Part 1 of 2)

by Alex Tan on 03-09-2018 at 7:00 am

In the EDA space, nothing seems more fragmented in terms of solutions than the Design Verification (DV) ecosystem. That was my impression after attending the four panel sessions plus numerous paper presentations at DVCon 2018 in San Jose. Key management and technical leads from the DV user community (Intel, AMD, Samsung, Qualcomm, ARM, Cavium, HPE, and NVIDIA) as well as the EDA vendors (the triumvirate of Synopsys, Cadence, and Mentor, plus Breker, Oski, and Axiomise) were present on the panels.

Some consensus was captured during the panels, revolving around these four main questions:

What are the right tools for the toughest verification tasks?
Is system coverage a big data problem?
Should formal go deep or broad?
What will fuel verification productivity: data analytics, ML?

Reviewing the discussion in more detail, it is obvious that a few factors have constrained the pace of adoption of new solutions and of a potentially more integrated approach.

An array of verification methods spanning emulation, simulation, formal verification, and FPGA prototyping is used to cover the verification space. The first panel covered users' approaches to new developments on the verification front.

Market dictates execution mode – Users support products serving markets that inherently require frequent product refreshes, which shorten development and thus verification schedules. Companies are in turn focused on delivering products fast, with no time to explore. As a result, some currently just keep pushing simulation and emulation instead of spending time exploring modeling, trying to manage the use of resources optimally.

Software injects complexity – In addition to growth in system size, programmable components such as security and encryption engines have also contributed to the added complexity. A question was raised on how to isolate non-determinism and debug when something goes wrong. A tool is needed that verifies software and bridges into the behavioral hardware space, along with a spectrum of tools covering full-system → system → block level. Is software causing problems that we cannot verify? Simulation alone cannot simply be cranked up to cover them. For example, how do you catch a bug found in a 64-bit counter at the top level? A hardware-based approach is then needed. Software verification is difficult with standard tools, so emulation is required. A test such as running YouTube on Windows introduces system-level complexity.
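
The 64-bit counter example is worth pausing on. The back-of-the-envelope calculation below is mine, with assumed effective clock rates rather than figures from the panel; it shows why no engine reaches a 64-bit rollover by brute force, and why such corner cases have to be provoked deliberately (e.g., by forcing the counter near its wrap point) or proven formally.

    # Rough sketch (my numbers, not the panel's): time to wrap a 64-bit counter
    # at assumed effective clock rates for different verification engines.

    SECONDS_PER_YEAR = 365 * 24 * 3600
    ROLLOVER_CYCLES = 2 ** 64          # cycles for a 64-bit counter to wrap

    # Assumed effective rates -- illustrative orders of magnitude only.
    engines = {
        "RTL simulation (~1 kHz effective)": 1e3,
        "Emulation (~1 MHz effective)": 1e6,
        "FPGA prototype (~10 MHz)": 1e7,
        "Silicon (~2 GHz)": 2e9,
    }

    for name, hz in engines.items():
        years = ROLLOVER_CYCLES / hz / SECONDS_PER_YEAR
        print(f"{name}: ~{years:.3g} years to reach rollover")

Even at silicon speed the wrap is roughly three centuries away, so the value of hardware-assisted platforms here is in running the real software around such corners, not in counting to 2^64.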

Emulation and hybrid simulation – More software on board is driving increased emulation usage. In hybrid simulation the software is also a big unknown, while so much can be done before shipping the product. Emulation has the technology to scale up, and a hybrid simulation model can be built before the SoC is constructed. Emulation is growing, but the problem space is growing as well.
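
For readers unfamiliar with the term, here is a toy sketch of what "hybrid" means in this context. It is my simplification, not any vendor's flow: a fast, untimed software-side model runs the code and exchanges transactions with a slower, cycle-approximate hardware-side model, so software bring-up can start before the SoC RTL exists.

    # Toy hybrid co-model (illustrative only): an untimed CPU stand-in issues
    # transactions to a cycle-approximate memory model instead of driving signals.

    class FastCpuModel:
        """Untimed, instruction-level stand-in for the CPU running the software."""
        def __init__(self, program):
            self.program = program                 # list of (op, addr, data)

        def run(self, bus):
            for op, addr, data in self.program:
                if op == "write":
                    bus.write(addr, data)
                else:
                    print(f"read 0x{addr:04x} -> 0x{bus.read(addr):02x}")

    class CycleApproxMemory:
        """Hardware-side model that tracks a crude cycle cost per transaction."""
        LATENCY = 4                                # assumed cycles per access

        def __init__(self):
            self.mem, self.cycles = {}, 0

        def write(self, addr, data):
            self.cycles += self.LATENCY
            self.mem[addr] = data

        def read(self, addr):
            self.cycles += self.LATENCY
            return self.mem.get(addr, 0)

    sw = FastCpuModel([("write", 0x10, 0xAB), ("read", 0x10, None)])
    hw = CycleApproxMemory()
    sw.run(hw)
    print(f"hardware-side cycles consumed: {hw.cycles}")

In a real flow the software side would be a virtual platform or instruction-set simulator and the hardware side an emulator, but the transaction-level handshake is the essence of the approach.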

Simulation vs. H/W-assisted effort ratio – In the past it used to be 80% simulation and 20% emulation; today, should it be considered 80% hardware-assisted vs. 20% simulation? According to the panelists, simulation demand has kept pace with IP growth, so it is not necessarily an 80/20 scenario.

FPGA vs. hybrid – Hybrid helps, but FPGA may still be needed, for example to cover corner cases. Some argued there is actually no difference between an emulator and an FPGA; how much time is needed for the software model to be used seamlessly with the emulator or FPGA is the key. In a hybrid environment with a lot of data and transactions (such as a graphics IP with many transactors), FPGA cannot address those and a hybrid emulator would be more suitable. Others still believe FPGA and emulators share similar challenges: the emulator is faster but the design is more complex and bigger, so in the end both yield about the same speed (although FPGA could be faster). What about the size (scalability) of an FPGA to prototype or emulate a product? How do you tackle the size issue on a design of more than 10 billion gates? Do targeted testing on a subset of instances. Can we mix and match and get value now, rather than waiting until the last minute?

Shift-left and the cultural divide – Does the shift-left effort work? The answers were mostly a resounding yes, albeit a few with caveats. Yes, IP development at ARM involved software development before roll-out, anticipating usage although not doing the system design. Shift-left has been both painful and effective, and hardware emulation models are also used. The cost of using models and making them work across the flow involves hidden costs, and the migration to shift-left could be made easier. Shift-left has been successful but comes with challenges (2 hours vs. 2 weeks). We may need teams that oversee both sides: software folks expect faster turnaround than verification can deliver (it may take longer). How can the same stimulus be used to run simulation faster? Test intent is needed, and it may run faster if applied in the emulation realm, as sketched below.
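
The panel did not name a tool, but the "test intent" idea resembles portable-stimulus-style reuse: describe the scenario once, abstractly, and let per-engine back ends render it into whatever that engine consumes. The register map and C helper names below are invented for illustration.

    # Hypothetical sketch of single-source test intent rendered for two engines.

    INTENT = [                                     # abstract scenario: operations, not signals
        ("config_dma", {"channel": 0, "length": 256}),
        ("start_dma", {"channel": 0}),
        ("check_irq", {"source": "dma0"}),
    ]

    def render_for_simulation(intent):
        """Render the intent as register-level transactions (made-up address map)."""
        txns = []
        for op, args in intent:
            if op == "config_dma":
                txns.append(("write", 0x1000 + 0x10 * args["channel"], args["length"]))
            elif op == "start_dma":
                txns.append(("write", 0x1004 + 0x10 * args["channel"], 1))
            elif op == "check_irq":
                txns.append(("read_expect", 0x2000, 1))
        return txns

    def render_for_emulation(intent):
        """Render the same intent as bare-metal C calls to run on an emulator or FPGA."""
        lines = ["// generated from the same test intent"]
        for op, args in intent:
            if op == "config_dma":
                lines.append(f"dma_config({args['channel']}, {args['length']});")
            elif op == "start_dma":
                lines.append(f"dma_start({args['channel']});")
            elif op == "check_irq":
                lines.append(f'assert(irq_pending("{args["source"]}"));')
        return "\n".join(lines)

    print(render_for_simulation(INTENT))
    print(render_for_emulation(INTENT))

Whether the back end emits simulation transactions or C tests, the payoff the panel alluded to is that the intent only has to be captured once.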

Questions from the audience:
How to address the A/D interface? Panelists stated that clean boundaries (system, subsystem, IP) are key to partitioning the system into something more manageable. The use of a virtual interface (interface layer) could accommodate the A/D need (e.g., Matlab, C), but analog blocks usually have picky requirements.

When will we have a single point tool to address this, versus being spread thin across different ones?
The panel responded that the integration issue is always there (a handshake problem); a new tool would shift problems somewhere else, hence it is not replacing jobs, which is good news. A vendor pointed to doing shift-left early on and possibly accelerating testbench analysis with machine learning.
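
The panel did not describe what such analysis looks like, but one plausible, purely illustrative form is ranking regression tests by the marginal coverage each one adds, so most of the coverage is reached in a fraction of the regression runtime; machine learning would refine the prediction of which tests pay off. The test names and coverage bins below are made up.

    # Illustrative only: greedy ordering of tests by marginal coverage gain.

    test_coverage = {                              # hypothetical test -> bins hit
        "smoke_boot": {"rst", "bus_rd", "bus_wr"},
        "dma_stress": {"bus_rd", "bus_wr", "dma_burst", "dma_err"},
        "irq_corner": {"irq_nest", "irq_mask", "rst"},
        "full_regression": {"bus_rd", "bus_wr", "dma_burst", "irq_nest"},
    }

    def order_by_marginal_coverage(tests):
        """At each step pick the test that adds the most not-yet-covered bins."""
        covered, order = set(), []
        remaining = dict(tests)
        while remaining:
            best = max(remaining, key=lambda t: len(remaining[t] - covered))
            if not remaining[best] - covered:      # no test adds anything new
                order.extend(remaining)            # append the rest in any order
                break
            covered |= remaining.pop(best)
            order.append(best)
        return order

    print(order_by_marginal_coverage(test_coverage))
    # -> ['dma_stress', 'irq_corner', 'smoke_boot', 'full_regression']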

Software-friendly implementation needed – Hybrid simulation may address software-centric needs in hardware design. There is a trend toward more software focus: it used to be hardware first and then software (at ramp/kickoff); now it is software first, then hardware.

[To be continued in part 2 of 2.]

 
