
Atrenta’s users. Survey says….

by Paul McLellan on 12-09-2011 at 7:32 pm

Atrenta ran an online survey of its users. Of course, Atrenta’s users are not necessarily representative of the whole marketplace, so it is unclear how far the results generalize to the bigger picture; your mileage may vary. About half the respondents were design engineers, a quarter were CAD engineers, and the rest were split among test, verification, and other roles.

Some questions focus on the use of Atrenta’s own tools and are probably not of wide interest, so I’ll concentrate on the things that caught my eye.

First, how do you create your RTL: from scratch, or by modifying existing RTL? It is now a 40:60 split, with 40% of designers writing their own RTL and 60% modifying existing RTL.

When it comes to the top-level RTL, respondents assemble it manually (57%), with scripts (57%), or with a third-party EDA tool (12%). Yes, those numbers total more than 100%; some people obviously use more than one technique.

On the main limitations of their current approach, designers had a litany of woes: missing design-manipulation features (35%), ECOs being hard to handle (34%), and support that is not consistent and reliable (26%). But the #1 problem, at 49%, is clearly the difficulty of debugging design issues. Many other complaints were listed, from missing IP-XACT files and unqualified IP to just plain “error prone.”


The final question was about which aspects of the design flow were most critical to improve. The choices for each feature were critical, very important, nice to have, not important, and don’t know. So let’s combine the critical and very important responses and see what the top concerns were.

First was reducing verification bugs due to connectivity problems. The next three are all facets of a similar problem: rapidly adapting legacy designs, the effort to integrate third-party IP, and the effort to make updates when third-party IP is in use. Slightly behind those is reducing the time and effort to create test benches.
