Is there a comparison on the EDA tools and their effectiveness?

RPLATT

All the companies rave about how their tool is better than the other guys'. Why can't we get some users to actually give us the Consumer Reports version of EDA tools for semiconductor design and manufacturing? I am just plain tired of hearing the hype with no real side-by-side comparison from those who would know.

What if there were a top-5 list of DFM EDA tools, a comparison chart or some such evaluation, so that individuals knew how to separate the wheat from the chaff?

I mean, isn't it fair that those of us who are forced to use the tools at least have some kind of comparison, so we aren't being snowed by salesmen?

Can I get a right on?


Thanks,
Richard Platt

 
I think it usually says in the license agreements that you can't publish benchmark results, so the numbers tend to stay inside the companies that do run comparison tests. A second reason is that performance gains can be patchy: having worked in parallel simulation, I can say that a 3x speed-up is possible, but only in certain cases, and 0.1x is equally possible in the wrong hands, so I probably wouldn't want to give you the tool for random benchmarks either.

If you want the numbers, I would post a benchmark test somewhere and ask for anonymous reports on the results. However, getting your hands on a good benchmark that someone will let you publish will be hard.
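
If such anonymous reporting ever got organized, a minimal harness might look like the sketch below. The tool command, batch script, and design name are hypothetical placeholders, not any real vendor's CLI:

```python
#!/usr/bin/env python3
"""Sketch of an anonymous benchmark report: time a tool run on a shared
design and emit only the benchmark name and timings, with no site or
vendor identity. The command and design name are placeholders."""
import json
import subprocess
import time

TOOL_CMD = ["vendor_tool", "-batch", "run_benchmark.tcl"]  # placeholder command
DESIGN = "shared_benchmark_v1"  # name of the commonly agreed benchmark design
RUNS = 3  # repeat runs to smooth out machine noise

def time_one_run() -> float:
    """Run the tool once and return its wall-clock time in seconds."""
    start = time.perf_counter()
    subprocess.run(TOOL_CMD, check=True, capture_output=True)
    return time.perf_counter() - start

# Report only the benchmark name and the best timing across repeats.
timings = [time_one_run() for _ in range(RUNS)]
print(json.dumps({
    "benchmark": DESIGN,
    "runs": RUNS,
    "best_wall_seconds": round(min(timings), 2),
}))
```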

Try DeepChip.com for hearsay.
 
Hey Simguru - I can see that what you've just written comes from someone who actually knows how such a benchmark would need to be set up, and done independently, so as not to pervert the outcomes and favor one EDA provider over another. I don't think I need permission, as I don't have an EDA license just yet, but I am very interested in the information and the knowledge behind it.

I absolutely could provide such a location, as well as publish the results so that they are available to the world, free of charge. Your help in this would be much appreciated, if not you writing it outright. It is something we could do for the industry, and I would be happy to assist in the effort in any way that I can. I am not bound by such egregious, anti-customer licensing agreements. So how about this: let's connect on LinkedIn and discuss specifics at our leisure? What do you think?

Thank you so much for your suggestion and proposal.

Best
Richard Platt - Managing Partner
Work: 503.421.9391
Email: richard.platt@sig-hq.com
http://www.linkedin.com/in/richardplatt
Skype: richard.platt101
 
Wasn't that why Cooley started deepchip.com in the first place? What could you offer that he is not already doing?
 
Mr. Eaton, if I thought that the comparison was self-evident and easy to understand, then I would agree with you. Having conducted multiple benchmarking efforts for Intel and a number of other businesses, managing consultants, software programs, and platforms, and having to review those efforts and their results with senior managers, all I can say is that the DeepChip.com approach doesn't meet the need here.

Please see this link for the kinds of standards that people are expecting: http://www.orau.gov/pbm/presentation/kendall.pdf

All the best,
Richard
 
Tiana Rahaga
I understand your point. However, I am afraid your question will never get an objective and relevant answer. With tools changing from one version to the next, such a survey is unsustainable; otherwise one would have to spend all one's time benchmarking. Maybe there is an opportunity to create a new business here? At any given time, one tool is weak for this and better for that, and on average the competition tries to fill the gap. Maybe the main concern is the business model, which should allow EDA vendors and design and silicon manufacturers to work in a win-win manner.
 
Philippe Faes
As you know, the license agreements of all big EDA tools prohibit the publication of benchmark results. Since we can't publish any results, the only thing we can do is share the benchmarks themselves and run them on an evaluation version for internal use. Any pointers to serious sets of benchmarks will be highly appreciated.
 
Esa Tiiliharju
I realized a 65-nm CMOS chip in 2009, and back then it seemed that the Cadence design kit integration was very good. So good, actually, that I have since understood Cadence has (or had) a near monopoly in IC foundry design kits. On the other hand, Mentor verification tools (Calibre) were needed to complete the design, so I guess they were the leader in that area.

I think effectiveness is also tied to what kind of design is being done; ours was pure analog research work. I imagine the digital guys would come up with a different set of tools?
 
John Beaudoin
As I remember, built within the contract language are clauses that prohibit such benchmark data from public release. Any company doing such a comparison and releasing the information publicly would be in violation and would soon be a defendant in a large litigation case.

Also, companies doing benchmarks on their own invest valuable resources to do them. Why would they share such information and help their competitors? In the deep process development submarket, there is no mutual benefit to such a benchmark like there is in the commodity submarkets of EDA (such as PCB design or HDL simulation).
 
Bill Fuchs
Having a completely independent third party analyze the usefulness and effectiveness of EDA tools would be a wonderful asset to have. Unfortunately, as I discovered back in the 1990s, this is a very impractical scenario, as many of the tools are too expensive for a third party to acquire and test. Moreover, developing a viable test bench would take considerable time and expense. Furthermore, many EDA vendors would be reluctant to have their tools tested side by side with their competitors'.

This would be an excellent area of research for EDAC. Get users to contribute models and designs that would comprise the test bench, and get the EDA tool vendors to provide permanent multi-platform licenses. Make the test results a purchased report that has no bias toward any particular vendor or design type. The usefulness to end users would be invaluable, and it would reveal which tools are mostly hype and which can really handle the tough jobs. It should be done like a Road & Track car report, measuring each tool against a given set of consistent benchmarks.
 
EDAC is made up of member companies, and none of those member companies wants to provide benchmark data showing that a competitor has a faster or better EDA tool.
 
Dear Tiana,
The inputs and feedback I've received from others, as well as my own experience w/ EDA and DFM/A tools, do not concur w/ your assertions. I do agree w/ you that there is a business model issue here for the EDA vendors, and it is ripe for being disrupted.

And that would be in the best interest of the user / OEM / ODM marketplace.

Yet I have not seen EDA tool vendors behave in a manner that welcomes such attempts at providing assistance and feedback toward a win-win; it is not in their DNA, from what I've seen. I have even attempted to address this issue directly in the past, in particular w/ Mentor. I don't appreciate such obtuseness when I am representing a large OEM and the vendor does not act in a responsive manner, out of short-term financial short-sightedness, even when they would gain a long-term benefit, especially when it comes to being a good supplier. As mentioned earlier, I've managed these kinds of relationships before, and part of what I do is help the vendor/supplier become a better partner to their clients.

Thank you so much for your comments and feedback.
-Richard
 
Philippe, that is part of what we're discussing here now. My suggestion is to keep throwing in intelligent inputs and feedback, and to keep asking the tough questions that we would want answered collectively as a group of concerned users.
 
Hey John, I am not constrained by the vendors' contract language, which I suspect is why I was originally asked to help tackle this issue a couple of years back. The issue resurfaced for me recently in a different fashion: my research into the limitations of EDA and DFM tools suggests that the tools are constraining the designs and manufacturability all of our companies face as we move toward tighter and more demanding product and process envelopes.

So it is in all of our best interests to get much crisper visibility into the EDA and DFM tool vendors' products, as we all risk product launches, profitability, time to $, etc., because of the lack of visibility into tool capability. Collectively, we are not being done any favors by the EDA and DFM tool vendor community; regardless of the contracts they've managed to get everyone to sign, they need to serve us as we ask and need to be served, not the other way around.

-Richard
 
Hey Bill, keep on providing the inputs, bud, and we will get there. Either we create one as a group here w/ Daniel Nenni's help, as it is in the best interests of the community, or we do it on LinkedIn or elsewhere, or, as you mention, a 3rd party gets enlisted into our cause. And I can assure you of this much, Bill: if I am involved w/ the 3rd party that does this, the EDA and DFM tool vendors are not going to be happy. They are, in my professional opinion, one of the worst examples of a poor vendor/supplier, and like I've mentioned before, they don't know how to behave appropriately.
 
Mitch Alsup
It really depends on what you are trying to design {highest possible performance, high performance with low design resources, pretty low power with low design resources, lowest possible power, or just soft compiled IP}.

For highest possible performance the design tools (ahem) blow wind.
For lowest possible power, the design tools (ahem again) blow wind.
For soft IP and low design resources, the design tools are great.
 
All very good points, Mitch, and the benchmarking should take that into consideration as part of the effort, for the reasons you mention. Different customers have different needs; no one size fits all. Totally agree w/ ya, bud.
 
Jeremy Birch
You can of course understand why vendors try not to allow such benchmarking to be released to the public. But there are other issues that muddy the water here:
1) Rarely will a customer have the time, expertise, or motivation to really benchmark all possible candidates, so any individual benchmark will be unrepresentative of the tool space.
2) Tools have sweet-spots, so a given set of benchmark data might find the sweet-spot of one vendor's tools and not find the sweet-spot of another's. Results released for this benchmark would therefore be unrepresentative of the problem space.
3) Benchmarks are often won or lost based on the prejudices of the customer, the talent of the field AEs, the amount of core developer time devoted to winning that benchmark, etc. These will not be the same for the next guy who tries the tool, so the experience will differ.
4) Often customers have very little interest in telling their competitors which tool flow works best for their type of design, as it is likely to only help those competitors.

Even if all benchmark results were open, you would need some common way of characterising the data so that the reader could see which ones matched their own type of design, and the results would need to be aggregated in some way to smooth over the sweet-spot issues so that some common trend was visible. Not an easy task.
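
To make the aggregation idea concrete, here is a minimal sketch of one common approach: a geometric mean of per-benchmark speedups within each design category, so that a single sweet-spot outlier cannot dominate the summary. All tool comparisons, categories, and numbers below are made up for illustration:

```python
"""Sketch: aggregate benchmark results with a per-category geometric mean.
Categories and speedup numbers are invented for illustration only --
they are not real benchmark data for any vendor's tools."""
from math import prod

# Speedup of hypothetical "tool A" vs. "tool B" on each benchmark,
# grouped by design type.
results = {
    "analog":  [1.4, 0.9, 1.1],
    "digital": [3.0, 0.8, 1.2, 0.1],  # one sweet-spot win (3.0), one bad loss (0.1)
}

def geomean(xs: list[float]) -> float:
    """Geometric mean damps the influence of extreme outliers."""
    return prod(xs) ** (1.0 / len(xs))

for category, speedups in results.items():
    print(f"{category}: geomean speedup {geomean(speedups):.2f} "
          f"over {len(speedups)} benchmarks")
```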

We could of course just have competitions, i.e. each vendor is given common data and does the best they can with it. This does indeed happen in some areas, but how representative the results are is dubious.
 