
EDA Clearing Clouds or Cloudy Computing?

Beyond providing a plethora of catchy titles and vivid imagery, the term "cloud computing" is in need of an identity change, if not permanent retirement. For one thing, it does not mean anything: clouds do not compute. The meme has run its course, but the underlying concept has yet to unfold to its full potential, for reasons listed below, covering both why it has such potential and why that potential remains unrealized. The promise of this emerging discipline is enormous: a lot of cycles, a lot less overhead, a lot more utility, a lot of results, a lot sooner. For the purposes of this post, let us rename cloud computing "pervasive on-demand utility computing" and examine the interesting angle of delivering EDA tools through this mechanism.


I have written about this topic before. What has made this computing model attractive is the continuing decrease in hardware and servicing costs of IT infrastructure when applied at large scale, the improvement of secure access and hosting, and the strides made in addressing software-as-a-service requirements from a workflow and invoicing standpoint. A seminal Berkeley paper published in 2009 defined the computing paradigms and the associated economics and benefits of this utility computing model.

The compelling proposition of ‘EDA in the cloud’ includes the following considerations:

  • Increasing design complexity requires large computing resources that work better when distributed over many servers, especially at key steps (regression tests, logical and physical verification, and so on).

  • Peak usage can be handled easily by paying only for the tools and resources used, and only for the duration used, without worrying about hardware infrastructure deployment, upgrades and maintenance.

  • Concerns about security have largely been addressed through encryption and secure access.

  • Data integrity, version control, backup, disaster recovery and multiple-site collaboration mandate location independent computing and information content storage.

  • Flexibility to use best-of-breed point tools versus one-stop shopping with captive flows.

  • Computing requirements are increasing so quickly that individual organizations' ability to upgrade and maintain infrastructure is being challenged. The cycle of upkeep includes managing growing storage, CPU, memory and license needs, with support personnel in CAD and IT required by providers and users alike; applying scale in-house may not make economic sense when surge demands exist but are infrequent.

  • Standardized reference flows are making it possible to apply scripted, homogenized design techniques and solutions in a design-factory mode, where scripted jobs are sent off to run and examined later, without incurring transaction costs for viewing and analysis.

  • Foundries are increasingly becoming a natural host for design IP and a repository for the key process and manufacturing information that goes along with design content, such as design kits, third-party IP and waivers of process rules.

  • Start-ups may be able to afford the use of expensive tools, and so will medium and large companies, which could restructure usage for project activity-based accounting, measuring ROI and value propositions on a more granular basis instead of spreading EDA costs ‘peanut-butter style’ with no way to know what was used, what worked, or what was even abused.
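
The peak-usage economics above can be sketched with a toy cost model. All figures here (server counts, rates, burst durations) are hypothetical illustrations, not actual EDA or cloud pricing:

```python
# Toy cost comparison: owned servers vs. on-demand compute.
# All numbers are hypothetical; the point is the shape of the trade-off.

def owned_cost(peak_servers, cost_per_server_year):
    # Owned capacity must be provisioned for the worst burst, idle or not.
    return peak_servers * cost_per_server_year

def on_demand_cost(server_hours_used, rate_per_hour):
    # On-demand bills only for hours actually consumed.
    return server_hours_used * rate_per_hour

# A design team that bursts to 500 servers for two 1-week regression
# pushes per year, but averages only 40 servers the rest of the time.
burst_hours = 2 * 7 * 24 * 500                 # 168,000 server-hours in bursts
baseline_hours = 40 * (365 * 24 - 2 * 7 * 24)  # steady-state usage
used = burst_hours + baseline_hours            # 504,960 server-hours total

print(owned_cost(500, 2000))                   # provision for peak: 1000000
print(round(on_demand_cost(used, 0.50)))       # pay per hour used: 252480
```

Under bursty demand the owned fleet sits mostly idle, which is exactly the "surge demands are existent but not frequent" case where scale-out on demand wins.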
The challenges for on-demand computing are:

  • Disruption to a key traditional EDA transaction model, which relies mainly on multi-year time-based licensing with all-you-can-eat incentives, one tool provider maximizing its share of increasingly challenged and limited budgets that cannot keep up with the rising cost of development, thus constraining either productivity or choice even when alternatives may be better for some focused tasks. Clayton Christensen's Innovator's Dilemma holds that disruptive technologies can make successful companies fail if they continue doing what made them successful in the first place. This is a call to action of sorts for all stakeholders.

  • Concern that EDA development will shrink if perceived returns are no longer projected to grow. One might counter that growth is already challenged in EDA, and that moving to on-demand computing could in fact energize the industry: new tools and solution approaches, possible new entrants, or extended use by existing customers who find that more can be done when capital and operational expenses are incurred on an as-needed basis, rather than sunk into a tool or server that is used some of the time, all of the time, or in some cases not at all but is there because it was bundled. Not to mention the planning flexibility afforded by on-demand resourcing.

  • Intellectual property security keeps coming up as an issue, even though technology has largely addressed it with tagging, encryption, provision of binaries, and access control and monitoring.

  • Pricing models have to come down, and they do seem to be scaling down, driven by key infrastructure providers and operators. As it stands, many departments can build a case for continuing to build their own infrastructure, even though they would not mind trying public cloud implementations if the price were right.

  • Job security is another factor fueling uncertainty and doubt. Concern about impacts on CAD departments and IT organizations is sometimes voiced, more often feared. That restructuring is inevitable does not mean, though, that jobs will disappear. The thriving software infrastructure industry is testament that job growth can follow when novel solutions are applied to the EDA industry. One might even say that EDA is missing out on all the innovation taking place in the social media and web 3.0 activity currently occurring.

  • Standardization of transaction models and a homogenized look and feel for the end user, so the experience is consistent, systemic and predictable. The EDA industry needs to step up and collaborate here, instead of each vendor deploying cloud constructs in its own individualized way, which could end up less revolutionary and more biased toward its own strengths rather than enabling the whole ecosystem in a cohesive manner.

  • One real challenge is the variety of unique use models that end users and departments have. One EDA executive correctly pointed out that it is hard to get two departments to agree on what to use, how to do things and where to do them, let alone getting the whole industry or ecosystem to concur. The way to address this is a top-down imperative at the corporate level and concerted industry action to coordinate best practices, models and transactions, while companies preserve their own proprietary pricing and strategies about what to build, how to create the code, and who their audiences are intended to be.

I was very encouraged by the cloud panel session at the latest DAC in San Diego. There were the beginnings of a realization among EDA vendors, start-ups, users and foundries that something should be done. The language has started shifting from denial and reasons not to do anything toward articulating a wish to get there and voicing a commitment to address the challenges ahead. Jacques Lacan, a French psychoanalyst, put it neatly: ‘language points to a lack’, which loosely translated means ‘if you are talking about it, you are not getting it’. To be sure, I do not want to oversimplify the challenges, which are real, but I do think we have to start tackling this with more bravado, eagerness and open-mindedness, and with fewer sub-optimizing half measures.

What do you think? Is it hype, myth or a new reality? Inquiring minds want to know.
 
In the clouds?

A couple of half-baked thoughts:

Hardware and infrastructure costs appear relatively small compared with software and support. For moderate-sized companies the major benefit would be a "true turnkey" solution.

Perhaps the "ideal" cloud provider would be closely aligned with the foundry? This would relieve fabless houses of the hassle of kit installation and maintenance. The foundry is probably also in a good position to negotiate a good deal with the EDA providers.
There could also be advantage for any foundry that invests in extended models, interoperability, specialised tools, etc. - as IP leakage becomes more controllable.
 
Great comments, George. On your first point, you are correct that hardware and infrastructure costs are small compared to software and maintenance, but the challenge is the latency between upgrading and using the infrastructure, unless you over-provision, which can be costly. On the second point, foundries are absolutely a natural fit for providing the launchpad, with pre-installed kits, configured environments, secure IP portals and pre-negotiated foundry rates from EDA suppliers. One incarnation could be foundries bundling this as a service to customers. This is how our semiconductor industry got its start: foundries providing design services, tools and manufacturing in a one-stop shop. Back to the future? By the way, EDA can still prosper in this mode of fewer but larger buyers of tools.

Back to the future?

"foundries providing design services, tools and manufacturing in one stop shop." Back to the future? By the way EDA can still prosper in this mode of fewer but larger buyers of tools.
Actually, this model is where some of us started, albeit the "foundry" was an IDM's in-house fab, the "cloud" was a shared "supercomputer", and the data was exchanged in (tedious) text format over a modem. This was of course updated using intranets, and thence, at least partially, transferred to secure access over the web; however, this model remains pretty much restricted to IDMs. More recently, since the onset of general broadband access, it should have become a practical and convenient possibility for foundry suppliers and so for fabless users.
So why has it not happened (yet)?
Given that this would also potentially remove some of the day-to-day support that foundries in practice provide multiple small users, my guess would be contract-conservatism on the part of EDA companies. Am I missing the critical issue, perhaps?
 
@George: There were initially some valid reasons not to do it in a public cloud setting (screen response time was sluggish for interactive work, security was not guaranteed, design collaboration was not practical, server scaling was expensive, and the application designs simply did not require the compute capability). What probably does remain is some concern among large EDA vendors about short-term revenue impact (a possibility, but the long-term outlook may be worth the hassle). I think contract-conservatism may be less of an issue with smaller EDA companies. My personal gut feel is that there is actually unrealized revenue to be made by all of EDA if transacting over the cloud goes mainstream: more use, by more users, of most tools. What may go unused will be sub-par tools, but that is both a threat and an opportunity for all tools to improve. This reminds me of when Hollywood thought DVD/VHS sales would reduce movie theater revenue. What happened, of course, is that revenue rose through the new channel; even if ticket sales were impacted, Hollywood made out well in the final analysis.
 
server scaling expensive??

I wouldn't argue most of this, although I would have thought that the higher the cost of computing the more cost benefit would accrue to sharing the resource.
Of course, hardware is no longer the dominant feature for designers of modest systems, but I suspect that the relative cost of hardware required for finishing (full chip analysis, layout and post-layout simulation) is actually continuing to increase as design size increases, so even the hardware motivation will remain in place for high-end users.
 

a.a

Why the cloud

EDA clouds make sense, but it will take time for customers to agree to relinquish control over the software. For instance, as we speak our company is about to roll out a cloud-based lithography simulator. It can process more data faster than any other commercial optical simulator. I can prove that. But...

I know that promoting the system will be very difficult. Sending design data over the internet seems to give fits to the very people who bank and buy online every day. The tidal wave will come, but it is not here yet.
 
@a.a: True enough. The same reluctance appeared when internal IT departments started requiring design data to be moved from the engineer's desktop to a central file system inside the corporation. At some point it made sense to just do it, and engineers adapted. The challenge is to highlight the positives and not make it feel like a take-away.
 

Daniel Nenni

SemiWiki is in a private cloud that is optimized for MySQL. They handle software upgrades, bugs, tuning, and other DBA stuff for people like me who would rather be doing other things.

Cloud also offers the EDA industry the opportunity to move toward a success-based business model rather than the cancer-like "all-you-can-eat" licensing we have today.


Unfortunately I do not see EDA cloud success on the horizon. I will ping the Synopsys cloud guy David Hsu and see if he will give us an update. Maybe James Colgan, CEO of Xuropa can chime in as well.

Cheers,

D.A.N.
 

Paul McLellan

One of the things that I see as a big challenge is the volume of data in EDA. If we are just talking about a customer setting up their design flow using something like Amazon AWS then fine. But if we are looking at something more like the Salesforce.com model whereby different EDA vendors have cloud offerings, then a mixed flow (and all flows are mixed) requires the data to be moved between different sites. Salesforce.com would work a lot less smoothly if every day or two you had to move the entire CRM database to Oracle and back in order to get access to some feature Salesforce.com lacked. And that database is small compared to the polygon level of a modern chip.

Paul
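
The data-volume objection above can be made concrete with a back-of-envelope transfer calculation. The database size and link speed below are illustrative assumptions, not measurements of any actual design or site:

```python
# Back-of-envelope: time to move a design database between cloud sites.
# Sizes and link speeds are illustrative assumptions, not measurements.

def transfer_hours(size_gb, link_mbps):
    # Convert GB to bits, divide by link rate in bits/s, convert to hours.
    bits = size_gb * 8 * 1e9
    return bits / (link_mbps * 1e6) / 3600

# A hypothetical 500 GB polygon-level database over a 100 Mbit/s link:
print(round(transfer_hours(500, 100), 1))   # 11.1 hours per move
```

At roughly half a day per one-way move, shuttling the full database between vendors' clouds every day or two is clearly impractical, which is the crux of the mixed-flow problem.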
 
The lure of cloud EDA is precisely the removal of worry about upkeep, allowing unencumbered focus on design realization. Synopsys is dipping its toes in the cloud by starting to offer verification-only services, though through its own managed access, and it is using Amazon infrastructure. Xuropa has a good angle: enabling EDA tool evaluations and letting EDA vendors offload the initial evaluation logistics to them. The evaluation angle for end customers is in fact one of the additional attractions of cloud EDA, which I forgot to mention in the write-up above.
 
@Paul: It would indeed be cumbersome to move files/databases around, and that is the opportunity at hand: create a consistent environment that allows multi-tenancy for customers while keeping all the files out there (transporting the environment from local multi-tool to virtual multi-tool). Salesforce and Facebook work because there is common code owned by one entity. For this to work in EDA, that one entity needs to be a federation of companies willing to inter-operate in a consistent fashion (think Android-style: many companies standardizing on protocols, interfaces and platforms for transactions, while keeping their own styles and implementations). I view this as transacting Walmart-style, one place to shop with many vendors putting their wares on the shelf, with billing and invoicing PayPal- or phone-bill-style. Barring consensus, one company could take the lead and open it up for others to interface in a consistent fashion. The value-add could also include canned automated flows. The GUI would involve running the desired tools with a dashboard-style dynamic summary of which jobs, tools and scripts are ongoing by project, user, customer and EDA vendor. EDA vendors would not be able to see other vendors' code or tools; customers would not be able to see other customers' work spaces. But customers would be able to invoke multiple tool vendors, knowing what recurring costs go with what feature of each EDA tool. It sounds complicated, but if we put our minds to it, it can be done, and there is a tremendous opportunity to redefine how designs are done. This may also mean that an independent non-EDA software infrastructure provider or a foundry outfit is a natural host for this launchpad. Vendors configure their space, users configure their working space, all on top of an effectively infinite compute environment. The devil may be in the details, but we need to start getting there :).

ScottClark

I see a lot of good dialog here, and just wanted to add a couple of comments:

I couldn’t agree more that the term “Cloud” is overused and has no clear definition, but if we think about what the industry went through in the ’80s with the movement toward foundries, most of the same dynamics are present with respect to datacenters for those same companies.

Camille calls out a key point that I think is the real driver for EDA's interest in cloud, and what I believe will be the inevitable follow-through on cloud adoption by EDA companies: the capacity and complexity of the infrastructure is outpacing the individual company's ability to stay ahead of it. This is for a couple of reasons. One has to do with how IT is traditionally budgeted and funded; the other is that capacity and complexity are multiplicative cost drivers for infrastructure development, creating non-linear cost impact. This is why we should probably hold off on the expectation that cloud will be a cost reduction. I think it will be a cost reduction over what it would cost to implement properly internally, but I don’t think many companies have a grasp of what that cost should be (it is not an extrapolation of history).

Hardware, while only a fraction of the EDA license cost, is still a significant number (anywhere from 10%-20% of the EDA budget), and an improperly implemented or insufficient infrastructure will cause EDA spend to go up (at 5X-10X what the infrastructure spend should have been).

We should also probably tackle only one hill at a time: have cloud address only the infrastructure portions today, without taking on EDA license models or EDA methodology issues. There is a lot to gain just in doing the infrastructure more efficiently and in a manner that achieves greater economies of scale.

We should use a Community Cloud approach (purpose-built clouds specifically targeted at EDA): clouds that address the security concerns, are implemented to handle the latency-sensitive aspects of interactive design, and work in a way the EDA infrastructure business already understands (make the cloud match the business; avoid making the business change to use the cloud).

Foundries are a great option for EDA Community Clouds; there is even some historical precedent there, as George Storm pointed out.

It should be about enabling engineers by unleashing technology, not controlling cost by managing technology (managing and controlling strategies invariably end up costing more in the big picture by limiting technology use).

I would be curious to hear other opinions…

-swc
 
@ScottClark: Thanks for chiming in. I do agree that we should walk before we run, and not skip the exercise altogether. My view is that the EDA industry needs to be more pro-active and less defensive about turf or difficulties and a good start would be talking about what the road map should be for the industry as a whole. And the dialogue needs to include users and foundries and infrastructure providers. I am all for unleashing technology and creativity and making the cloud match the business of product realization (with some tweaks to EDA transacting along the way). It sounds like we need a conference to tackle the new way forward.
 

Daniel Payne

Moderator
I see EDA companies dipping their toes in the cloud:

Synopsys - HSPICE for Circuit Simulation, VCS for Verilog simulation

Nimbic - 3D Field solver

Cadence - Used Amazon AWS with a few customers

Xuropa - EDA tool evaluations in the cloud

PDTi - Register management

Tabula - 3PLD devices, tools in the cloud

Altium - Cloud-based design services
 
Thanks Daniel. It is a start. Now we need to help those stars become part of a well behaved galaxy (pick any tool a la carte, use it from the cloud). I also immediately invoke the decree of no more word plays on this galactic theme (nebula, comet, collapsing star, you know all the analogies that come to mind). On second thought, let us hear the imagery everyone.
 
Cloud EDA still in cloud cuckoo land?

On second thought, let us hear the imagery everyone.
How about "Head in clouds? Best keep feet on ground" (Only slightly reworded from my high-school report).
By which I mean that cloud support will fall apart if data transfer between tools is not properly supported, so the whole thing is only of any use if integrated as part of a complete process.
In the IC design business, I think this can mean one of the following:
. Foundries provide PDK and applications via the cloud (initially perhaps a differentiating factor - ultimately might become as motherhood as providing a PDK is at the present)
. Third parties (neither current EDA company in present model nor foundry) provide as service
. Internal cloud support (Large IDMs and fabless are already doing this for themselves in all but name)
. Fabless companies' consortium (in my dreams)
. EDA companies change business model (this one looks like head in clouds at present...)
Feel free to add..
 

ScottClark

George,
what we should see emerge from the cloud space as it matures is "Community Clouds", which correlate to your consortium concept. Really these are just purpose-built clouds that have all the dynamics required by EDA: sufficient bandwidth so data transfer is not really a problem, access to specialty systems (FPGAs, large-memory systems, etc.), bin-1 processors, data centers located in or near the appropriate geographic regions (where the semiconductor companies are), and so on. This is a brand new model, but it seems to be heading in the right direction...

-swc
 

LinkedIn

The benefits of cloud computing are obvious. We are quickly joining the ranks of companies who think it is the future. Certainly, when I did digital implementation it was a constant battle between predicting our license use and making sure we had the physical resources to use the licenses. Cloud computing solves both problems, reduces waste, and gives everyone an excuse to do a bit of housekeeping. Posted by Rosemary Francis
 