Camille Kokozaki
Guest
Other than providing a plethora of catchy titles and vivid imagery, the term cloud computing is in need of an identity change, if not permanent retirement. For one thing, it does not mean anything: clouds do not compute. The meme has run its course, but the underlying concept has yet to unfold to its full potential, for reasons listed below, both the reasons why it has potential and the reasons why it has not yet fully unfolded. The potential of this emerging discipline is enormous and should mean a lot: a lot of cycles, a lot less overhead, a lot more utility, a lot of results, a lot sooner. For the purposes of this post, let us rename cloud computing 'pervasive on-demand utility computing' and examine the interesting angle of delivering EDA tools through this mechanism.
I have written about this topic before. What has made this computing model attractive is the continuing decrease in hardware and servicing costs of IT infrastructure when applied at large scale, the improvement in secure access and hosting, and the strides made in addressing software-as-a-service requirements from a workflow and invoicing standpoint. A seminal Berkeley paper published in 2009 defined the computing paradigms and the economics and benefits of this utility computing model.
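To make that economics argument concrete, here is a back-of-the-envelope own-versus-rent comparison in the spirit of the Berkeley analysis. Every number in it (server cost, cloud rate, utilization) is a made-up assumption for illustration, not real pricing:

```python
# Back-of-the-envelope own-vs-rent comparison. Every number below is a
# made-up assumption for illustration, not vendor pricing.

PEAK_SERVERS = 1000        # servers needed during tapeout regression bursts
AVG_UTILIZATION = 0.20     # fraction of peak capacity busy on a typical day
SERVER_COST_3YR = 9_000.0  # assumed purchase + power + admin per server, 3 yr
CLOUD_RATE_PER_HR = 1.00   # assumed on-demand price per server-hour
HOURS_3YR = 3 * 365 * 24

# Owning: you pay for peak capacity whether or not it is used.
own_cost = PEAK_SERVERS * SERVER_COST_3YR

# Renting: you pay only for the server-hours actually consumed.
rent_cost = PEAK_SERVERS * AVG_UTILIZATION * HOURS_3YR * CLOUD_RATE_PER_HR

breakeven = SERVER_COST_3YR / (HOURS_3YR * CLOUD_RATE_PER_HR)
print(f"own:  ${own_cost:,.0f}")    # $9,000,000 under these assumptions
print(f"rent: ${rent_cost:,.0f}")   # $5,256,000 under these assumptions
print(f"renting wins while average utilization stays below {breakeven:.0%}")
```

The point is not the particular numbers but the shape of the tradeoff: the lower and spikier a design team's average utilization, the stronger the case for on-demand capacity.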
The ‘EDA in the cloud’ compelling proposition includes the following considerations:
- Increasing design complexity requires large computing resources that work best when distributed over many servers, especially at key steps (regression tests, logical and physical verification, and so on).
- Peak usage can be handled by paying only for the needed tools and resources, and only for the duration used, without worrying about hardware infrastructure deployment, upgrades, and maintenance.
- Concerns about security have largely been addressed through encryption and secure access.
- Data integrity, version control, backup, disaster recovery, and multi-site collaboration all mandate location-independent computing and content storage.
- Flexibility in using best-of-breed point tools versus one-stop shopping with captive flows.
- Computing requirements are growing so quickly that individual organizations' ability to upgrade and maintain infrastructure is being challenged. The cycle of upkeep includes managing growing storage, CPU, memory, and license needs, with CAD and IT support personnel required by providers and users alike; building for scale may not make economic sense when surge demands exist but are infrequent.
- Standardized reference flows are making it possible to apply scripted, homogenized design techniques in a design-factory mode, where scripted jobs are sent off to run and examined later, without incurring transaction costs for viewing and analysis (a minimal sketch of such a scripted flow follows this list).
- Foundries are increasingly becoming a natural host for design IP and a repository for the key process and manufacturing information that accompanies design content, such as design kits, third-party IP, and process-rule waivers.
- Start-ups may become able to afford expensive tools, and so will medium and large companies, which could restructure tool use around project activity-based accounting, measuring ROI and value propositions on a granular basis instead of spreading EDA costs 'peanut-butter style' with no way of knowing what was used, what worked, or what was even abused.
- Disruption of the traditional EDA transaction model, which relies mainly on multi-year time-based licensing with all-you-can-eat incentives, one tool provider maximizing its share of increasingly challenged and limited budgets that cannot keep up with the rising cost of development, constraining either productivity or choice when alternatives may be better for some focused tasks. Clayton Christensen's Innovator's Dilemma holds that disruptive technologies can make successful companies fail if they keep doing what made them successful in the first place. This is a call to action of sorts for all stakeholders.
- Concern that EDA development will shrink if returns are no longer projected to grow. One might counter that growth is already challenged in EDA, and that moving to on-demand computing could in fact energize the industry with new tools and solution approaches, with possible new entrants, or with extended use by existing customers who find that more can be done when capital and operating expenses are incurred on an as-needed basis, rather than sunk into a tool or server that is used some of the time, all of the time, or in some cases not at all but is there because it was bundled. Not to mention the planning flexibility afforded by on-demand resourcing.
- Intellectual property security keeps coming up as an issue, even though technology has largely addressed it with tagging, encryption, provision of binaries, and access control and monitoring (a minimal encrypt-before-upload sketch appears after this list).
- Pricing models have to come down, and they do seem to be scaling down, driven by key infrastructure providers and operators. As it stands, many departments can still build a case for continuing to run their own infrastructure, even though they would not mind trying public cloud implementations if the price is right.
- Job security is another factor fueling uncertainty and doubt. Concern about CAD departments and IT organizations being impacted is sometimes voiced and more often feared. The fact that restructuring is inevitable does not mean that jobs will disappear; the thriving software infrastructure industry is testament that job growth can follow when novel solutions are applied, and the same can hold in EDA. One might even say that the EDA industry is missing out on the innovation taking place in social media and Web 3.0 today.
- Standardization of transaction models and a homogenized look and feel for the end user, so the experience is coherent, systemic, and predictable. Here the EDA industry needs to step up and collaborate, instead of each vendor deploying cloud constructs its own way, which could prove less revolutionary and more biased toward individual strengths than toward enabling the whole ecosystem in a cohesive manner.
- One real challenge is the variety of unique use models across end users and departments. One EDA executive correctly pointed out that it is hard to get two departments to agree on what to use, how to do things, and where, let alone the whole industry or ecosystem. The way to address this is a top-down imperative at the corporate level and concerted industry action to coordinate best practices, models, and transactions, while vendors preserve their own proprietary pricing and their strategies about what to build, how to write the code, and who the intended audiences are.
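As promised above, here is a minimal sketch of what a scripted design-factory submission might look like. The batch endpoint URL, job fields, and tool, script, and project names are all assumptions for illustration; no real vendor API is implied:

```python
# Minimal sketch of a scripted "design factory" flow: regression jobs are
# packaged, tagged with a project code for activity-based accounting, and
# handed to a hypothetical cloud batch endpoint. The URL, job fields, and
# tool/script names are all assumptions, not any vendor's actual API.
import json
import urllib.request

BATCH_URL = "https://batch.example-eda-cloud.com/v1/jobs"  # hypothetical

def submit(tool, script, project):
    """Submit one scripted job; usage is billed to `project`."""
    job = {
        "tool": tool,          # licensed per hour, on demand
        "script": script,      # a step from the standardized reference flow
        "project": project,    # granular cost center, not peanut butter
        "collect": ["logs", "reports"],  # fetched later for offline review
    }
    req = urllib.request.Request(
        BATCH_URL,
        data=json.dumps(job).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["job_id"]

# Fire off the nightly regressions and examine the reports in the morning.
for block in ["cpu_core", "ddr_phy", "pcie_ctrl"]:
    print(block, "->", submit("sim", f"run_regress.tcl {block}", "chipA"))
```

Note how the project tag does double duty: it routes the job and it attributes the cost, which is exactly the granular, activity-based accounting argued for above.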
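And here is a minimal sketch of the encrypt-before-upload pattern behind the IP-security point, assuming the third-party Python `cryptography` package; the file names are illustrative:

```python
# Sketch of encrypt-before-upload: design data is encrypted on premises and
# only ciphertext ever reaches the cloud host, while the key stays in the
# design team's own vault. Uses the third-party `cryptography` package
# (pip install cryptography); file names are illustrative.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # generated once, kept on premises, never uploaded
cipher = Fernet(key)

with open("chip_top.def", "rb") as f:       # illustrative design file
    ciphertext = cipher.encrypt(f.read())

with open("chip_top.def.enc", "wb") as f:   # this is what gets uploaded
    f.write(ciphertext)

# After results come back, only the key holder can recover the plaintext.
with open("chip_top.def.enc", "rb") as f:
    recovered = cipher.decrypt(f.read())
```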
What do you think? Is it hype, myth, or a new reality? Inquiring minds want to know.