ClioSoft has been working with the leading cloud computing providers for a while now, running experiments on various EDA cloud architectures. One example was a project with Google that I previously wrote about in the blog post For EDA Users: The Cloud Should Not Be Just a Compute Farm. Since then, ClioSoft has also teamed up with Amazon Web Services (AWS) to show examples and discuss best practices for designing in the cloud. This information was shared in a webinar on Thursday, October 17th, 2019. You can sign up to view the replay of that webinar here.
All of us have heard about the advantages of on-demand computing, and several EDA companies now offer licensing models to accommodate it. However, there are multiple ways to architect EDA solutions in the cloud, and it is important to understand the trade-offs among them. Design data management tools, such as those from ClioSoft, provide additional benefits to cloud architectures, but there is no "one size fits all" when it comes to implementing yours. In fact, at a high level, there are at least two dimensions to the architectural choices: the tool architecture and the data architecture.
Tool architecture in a cloud environment describes where the tools will run, and today that applies even to interactive tools. Cloud services offer ever-decreasing latencies, and since full-motion video can be rendered over the internet, rendering interactive EDA tools remotely should not be a problem. To work optimally, however, we need the correct hardware for each tool. We also need to understand EDA tool workloads: how many resources, for how long, and at what point in the design flow?
Data architecture is also critical to your efficiency and cost. You need to decide where to keep each type of data, but modern solutions involve much more than that, including caching. You also want to consider persistent storage in the cloud. Where are the master copies of each type of data (e.g., library data, design data, simulation results)? Where are the caches? There are many decisions, and depending on your tools, it may be difficult to change your architectural choices later. The benefits are tremendous, but you also want to be as correct as possible in your initial implementation. To do that, you need information on all the optimization parameters under your control on AWS: Amazon EC2 instance types, operating system optimization, networking, storage, and kernel virtual memory. There is a lot to learn about and control. Do you know what an AMI (Amazon Machine Image) is?
That information is exactly what ClioSoft, in addition to its design data management solutions, has been preparing for its customers. The information shared in the previously mentioned blog was quite helpful, and ClioSoft has now followed it up with this webinar collaboration with AWS. Of course, ClioSoft is in the AWS Partner Network.
Speaking for AWS in the webinar is David Pellerin, Head of Worldwide Business Development. Dave has an interesting background. He has been with Amazon for more than seven years and has worked in a variety of fields, including accelerated and reconfigurable computing, data center and cloud services, HPC software development tools, field-programmable gate arrays, financial computing, life sciences, and health IT. Dave has also authored several books related to programming and design, including VHDL Made Easy. Clearly, he understands EDA, too.
Also speaking in the webinar is Karim Khalfan, VP of Application Engineering at ClioSoft. I have known Karim for a very long time, and I appreciate that he not only has a deep understanding of design data management but also has a knack for making these complex issues easy to understand. Combined with Dave's experience as a textbook author, I think everyone will be able to learn a lot from this webinar.