Big data is a headline phrase that I see appear almost weekly now in my newsfeed, so it’s probably time that I start paying more attention to the growing trend because it does impact how technology-driven EDA tool flows are being used. From my last trip to DAC I recall only two companies that were really focused on system-level design, which is a ripe application area for mining and controlling big data while building an electronics-based system. Michael Munsey is an expert in system-level design and works at Dassault Systemes, so I went back to a conversation that Michael had in August about the topic of EDA and big data.
Q: As an overview, just what does Dassault have to offer?
So at Dassault Systemes we’re known as the 3D experience company. We started off as a 3D CAD company, then brought in other pieces of technology like a PLM system and some multiphysics modeling solutions, and we’ve grown over the years to encompass twelve brands across twelve different industries.
In the past couple of years we’ve been really focusing on the semiconductor industry because we’ve seen that a lot of system-level issues have already been solved in other industries that we serve, industries like transportation and mobility, automotive, and aerospace and defense. We have solutions that have worked very well for those industries for many, many years. We’re taking those solutions and creating a semiconductor solution based on this proven technology, and now we’re rolling it out to the semiconductor industry.
Q: What is happening with big data and EDA tool flows today?
The biggest challenge right now is that data is only as good as what you have available. Now we’ve seen a lot of people talking about capturing design data and performing big data analytics on it. That works, provided that you have all the data.
What we’ve seen, however, is that it’s very difficult for companies to put processes in place to capture everything that they need to capture. Design companies have multiple tools in the EDA flow chain, in different domains, and you also have system tools. You have functional verification tools, synthesis tools, physical design, custom IC design. There are a lot of engineers involved sharing a lot of information, and there are often no structured processes to actually capture all of that data, so ultimately you will not have a full set of data to get good analytics from.
The second challenge is that it’s not just about the design data and the design results. You’ve got to think about the entire ecosystem. A semiconductor company has product engineering that sets up project schedules, maintains requirements systems, and tracks defects across entire systems. You have manufacturing teams: if you’re a fabless company, then you have teams that interface with the foundry, and an IDM will have its own manufacturing information as well. So getting the true analysis that you need requires a comprehensive view of all this data to really be able to do predictive analysis on the problems that you’re trying to solve.
So the largest problem that we see is that it’s a great goal, but without the methodology and the systems in place to capture all of that data, it’s only going to be somewhat effective. What we’re focusing on at Dassault, again drawing on other industries, is that we’ve been putting design processes and manufacturing processes in place that give us a solid methodology, and we are capturing all this data now. We’re looking at ways of bringing this approach to the semiconductor industry, and we’re starting right now through a piece of our technology called requirements-driven verification. This basically captures the whole design process and verification process.
We can automate the capture of this data now, and the very first type of analysis that we’re doing on it is what we call a Decision Support System. Imagine that you’re doing physical IC design and you’re trying to close timing and close power. Typically you run different experiments; you might run ten or twelve different P&R experiments with different constraint files, making subtle tweaks to the design. You then get a bunch of results out, but it’s often like pushing on a balloon: if you improve one area, another area gets worse, and while you fix the one that got worse, the one you thought was fixed regresses. The minute that you have more than two tests to look at, the problem becomes exponentially more difficult to solve.
In our SIMULIA brand there are predictive analysis techniques that look at multiple groups of tests and analyze which inputs generated which outputs, and with what desired results. This can begin to guide you in a certain direction: it can take all your design constraint files and tell you that if you use these design constraints, coupled with other design constraints from this test, you begin to assemble a view that leads you down the right path. We’re looking first at this very specific functional process to begin to make improvements in the overall IC design process to achieve design closure.
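To make the trade-off concrete, here is a minimal sketch (not Dassault’s actual analysis, and with made-up run names and numbers) of one simple way to sift a batch of P&R experiments: keep only the runs that are Pareto-optimal on two competing metrics, timing slack and power, so an engineer looks at a handful of genuinely different trade-offs instead of a dozen mutually dominated results.

```python
# Hypothetical sketch: picking the Pareto-optimal subset of P&R experiments.
# Experiment names, slack, and power numbers are illustrative, not real tool output.

def pareto_front(experiments):
    """Keep experiments not dominated by any other run.

    A run dominates another if it has slack at least as high AND power at
    least as low, and is strictly better on at least one of the two.
    """
    front = []
    for name, slack, power in experiments:
        dominated = any(
            s2 >= slack and p2 <= power and (s2 > slack or p2 < power)
            for _, s2, p2 in experiments
        )
        if not dominated:
            front.append((name, slack, power))
    return front

runs = [
    ("run_a", 120, 450),  # best slack, highest power
    ("run_b", 80, 400),
    ("run_c", 60, 380),   # lowest power, worst slack
    ("run_d", 70, 420),   # dominated by run_b (less slack, more power)
]

for name, slack, power in pareto_front(runs):
    print(f"{name}: slack={slack}ps power={power}mW")
```

With more than two metrics the same dominance test applies unchanged, which is exactly where an automated “balloon-pushing” comparison starts paying off over eyeballing result tables.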
The very first step that we’re looking at right now is very much at the role level: providing analysis capabilities that make the data analysis problem a lot simpler and provide some intelligence at that level.
Q: What is the future of using big data inside of EDA tool flows going to look like?
The obvious next step is to break out from the role level and look across the entire development platform, and by development I mean all the way from system design planning through manufacturing. Once you’ve begun to capture enough data, you’re able to do different levels of analysis, so it moves from looking at how to make my job better to how to make the entire system design process better.
If you have a project where you already know what the design results are, then you also know who worked on the project, so you start to have a notion of how good design teams are. You also start bringing in scheduling information that allows you to make predictions of how design teams work against certain schedules. We can start bringing in issues and defects data and see how many errors certain design teams have made versus other design teams, so you start building a much larger picture for product planning. From design to manufacturing, you’re now able to really start predicting schedules at the beginning of a new design. What are my chances of achieving this schedule? Where are the problems going to be?
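As a toy illustration of the idea (my own sketch, not Dassault’s system, with invented team names and durations), a first-order schedule prediction can scale a new plan by a team’s historical slip ratio, the average of actual versus planned duration across its past projects:

```python
# Hypothetical sketch: first-order schedule prediction from historical
# project data. Teams and durations below are made up for illustration.

history = [
    # (team, planned_weeks, actual_weeks)
    ("soc_team", 40, 52),
    ("soc_team", 30, 36),
    ("ip_team",  20, 21),
    ("ip_team",  24, 26),
]

def slip_ratio(team):
    """Average actual/planned duration ratio over a team's past projects."""
    ratios = [actual / planned for t, planned, actual in history if t == team]
    return sum(ratios) / len(ratios)

def predict_weeks(team, planned_weeks):
    """Scale a new plan by the team's historical slip ratio."""
    return planned_weeks * slip_ratio(team)

# soc_team has slipped by 1.3x and 1.2x, so a 36-week plan projects to 45 weeks.
print(round(predict_weeks("soc_team", 36), 1))
```

A real system would fold in defect counts, project complexity, and staffing overlap rather than a single ratio, but even this simple scaling answers the “what are my chances of hitting this schedule” question with data instead of optimism.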
When you start looking at that data over time, we start seeing issues pop up along the way. Well, if I need extra resources now to work on this project, then what teams have done similar projects in the past? Should I apply this engineer to help fix this problem? How is that going to impact the other projects that I might start borrowing people from? So it goes from a role-based analysis to a system analysis across a whole design team, and then across multiple design teams for a whole company.
Q: On a personal note, I understand that you’re both an engineer and a musician. How did that come about?
It’s a right brain, left brain issue. You know, after spending your days doing pure analysis, analytics and thinking about design problems then everybody needs a creative outlet. Music is one of the best ways to be creative for me. I like to be as creative as possible and let the other side of my brain work. I love it.