TSMC held its Open Innovation Platform Forum the other week, on September 13th. Each year, the companies exhibiting at this event highlight their latest technology. One of the most interesting presentations I saw during the event was from Solido. In recent years they have produced a number of groundbreaking Machine Learning-based tools for Monte Carlo, Statistical PVT, cell optimization, library characterization, and other critical verification tasks.
Simply put, their aim has been to reduce the amount of brute-force simulation needed to validate designs before silicon production. Their latest offerings have come from their Machine Learning Labs, and the product they introduced at OIP this year is another in this line. It is called PVTMC Verifier, which is a mouthful, but it means exactly what it says: they are combining analysis across PVT corners with Monte Carlo to fully account for all meaningful operating conditions and states.
This is significant because the largest growth segments in semiconductor design all need this kind of thorough analysis to ensure silicon success. IoT and mobile are pushing toward ever-lower operating voltages, which can push margins to the edge. Automotive has extremely high reliability requirements combined with harsh environmental conditions. High-performance computing demands the tightest timing, often at high operating temperatures as well.
With unlimited time and computing resources, the easiest approach would be to perform Monte Carlo simulations at every single PVT corner. This would ensure that working silicon could be achieved within the desired sigma. To adapt to the realities of chip-level verification, the typical flow instead starts by running a large number of PVT corners to find the worst cases. These worst-case corners are then run through Monte Carlo simulations. While this saves significant time, it still suffers from the possibility that the worst-case PVT corner at nominal may not be the worst case at the target sigma. In that event a true worst case may be overlooked, possibly leading to an unexpected failure.
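To make that flow (and its pitfall) concrete, here is a toy sketch in Python. The "simulator," its coefficients, the corner list, and the variation model are all invented for illustration; they stand in for real SPICE runs and do not reflect any actual PDK or tool.

```python
import random

random.seed(0)

# Toy "simulation": delay of a cell as a function of a PVT corner and a
# per-sample process variation term.  All coefficients are invented.
CORNERS = [(p, v, t) for p in ("ss", "tt", "ff")
                     for v in (0.72, 0.80, 0.88)
                     for t in (-40, 25, 125)]

PROCESS_SHIFT = {"ss": 1.10, "tt": 1.00, "ff": 0.92}

def simulate_delay(corner, sample_shift=0.0):
    """Stand-in for a SPICE run: slower at ss, low voltage, high temp."""
    p, v, t = corner
    base = PROCESS_SHIFT[p] * (0.8 / v) * (1 + 0.0004 * (t - 25))
    return base * (1 + sample_shift)

# Step 1: sweep all PVT corners at nominal (no statistical variation).
worst_corner = max(CORNERS, key=simulate_delay)

# Step 2: run Monte Carlo only at that nominal worst-case corner.
samples = [random.gauss(0.0, 0.03) for _ in range(1000)]
mc_delays = [simulate_delay(worst_corner, s) for s in samples]
worst_mc = max(mc_delays)
```

In this toy model the statistical spread scales uniformly across corners, so the nominal worst case stays the worst case; the article's point is that real circuits offer no such guarantee, which is exactly the gap the new tool targets.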
Until now, this problem has been partially addressed through the use of statistical corners. This approach runs Monte Carlo first to obtain sigma distributions, then applies these to PVT corners to see where failures are likely. The catch is that a given Monte Carlo sample may sit at a different probability under other PVT conditions, so the probability profile can change dramatically for some sets of samples as they move across PVT corners.
Solido is applying Machine Learning to solve this difficult issue. They have observed that Monte Carlo samples behave in several distinct ways relative to each other as they are run at different PVT corners. Machine Learning can be used to characterize the relative ordering of the samples from a small number of simulation runs, which yields enough information to predict how the probability distributions will shift between PVT corners.
In a sequence of three cycles of simulations, using a small subset of the possible corners and samples, it is possible to predict the worst cases for a design. This takes the number of simulations required down from tens of thousands to somewhere between 500 and 2,000. It works up to 5 sigma by default but can go higher if needed. The reliability of the results can be verified at runtime, so there is no doubt about their integrity.
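To give a feel for how a model trained on a few runs can predict worst cases at unsimulated corners, here is a small Python sketch. It is emphatically not Solido's algorithm (which is proprietary); it is a generic least-squares illustration under an assumed toy model where each sample's metric at any corner is a combination of two latent variation terms. Every name and coefficient is invented.

```python
import random

random.seed(1)

N_SAMPLES, N_CORNERS = 200, 12

# Toy model: each Monte Carlo sample has two latent variation terms, and
# each corner weights them differently -- so sample ordering shifts
# from corner to corner, as described in the article.
samples = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N_SAMPLES)]
corner_w = [(random.uniform(0.2, 1.0), random.uniform(0.2, 1.0))
            for _ in range(N_CORNERS)]

def simulate(i, c):
    """Stand-in for a SPICE run: metric of sample i at corner c."""
    (a, b), (wa, wb) = samples[i], corner_w[c]
    return wa * a + wb * b

# Cycle 1: simulate every sample at just two "anchor" corners.
anchors = (0, 1)
X = [[simulate(i, c) for c in anchors] for i in range(N_SAMPLES)]

# Cycle 2: for each remaining corner, simulate only 20 training samples
# and fit metric(c) ~ w0*metric(anchor0) + w1*metric(anchor1) by
# least squares (2x2 normal equations, no intercept).
def fit_and_predict(c, train_ids):
    s00 = sum(X[i][0] * X[i][0] for i in train_ids)
    s01 = sum(X[i][0] * X[i][1] for i in train_ids)
    s11 = sum(X[i][1] * X[i][1] for i in train_ids)
    y0 = sum(X[i][0] * simulate(i, c) for i in train_ids)
    y1 = sum(X[i][1] * simulate(i, c) for i in train_ids)
    det = s00 * s11 - s01 * s01
    w0 = (s11 * y0 - s01 * y1) / det
    w1 = (s00 * y1 - s01 * y0) / det
    return [w0 * X[i][0] + w1 * X[i][1] for i in range(N_SAMPLES)]

train_ids = random.sample(range(N_SAMPLES), 20)
predicted_worst = {}
for c in range(2, N_CORNERS):
    pred = fit_and_predict(c, train_ids)
    predicted_worst[c] = max(range(N_SAMPLES), key=lambda i: pred[i])

# Cycle 3: confirm each predicted worst case with a real simulation.
for c, i in predicted_worst.items():
    true_worst = max(range(N_SAMPLES), key=lambda j: simulate(j, c))
    assert i == true_worst  # prediction found the actual worst sample
```

Under this toy model the sketch needs 2 × 200 anchor runs plus 20 runs at each of the other 10 corners, about 600 simulations versus 2,400 for the brute-force sweep; the real tool's 500-to-2,000-run figure reflects the same kind of savings at much larger scale.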
I have been saying that Machine Learning will be a revolutionary force in EDA. Solido has made great headway in applying it to numerical problems. We can also expect to see more in the area of visual pattern recognition; for instance, Mentor already has some technology it is using for DRC. For more information on how Solido is applying the Machine Learning technology developed in its Machine Learning Labs, please visit their website.