It used to be true that a foundry or fab would create a set of DRC files, provide them to designers, and the process yield would be acceptable. If the foundry knows more about the physical implementation of IC designs, however, it can improve yield further. A better methodology uses a digital twin of the design, process and metrology steps, one that captures the bidirectional information flow between the physical synthesis tool and the fabrication steps. A Machine Learning (ML) framework handles this bidirectional flow using accurate predictive models and efficient algorithms, all running in the cloud. Ivan Kissiov of Siemens EDA presented on this topic at #59DAC, so I'll distill what I learned in this blog.
A digital twin for the fab engineer is made up of process tools, metrology tools, a virtual metrology system, and a recipe.
Models for the digital twin have to predict how new ICs will be manufactured, and how they respond to different process and tool variations. Goals for this approach are improved yield at lower cost, better process monitoring, higher fault detection, and superior process control.
Some process effects are only seen at the lot level, while others show up at the wafer level, and some only appear at the design feature level, so being able to fuse all of this data together becomes a key task. With a digital twin, a design can be extracted and combined with process and metrology models to produce a predictive process model.
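As a rough illustration of that fusion task, the sketch below joins hypothetical lot-level, wafer-level and feature-level measurements into one table keyed by lot and wafer IDs. The column names and values are invented for illustration only, not taken from the presentation.

```python
import pandas as pd

# Hypothetical lot-level data: one row per lot
lots = pd.DataFrame({
    "lot_id": ["L1", "L2"],
    "etch_chamber": ["A", "B"],
})

# Hypothetical wafer-level metrology: one row per wafer
wafers = pd.DataFrame({
    "lot_id": ["L1", "L1", "L2"],
    "wafer_id": [1, 2, 1],
    "film_thickness_nm": [51.2, 49.8, 50.5],
})

# Hypothetical feature-level measurements: one row per measured design feature
features = pd.DataFrame({
    "lot_id": ["L1", "L1", "L2"],
    "wafer_id": [1, 2, 1],
    "cd_nm": [32.1, 33.4, 31.8],   # critical dimension at a design feature
})

# Fuse the three levels into one table a predictive model can train on
fused = (features
         .merge(wafers, on=["lot_id", "wafer_id"])
         .merge(lots, on="lot_id"))
print(fused)
```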
An example data fusion flow from AMD shows how post-process measurements for feed-forward control go into the process model, and how the equipment model returns a modified recipe, along with in-situ sensors providing automatic fault detection.
Data fusion ties into machine learning for each of the fab process and metrology steps:
Delving inside the train, test and validate stage, there is data processing, feature engineering, model training, and finally model quality metrics:
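As a minimal sketch of what that inner loop might look like, the snippet below wires data processing, model training and quality metrics together with scikit-learn. The synthetic data and the choice of a random-forest regressor are my assumptions, not details from the talk.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for per-wafer process/metrology features and a yield-related target
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = X[:, 0] * 2.0 - X[:, 3] + rng.normal(scale=0.1, size=500)

# Train/test split, then data processing + model training in one pipeline
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), RandomForestRegressor(n_estimators=200, random_state=0))
model.fit(X_train, y_train)

# Model quality metrics on held-out data
pred = model.predict(X_test)
print("R^2 :", r2_score(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```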
Statistical Process Control (SPC) has been used in fabs for decades now, and with some adjustments has evolved into Advanced Process Control (APC), where Run-to-Run Control (RtR) and Fault Detection and Classification (FDC) components are shown below in a flow from Sonderman and Spanos:
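To make the RtR idea concrete, here is a toy exponentially weighted moving average (EWMA) run-to-run controller: each post-process measurement updates an offset estimate, which adjusts the recipe for the next run. The gain, target and process model are illustrative assumptions, not the scheme from Sonderman and Spanos.

```python
import numpy as np

TARGET = 50.0   # desired post-process measurement (e.g. film thickness in nm)
GAIN = 0.4      # EWMA filter weight (illustrative)

def run_to_run(recipe, measurement, offset_est):
    """Update the offset estimate from the latest measurement and
    return an adjusted recipe setting for the next run."""
    offset_est = GAIN * (measurement - recipe) + (1 - GAIN) * offset_est
    next_recipe = TARGET - offset_est
    return next_recipe, offset_est

rng = np.random.default_rng(1)
recipe, offset_est = TARGET, 0.0
for run in range(5):
    true_offset = 2.0   # hypothetical unmodeled process drift
    measurement = recipe + true_offset + rng.normal(scale=0.2)
    recipe, offset_est = run_to_run(recipe, measurement, offset_est)
    print(f"run {run}: measured {measurement:.2f} nm, next recipe {recipe:.2f}")
```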
Complex deep learning models can analyze IC fab data using Shapley value explanation to evaluate the input feature importance of a given model. During the feature engineering phase, the challenge is feature extraction from images, and Principal Component Analysis (PCA) is the feature extraction method used.
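A hedged sketch of that pairing, assuming the shap package and scikit-learn are available: flatten small synthetic images, reduce them with PCA, fit a model, and score the PCA components with Shapley values. The image size, component count and model choice are illustrative assumptions.

```python
import numpy as np
import shap
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)

# Synthetic stand-in for metrology images: 300 images of 16x16 pixels, flattened
images = rng.normal(size=(300, 16 * 16))
target = images[:, :10].sum(axis=1) + rng.normal(scale=0.1, size=300)

# Feature engineering: PCA extracts a handful of components from each image
pca = PCA(n_components=8)
X = pca.fit_transform(images)

model = GradientBoostingRegressor(random_state=0).fit(X, target)

# Shapley value explanation of input feature (PCA component) importance
explainer = shap.Explainer(model.predict, X[:100])
shap_values = explainer(X[:100])
mean_abs = np.abs(shap_values.values).mean(axis=0)
for i, v in enumerate(mean_abs):
    print(f"PCA component {i}: mean |SHAP| = {v:.3f}")
```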
Fabs have used a data mining approach for many years now, and it's the process of extracting useful information from large amounts of data. This data can help process engineers discover new patterns and gain insights for making improvements.
Machine Learning, in contrast, is the process of finding algorithms whose results improve as they learn from the data. With the ML approach, machines now learn without human intervention. The site MLOps has much detail on the virtues of using ML, and the flow used in fabs is shown below:
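As a loose sketch of that kind of automated loop, making no assumption about the actual MLOps tooling a fab would use: monitor incoming data for drift against the training baseline and trigger retraining when a threshold is crossed. The names and thresholds here are made up for illustration.

```python
import numpy as np
from sklearn.linear_model import Ridge

def drifted(baseline, incoming, threshold=0.5):
    """Flag drift when the incoming feature means move too far
    (in baseline standard deviations) from the training baseline."""
    shift = np.abs(incoming.mean(axis=0) - baseline.mean(axis=0))
    return np.any(shift > threshold * baseline.std(axis=0))

rng = np.random.default_rng(3)
X_base = rng.normal(size=(1000, 4))
y_base = X_base @ np.array([1.0, -0.5, 0.2, 0.0]) + rng.normal(scale=0.1, size=1000)
model = Ridge().fit(X_base, y_base)

# New production data arrives with a shifted distribution (simulated drift)
X_new = rng.normal(loc=0.8, size=(200, 4))
if drifted(X_base, X_new):
    # In a real MLOps flow this would kick off a new train/test/validate cycle
    print("Drift detected: scheduling model retraining")
```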
Summary
Data is king, and this truism applies to the torrents of data streaming out of fabs each minute of the day. As fabs and foundries adopt the digital twin paradigm in design, process and metrology, there is a bidirectional flow of information between the physical synthesis tool and each step the wafers take through a fab. Using a machine learning framework to create predictive models with efficient algorithms helps improve silicon yield in the fab.
Related Blogs
- Calibre, Google and AMD Talk about Surge Compute at #59DAC
- What’s New With Calibre at DAC This Year?