Evolution of process models, part I

by Beth Martin on 02-23-2011 at 1:15 pm

Thirty-five years ago, in 1976, the Concorde cut transatlantic flying time to 3.5 hours, Apple was founded, NASA unveiled the first space shuttle, the VHS-versus-Betamax wars began, and Barry Manilow’s “I Write the Songs” saturated the airwaves. Each of those advances, except perhaps Barry Manilow’s, was enabled by the first modern-era, high-volume production ICs.

During those years, researchers were anticipating the challenges of fabricating ICs that, according to Moore’s Law, would double in transistor count about every two years. Today, the collection of techniques for making features much smaller than the 193nm light used in photolithography is referred to as computational lithography (CL). Moving forward into double patterning and even EUV lithography, CL will continue to be a critical ingredient in the design-to-manufacturing flow. Before we get distracted by today’s lithography woes, let’s look back at the extraordinary path that brought us here.

ICs were fabricated by projection printing in the early ’70s, imaging the full wafer at 1:1 magnification. But linewidths were getting smaller, wafers were getting larger, and something had to change. As chip makers dove into the 3um process node, they got some relief from the newly introduced steppers, or step-and-repeat printing. Still, the threat of printing defects with optical lithography spurred research into modeling the interactions of light and photoresist.

In 1975, Rick Dill and his team at IBM introduced the first mathematical framework for describing the exposure and development of conventional positive-tone photoresists: the first account of lithography simulation. What is now known as the Dill Model describes the absorption of light in the resist and relates it to the development rate. His work turned lithography from an art into a science, and laid the foundation for the evolution of lithography simulation software that is indispensable to semiconductor research and development today.
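The exposure part of the Dill model is compact enough to sketch numerically. Below is a minimal illustration in Python; all parameter values (A, B, C, resist thickness, dose) are assumed placeholders for illustration, not measured resist data, and a real simulator would of course couple this to a development-rate model:

```python
import numpy as np

# Dill exposure model (illustrative parameters, not measured resist data):
#   dI/dz = -I * (A*M + B)   Beer-Lambert absorption (bleachable + base terms)
#   dM/dt = -C * I * M       first-order exposure kinetics
A = 0.8     # 1/um, bleachable absorption (assumed)
B = 0.05    # 1/um, non-bleachable absorption (assumed)
C = 0.015   # cm^2/mJ, exposure rate constant (assumed)

nz, depth = 100, 1.0          # resist thickness in um
dz = depth / nz
I0 = 1.0                      # incident intensity, mW/cm^2
dose = 100.0                  # total exposure dose, mJ/cm^2
nt = 200
dt = dose / I0 / nt           # exposure time step, s

M = np.ones(nz)               # normalized photoactive compound: 1 = unexposed

for _ in range(nt):
    # integrate intensity downward through the resist (explicit in z)
    I = np.empty(nz)
    I[0] = I0
    for k in range(1, nz):
        I[k] = I[k - 1] * (1.0 - (A * M[k - 1] + B) * dz)
    # advance the exposure kinetics (exact for constant I over dt)
    M *= np.exp(-C * I * dt)

# The resist bleaches from the top down: M ends up lowest at the surface
# and highest at the resist/substrate interface.
```

The bleaching feedback is what makes even this simple model interesting: as M drops, the resist becomes more transparent, letting light penetrate deeper as exposure proceeds.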

The University of California at Berkeley developed SAMPLE, the first lithography simulation package, and enhanced it throughout the ’80s. Several commercial technology computer-aided design (TCAD) software packages were introduced to the industry through the ’90s, all enabling increasingly accurate simulation of relatively small layout windows. As industry pioneer, gentleman-scientist, and PROLITH founder Chris Mack has described, lithography simulation allows engineers to perform virtual experiments not easily realizable in the fab, enables cost reduction by narrowing process options, and helps troubleshoot problems encountered in manufacturing. A less tangible but nonetheless invaluable benefit has been the development of “lithographic intuition” and fundamental system understanding for photoresist chemists, lithography researchers, and process development engineers. Thus patterning simulation is used for everything from design rule determination, to antireflection layer optimization, to mask defect printing assessment.

At the heart of these successful uses of simulation were the models that mathematically represent the distinct steps in the patterning sequence: aerial image formation, exposure of photoresist, post-exposure bake (PEB), develop, and pattern transfer. Increasingly sophisticated “first principles” models have been developed to describe the physics and chemistry of these processes, so that critical dimensions and 3D profiles can be accurately predicted for a variety of processes across a broad range of process variations. The quasi-rigorous, mechanistic nature of TCAD models, applied in three dimensions, implies an extremely high compute load, especially for chemically amplified systems that involve complex coupled reaction–diffusion equations. Despite steady improvements in computing power, this complexity has restricted these models to small-area simulations, on the order of tens of square microns of design layout area or less.
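The reaction–diffusion character of a chemically amplified resist during PEB can be illustrated with a toy 1D simulation. Photogenerated acid diffuses during the bake while catalytically deprotecting the polymer. Every constant below (diffusivity, rate constant, geometry) is an assumed placeholder; a production model adds quencher base, acid loss, and full 3D geometry:

```python
import numpy as np

# Toy 1D PEB reaction-diffusion model (all constants illustrative):
#   dH/dt = D * d2H/dx2       acid diffusion during bake
#   dP/dt = -k_amp * H * P    acid-catalyzed deprotection of polymer sites
nx = 200
dx = 1.0                      # nm per grid cell (assumed)
D = 5.0                       # nm^2/s acid diffusivity (assumed)
k_amp = 2.0                   # 1/s amplification rate constant (assumed)
dt = 0.4 * dx * dx / (2 * D)  # explicit time step, stable for D*dt/dx^2 < 0.5
steps = 500

x = np.arange(nx) * dx
H = np.where((x > 80) & (x < 120), 1.0, 0.0)  # acid in the exposed stripe
P = np.ones(nx)                                # polymer fully protected

for _ in range(steps):
    lap = np.zeros(nx)
    lap[1:-1] = (H[2:] - 2 * H[1:-1] + H[:-2]) / dx**2
    H = H + dt * D * lap              # diffuse acid (fixed-edge boundaries)
    P = P * np.exp(-k_amp * H * dt)   # deprotect where acid is present

# Deprotection is deepest at the center of the exposed stripe, and the
# profile is blurred beyond the original 80-120 nm exposure window.
```

Even this toy version shows why the compute load explodes: the diffusion stability limit ties the time step to the square of the grid spacing, so finer grids in 3D multiply the work in every dimension at once.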

TCAD tools evolved through the ’90s and accommodated the newly emerging chemically amplified deep ultraviolet (DUV) photoresists. At the same time, new approaches to modeling patterning systems were being developed in labs at the University of California at Berkeley and at a handful of start-up companies. By 1996, Nicolas Cobb and colleagues had created a mathematical framework for full-chip proximity correction. This work used a Sum of Coherent Systems (SOCS) approximation to the Hopkins optical model, and a simple physically based, empirically parameterized resist model. Eventually these two development paths resulted in commercial full-chip optical proximity correction (OPC) offerings from Electronic Design Automation (EDA) leaders Synopsys and Mentor Graphics. It is interesting to note that while the term “optical” proximity correction was originally proposed, it was well known that proximity effects arise from a number of other process steps. The label “PPC” was offered to more appropriately describe the general problem, but OPC had by then been established as the preferred moniker.
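The SOCS idea can be sketched in a few lines: build a transmission cross coefficient (TCC) matrix, eigendecompose it, and form the image as a weighted sum of coherent images. The 1D source points, pupil cutoff, and mask below are all assumed illustrative values, not a calibrated scanner model:

```python
import numpy as np

# SOCS sketch: decompose a toy Hopkins TCC into coherent kernels.
n = 64
x = np.arange(n)
mask = ((x // 8) % 2 == 0).astype(float)    # toy 1D line/space pattern
Mf = np.fft.fft(mask)
f = np.fft.fftfreq(n)                       # frequency grid (FFT ordering)

def pupil(freq):
    return (np.abs(freq) < 0.2).astype(float)   # ideal low-pass pupil (assumed)

# Discrete partially coherent source: (frequency offset, weight) pairs (assumed)
source_pts = [(-0.05, 0.25), (0.0, 0.5), (0.05, 0.25)]

# Hopkins TCC as a sum of rank-1 terms over the source points
TCC = np.zeros((n, n), dtype=complex)
for fs, w in source_pts:
    h = pupil(f + fs)
    TCC += w * np.outer(h, h.conj())

# SOCS: eigendecompose the Hermitian TCC, keep the dominant kernels
lam, phi = np.linalg.eigh(TCC)
order = np.argsort(lam)[::-1]
lam, phi = lam[order], phi[:, order]

# Image as a truncated sum of coherent systems. This toy TCC has rank <= 3,
# so four kernels reproduce the Hopkins image exactly; in production OPC a
# few dozen kernels approximate a far richer TCC.
I = np.zeros(n)
for k in range(4):
    field = np.fft.ifft(phi[:, k] * Mf) * n     # coherent image field
    I += lam[k].real * np.abs(field) ** 2
```

The payoff of the decomposition is speed: each kernel turns the quadratic Hopkins integral into a convolution, so a truncated kernel set makes full-chip image computation tractable.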

Demand for full-chip model-based OPC grew alongside computing capability. With some simplifying assumptions and a limited scope of prediction, these OPC tools could handle several orders of magnitude more layout area than TCAD tools. An important simplification was the reduction in problem dimensionality: a 2D contour at a single Z plane is sufficient to represent the proximity effects relevant for full-chip OPC. The third dimension, however, is becoming increasingly important in post-OPC verification, in particular for distinguishing different patterning failure modes, and is an area of active research.

Today, full-chip simulation using patterning process models is a vital step in multi-billion-dollar fab operations. The models must be accurate, predictive, easy to calibrate in the fab, and fast enough for full-chip simulation. Accuracy in the range of 1nm is already achievable, but many challenges lie ahead: modeling pattern failure, model portability, adaptation to new mask and wafer manufacturing techniques, source-mask optimization, 3D awareness, and double patterning and EUV correction, among others. These challenges, and the newest approaches to full-chip CL models, affect the ability to maintain performance and yield at advanced IC manufacturing nodes.

By John Sturtevant

Evolution of Lithography Process Models, Part II
OPC Model Accuracy and Predictability – Evolution of Lithography Process Models, Part III
Mask and Optical Models – Evolution of Lithography Process Models, Part IV

