Our latest book has finally been published! A PDF version of “Prototypical II – The Practice of FPGA Prototyping for SoC Design” is now available in the SemiWiki book section. The first book “Prototypical – The Emergence of FPGA Prototyping for SoC Design” was published in 2016 and a lot has happened since then so it was high time for an update.
In this book, we look at the history of FPGA-based prototyping and the leading providers – S2C, Synopsys, Cadence, and Mentor. We begin with how the need for co-verification evolved with chip complexity, where FPGAs got their start in verification, and why ASIC design benefits from prototyping technology.
My co-author this time is Steve Walters, a good friend from back in the Virage Logic days. Steve was an early employee of Quickturn, a pioneer in prototyping and emulation that was later acquired by Cadence, so Steve absolutely knows where the emulation bodies are buried.
Steve and I updated the first section and completely rewrote the second half. Here is the table of contents and a clip from the beginning of the book. For the greater good of the semiconductor industry! Comments are welcome; enjoy!
Part I – Evolution of Design Verification Techniques
The Art of the “Start”
A Few Thousand Transistors
Microprocessors and ASICs
The Birth of Programmable Logic
Pre-Silicon Becomes a Thing
Positioning: The Battle for Your Mind
First Pentium Emulation
Enabling Exploration and Integration
Part II – FPGA Prototyping for Different Design Stages
Design Exploration
IP Development
Hardware Verification
System Validation
Software Development
Compatibility Testing
The Art of the “Start”
The semiconductor industry revolves around the “start.” Chip design starts lead to more EDA tool purchases, more wafer starts, and eventually to more product shipments. Product roadmaps develop to extend shipments by integrating new features, improving performance, reducing power, and reducing area – higher levels of functional integration and what is referred to as “improved PPA.” Successful products lead to additional capital expenditures, stimulating more chip designs and more wafer starts. If all goes well, and there are many things that can go wrong between the MRD and the market, this cycle continues. And in keeping with good capitalist intentions, this frenetic cycle drives increased design complexity and design productivity to feed the global appetite for economic growth.
Chip designs have mutated from relatively simple to vastly complex and expensive, and the silicon technology to fabricate chips has advanced through rapid innovation, from feature sizes measured in tens of microns to feature sizes measured in nanometers. Once visualized as ones and zeroes in a table, chip functions now must comprehend the execution of powerful operating systems, application software, massive amounts of data, and heretofore incomprehensibly minuscule latencies. Continued semiconductor industry growth depends on delivering ever more complex chip designs, co-verified with specialized system software – in less time and with relatively fewer mistakes.
New chip wafer fabs now cost billions of dollars, with production capacities in the tens of thousands of wafers per month. In May of 2020, TSMC announced that it would build a new wafer fab in Arizona. The total project spending for the planned new 5-nm wafer fab, including capital expenditures, is expected to be approximately $12B from 2021 to 2029, and the fab is expected to have the capacity to produce 20,000 wafers per month. [1]
One malevolent block of logic within a chip design can cause very expensive wafers to become scrap. If a flaw manages to escape, only showing itself at a critical moment in the hands of a customer, it can set off a public relations storm calling into question a firm’s hard-earned reputation as a chip supplier.
Chip design verification is like quality: it asymptotically approaches perfection but never quite achieves 100%. Coverage may reach a percentage close enough to 100% to relegate fault escapes to the category of “outlier” – hopefully of minimal consequence. Only through real-world use in the hands of many customers will every combination of stimuli be applied to every chip pin and every response be known. So, chip designers do their best to use the latest cocktail of verification techniques and tools, and EDA companies continually innovate new verification tools, design flows, and pre-verified silicon IP in a valiant effort to reach the elusive goal of chip design verification perfection.
The stakes are very high today for advanced silicon nodes, where mask sets can cost tens of millions of dollars and schedule slips that delay new product roll-outs can cost millions more in marketing. With the stakes so high for large, sophisticated chips, no prudent leader would dare neglect investing in semiconductor process quality. Foundries such as GlobalFoundries, Intel, Powerchip, Samsung, SMIC, TSMC, UMC, and others have designed their entire businesses around producing high-quality silicon in volume at competitive costs for their customers.
So, chip design teams struggle to contain verification costs and adhere to schedules. The 2020 Wilson Report found that only about 32 percent of today’s chip design projects achieve first-silicon success, and 68 percent of IC/ASIC projects run behind schedule. [2] A prevailing attitude is that the composite best efforts of skilled designers using advanced EDA design tools should result in a good outcome. Reusing known-good blocks, from a previous design or from a reliable IP source, is a long-standing engineering best practice for reducing risk and speeding up the design cycle. Any team that has experienced a chip design “stop” or “delay” knows the agony of uncertainty and fear that accompanies these experiences. Many stories exist of an insidious error slipping through design verification undetected and putting a chip design, a job, and sometimes an entire company at risk. The price of hardware and software verification escapes can dwarf all other product investments, and ultimately diminish a hard-earned industry leadership reputation.
Enter FPGA-based prototyping for chip design verification. A robust verification plan employs proven tests for IP blocks and exercises the fully integrated design running actual software (co-verification) – which is beyond the reach of software simulation tools alone. Hardware emulation tools are highly capable and faster than software simulation, but they are expensive and often out of reach for many design teams. FPGA-based prototyping tools are scalable, cost-effective for almost any design, offer capable debug visibility, and are well suited to hardware/software co-verification.
Also Read:
StarFive Surpasses Development Goal with the Prodigy Rapid Prototyping System from S2C
CEO Interview: Toshio Nakama of S2C EDA
S2C Raises the Bar for High Capacity, High-Performance FPGA Prototyping