
A Brief History of Functional Verification
by Paul McLellan on 04-13-2014 at 3:00 pm

Usually these brief history pieces are written entirely by the SemiWiki blogger whose name is at the top. Often me, since that was how I prototyped book chapters. Well, OK, I did actually write this one, but it is completely cribbed from a presentation earlier this week by Wally Rhines, who gave a sort of keynote at the announcement of Mentor’s Enterprise Verification Platform. He is Mentor’s CEO, but you didn’t need me to tell you that, did you?

Verification started with…nothing. This was the rubylith era. You did manual design, built the chip, tested it, redesigned it, rinse and repeat. A friend of mine from that era said the worst thing was finding a bit of rubylith lying on the floor underneath where all the big sheets were hung: you had no idea whether it had come off your design, and pretty much all you could do was build chips and hope.

Then transistor-level simulation started with Ron Rohrer’s CANCER program. Ron seems to have been a child prodigy in circuit simulation, getting his PhD before most people get their bachelor’s. The algorithms then morphed into a new program, Simulation Program with Integrated Circuit Emphasis. Wait…that spells SPICE. They presented it in 1973, put it in the public domain, and within a very short time every academic department and every semiconductor company was using it and making derivatives. The basic sparse-matrix algorithms survived for a long time and allowed much larger circuits to be simulated than ever before.

But larger designs required gate-level simulation. Lots of companies provided it: Mentor, Daisy and Valid (the DMV) in particular. Mentor’s solution was Quicksim. At VLSI ours was VSIM, then TSIM.

In 1983 the DoD awarded contracts for the acronym within an acronym VHDL, which stood for VHSIC Hardware Description Language, where VHSIC was Very High Speed Integrated Circuit. There were other languages, most notably GHDL at GenRad, an outgrowth from Brunel University in the UK, where the HILO simulator in which it ran was originally developed. HILO was developed by Phil Moorby, who would go on to develop…

…Verilog, which came along in 1985 and took off once Synopsys adopted it as the language for RTL synthesis. Cadence acquired Gateway (the owner of Verilog) in 1989, and Verilog became an IEEE standard in 1995. Gordon Moore noted that “the only reason for VHDL was to force Cadence to put Verilog into the public domain.”

In those days simulation performance benefited from two things. Every year or so Intel upped the clock rate on its processors, and ModelSim (other simulators have similar numbers) improved its algorithms by a factor of 10 on top of that.

Simulators were now so fast that the big problem was creating stimulus, so the focus switched to methodologies. This, Wally reckoned, is verification 2.0. The industry, amazingly, since EDA never wants one standard when it can have three, converged on SystemVerilog. Then the base class libraries were also standardized, as UVM. As a result, the verification market has grown from $724M in 2010 to $984M in 2012, and it is probably over a billion now. The really big growth area has been emulation.

Emulation has gone from a very expensive option for a few groups in the biggest companies to mainstream. It is still not cheap, but it is so fast that apparently the most verification cycles per dollar now come from emulation, not simulation. Plus, if you are designing a 500M-gate chip, your simulation is going to take a geological timescale to complete. There is no alternative.

Chips now contain lots of software, so the focus is on simulating software and hardware together. Multiple cores, huge software loads, the need to boot iOS/Linux/Android/Windows before things even get interesting. Embedded software headcount is surging. And remember Intel giving us higher clock speeds every year? Fuhgeddaboutit. You can have lots of cores instead, which is great for DRC but not so great for simulation, where everything needs to agree on the current simulation time.
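That “everyone needs to know the time” problem is the crux of why event-driven logic simulation parallelizes poorly. Here is a minimal, hypothetical sketch in Python (the class and names are mine, not taken from any real simulator): every event sits in one queue ordered by a single global simulation time, and that ordering is a serial bottleneck no matter how many cores you have.

```python
import heapq

# Minimal sketch of the classic event-driven simulation loop.
# The point: all events are retired in order of one global simulation time,
# so the core loop is inherently serial.

class EventDrivenSim:
    def __init__(self):
        self.now = 0          # the one global notion of "current time"
        self._events = []     # heap of (time, sequence, callback)
        self._seq = 0         # tie-breaker for events at the same time

    def schedule(self, delay, callback):
        """Schedule callback to fire 'delay' time units from now."""
        heapq.heappush(self._events, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self, until):
        # The serialization point: events must be popped in global time order,
        # so everything in the design has to agree on 'now' before advancing.
        while self._events and self._events[0][0] <= until:
            self.now, _, callback = heapq.heappop(self._events)
            callback(self)

# Toy usage: a free-running clock that toggles every 5 time units.
clk = 0
def toggle(sim):
    global clk
    clk ^= 1
    sim.schedule(5, toggle)

sim = EventDrivenSim()
sim.schedule(0, toggle)
sim.run(until=50)
print("simulation time reached:", sim.now, "clk =", clk)
```

Real simulators do partition designs and synchronize time across threads, but that global ordering constraint is what caps the speedup, whereas DRC can carve the layout into largely independent regions and scale across cores much more easily.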

And formal came along somewhere during this period. It started off very esoteric: you needed a PhD to use it and it could only handle a thousand gates or so. Now it is mainstream too.

So people want all these technologies to look the same. Simulation. Emulation. Formal. FPGA prototypes. Virtual platforms. A single debugger. A single code coverage database. Single. Single. Single.

That was what Mentor announced a couple of days ago. But it is a trend. Synopsys announced Verification Compiler at SNUG, which is similar. And Cadence has been talking about a similar unification for the last year or so.

In a short time all of the big three will have a unified verification environment that combines everything without requiring a specialist in each domain (formal, emulation, coverage, software, etc.) to get things up and running. Nowadays there are only design teams, and they need to be self-sufficient.

