
Verification 3.0 Holds Its First Innovation Summit
by Randy Smith on 03-26-2019 at 5:00 am

Last week I attended the first Verification 3.0 Innovation Summit, held at Levi’s Stadium in Santa Clara, along with about 90 other interested engineers and former engineers (meaning marketing and sales people, like me). The event had a great vibe, exuding an energy level I have not felt at an EDA event in years. The attendees included longtime EDA veterans as well as a few newcomers. Perhaps more important, the list of participating companies was quite long, including speakers from Avery, Breker, Metrics, TV&S, Imperas, OneSpin, Vayavya, Agnisys, Concept, Methodics, Vtool, and Verifyter. Blue Pearl, Willamette HDL, and XtremeEDA were also supporting the event. Quite a collection of verification experts. All these companies gave presentations, spoke with attendees at a tabletop gathering at the end of the event (with great food!), or did both.

EDA industry luminary, Jim Hogan, who has been a driving force behind the Verification 3.0 effort, kicked off the event. Jim is involved in several of the companies supporting this effort as a consultant, investor, and board member. Unfortunately, due to traffic, I missed Jim’s remarks, but we did get a chance to talk at the reception later in the evening where Jim told me, “There are some major themes that Joe outlined in his talk. It’s time to take a new approach to verification, that’s why we called it verification v3.0. We outlined this in an article last year.” Herding so many start-ups is quite a challenge, but Jim is off to a terrific start.

Next up was Joe Costello, former Cadence CEO and the “Tony Robbins of EDA”. I worked at Cadence during Joe’s tenure and his infectious smile, positive attitude, and fervent enthusiasm were clearly all still in effect. Joe laid out a clear case for the likely path of verification solutions over the next five years. He discussed the macroeconomic factors and the design trends that are driving a new approach to verification solutions and then suggested a target opportunity for the participating companies.

The first macroeconomic factor Joe mentioned is the move to cloud computing. The cloud computing market is already of the same order of magnitude as the entire semiconductor market, measured in the hundreds of billions of dollars per year. Yet most EDA companies have been slow to make use of these services. Cloud-based EDA solutions would free semiconductor designers from also needing to be experts at running their own massive compute farms. This goes hand-in-hand with the second macroeconomic factor, SaaS (software as a service). Deploying EDA tools as a service is far simpler in a cloud environment, where the use of both the hardware AND the software can be metered. This lets users pay only for the tool usage they actually consume, rather than pay for (and try to predict) the maximum capacity of licenses they might need.
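The cost contrast between the two licensing models can be sketched with some simple arithmetic. This is a hypothetical illustration only; the seat counts, fees, and hourly rate below are invented numbers, not figures from the talk.

```python
# Hypothetical comparison: peak-capacity licensing vs. metered pay-per-use (SaaS).
# All numbers are invented for illustration only.

def peak_license_cost(peak_licenses: int, annual_fee: float) -> float:
    """Traditional model: pay up front for the maximum concurrent seats you predict."""
    return peak_licenses * annual_fee

def pay_per_use_cost(license_hours_used: float, hourly_rate: float) -> float:
    """SaaS model: pay only for the metered tool-hours actually consumed."""
    return license_hours_used * hourly_rate

# A team that peaks at 100 seats during regression crunch but averages
# only 30 seats over a 2,000-hour working year.
traditional = peak_license_cost(peak_licenses=100, annual_fee=20_000.0)
saas = pay_per_use_cost(license_hours_used=30 * 2_000, hourly_rate=10.0)

print(f"traditional (peak capacity): ${traditional:,.0f}")
print(f"pay-per-use (metered):       ${saas:,.0f}")
```

The point of the sketch is not the specific ratio but the shape of the trade-off: with bursty verification workloads, the gap between peak and average demand is exactly what a metered cloud model lets you stop paying for.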

So, you might be thinking that these are just infrastructure issues, not the next algorithm or paradigm needed to solve verification problems. What I can tell you is that the biggest hurdle in semiconductor design today is the COST of verification. That cost lies in licenses and hardware, but above all in headcount. Having spent some time over the last few years helping firms with their recruiting challenges, I know for certain that there are not enough verification engineers available to meet the semiconductor industry’s current needs. So improving efficiency in verification is critical both to improving the results of verification and to reducing its costs.

Improving the efficiency of verification can also mean building more platforms that are specific to certain types of designs. Joe specifically mentioned the fledgling market for domain-specific processors, which are emerging due to the end of Moore’s Law and Dennard scaling, as well as concerns about efficiently scaling solutions. Building processors for specific applications is one way to improve the efficiency and results of designs built for specific problems. Joe cited RISC-V as an example of open processors enabling this approach. No doubt, Arm could also go down this path to some extent as well.

Which leads us to this: if you are going to have domain-specific architectures, then can’t you develop specific verification environments to aid in the design and verification of those designs? For example, why not build an environment around a specific processor (one that also supports extensions), including the IP and verification best practices specific to that application? An environment supporting ISO 26262 for functional safety? An environment knowledgeable of video/audio codec standards? The list is long. Beyond the changes this can drive in the verification industry, the opportunities in semiconductor IP are enormous.

This event was well worth attending, and if you are interested in verification, it would be well worth your time to attend the next Verification 3.0 event, whenever that happens. You can check out the website at https://verification30.com/. I heard the slides might be online next week.
