
Kathryn Kranen Interview in San Jose Mercury News
by Paul McLellan on 03-07-2012 at 1:22 pm

There is an interview in the San Jose Mercury News with Kathryn Kranen, Jasper's CEO. Of course the Mercury is a general newspaper and can't expect most of its readership to have a clue what EDA is, never mind formal verification. It's the same problem we all have when we try to explain to our families just what we do. Steve Johnson does his best to understand.

Q: Say a company designs a new chip for an automobile. How would your formal-verification software prevent the design from having a glitch that blows a car fuse, for example?
A: What we do is all mathematical. We can just say there is a rule, or it’s called a property, that this fuse should not blow. And then the software will automatically reverse engineer what conditions could cause that to fail. It tells you if you were driving in reverse while you moved your seat and shifted into park, the fuse will blow. It’s very reliable. It solves problems about connections inside the chip.
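
To make the idea concrete, here is a minimal sketch of how such a "rule" (property) might be written as a SystemVerilog assertion for a formal tool to prove or disprove. The module and signal names (fuse_rules, fuse_blown) are invented for illustration; they are not from the interview or from Jasper's products.

```systemverilog
// Hypothetical example: the requirement "this fuse should not blow"
// expressed as a SystemVerilog assertion. A formal tool either proves
// the property holds in every reachable state or returns a counterexample
// trace (e.g. reverse gear plus seat motion plus shifting into park).
module fuse_rules (
  input logic clk,
  input logic rst_n,
  input logic fuse_blown   // driven by the design logic being verified
);
  property p_fuse_never_blows;
    @(posedge clk) disable iff (!rst_n) !fuse_blown;
  endproperty

  a_fuse_never_blows: assert property (p_fuse_never_blows);
endmodule
```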

I knew that Kathryn and her husband Kevin both have the initial "K". I didn't know their kids do too. I imagine lots of mail addressed to "K. Kranen" could cause some fun.

The article is here.


Clock Domain Crossing (CDC): Survey Says
by Paul McLellan on 03-06-2012 at 11:30 pm

I had no idea that there was a clock domain crossing (CDC) LinkedIn group, but indeed there is. Richard Brabant has set up a survey to see which tools people are using.


The graph is somewhat confusing since, for example, Cadence Conformal is currently at zero but has a significant-looking bar. But far and away the market leader (in this very limited "market") is Atrenta's SpyGlass product, with Real Intent a comfortable second.

If you click over to the survey and vote then you can see the current numbers since the graph I have is presumably already out of date.

Details of SpyGlass CDC, Atrenta’s clock domain crossing verification tool, are here (or click on the banner at the bottom).
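
For readers new to the topic, the sketch below shows the classic single-bit hazard that CDC tools like these look for, together with the standard two-flop synchronizer fix. It is a generic illustration with invented names, not an excerpt from any of the surveyed tools.

```systemverilog
// A signal crossing into clk_dst's domain without synchronization can be
// sampled while changing and go metastable. The conventional fix is two
// back-to-back flops in the destination domain; CDC tools flag crossings
// that lack a structure like this.
module two_flop_sync (
  input  logic clk_dst,    // destination-domain clock
  input  logic rst_n,
  input  logic async_in,   // single-bit signal from another clock domain
  output logic sync_out    // safe to use in the clk_dst domain
);
  logic meta;              // first stage, may go metastable briefly

  always_ff @(posedge clk_dst or negedge rst_n) begin
    if (!rst_n) begin
      meta     <= 1'b0;
      sync_out <= 1'b0;
    end else begin
      meta     <= async_in;
      sync_out <= meta;    // second flop filters out metastability
    end
  end
endmodule
```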


IC Custom IP Blocks – EM and IR Drop Effects
by Daniel Payne on 03-06-2012 at 5:33 pm

Designing custom IP blocks is a challenge at the transistor level, and I wanted to learn what the recommended methodology and EDA tool flow is at Synopsys. They have a webinar that you can register for, and it takes 30 minutes to hear what they have to say, or you can read a white paper. If you cannot spare that much time, then my summary should answer some of your initial questions in about 10 minutes.
Continue reading “IC Custom IP Blocks – EM and IR Drop Effects”


Test the Synopsys VIP offensive and try the quiz
by Eric Esteve on 03-06-2012 at 11:33 am

I recently blogged about Synopsys' offensive in the Verification IP market. Has Synopsys launched yet another new product, or announced another acquisition? Either would be a serious topic to blog about, but today's post is closer to gaming than market analysis. Sometimes it's good to have fun, even if the topic is serious! In fact, Synopsys has launched a quiz whose questions are centered on verification and protocols: Ethernet, PCI Express, AXI4… I tried it, and I realized that I was certainly missing some know-how about these protocols… Do you want to check your level of VIP knowledge? Just go here!

The histogram above exhibits a near-perfect Gaussian distribution, showing that the average player gets about 50% of the answers right. Or, if you prefer, that half of the people who have tried the quiz have given a wrong answer at least once every two questions… If you plan to take the quiz, maybe you should first review the protocol standards listed in the picture below, as well as the Discovery product brief. And keep in mind that the Discovery-related questions were created to highlight the advantages of the product (see my previous blog, or the list below); this may help you answer those questions correctly.

  • Synopsys Discovery VIP speeds and simplifies verification of the most complex system-on-chip (SoC) designs.
  • Synopsys Discovery VIP offers greater performance, debug and coverage management features, ease-of-use and ease-of-integration for complex SoCs.
  • Synopsys Discovery VIP is written entirely in SystemVerilog, includes native support for UVM, VMM, and OVM, and is compatible with all related verification environments (a generic sketch of such a UVM environment follows this list).
  • Synopsys Discovery VIP supports all major simulators.
  • Included with Discovery VIP, Protocol Analyzer enables engineers to quickly understand, identify and debug protocols in their designs.
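
As a generic illustration of the "written in SystemVerilog with native UVM support" point above, here is a minimal UVM environment skeleton. The class names (protocol_agent, soc_env) are placeholders I made up; they are not Discovery VIP's actual classes, which you would get from the product documentation.

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

// Placeholder standing in for an agent class that a VIP would supply.
class protocol_agent extends uvm_agent;
  `uvm_component_utils(protocol_agent)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

// A SystemVerilog VIP typically drops into a UVM environment like this.
class soc_env extends uvm_env;
  `uvm_component_utils(soc_env)

  protocol_agent agent;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // The UVM factory lets a vendor-supplied agent be swapped in
    // without restructuring the testbench.
    agent = protocol_agent::type_id::create("agent", this);
  endfunction
endclass
```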

How did I score on the quiz? That's a very good question! To answer it, I would say that I spend more time surfing the web for information, and blogging, than carefully reading the various protocol standards documents. I hope that's a fair answer!

From Eric Esteve, IPnest


Designing ARM Powered High Performance SoCs on 28nm and 20nm!
by Daniel Nenni on 03-06-2012 at 9:17 am


Last week I had an interesting meeting with GLOBALFOUNDRIES executives Kevin Meyer and Mojy Chian. It certainly seems that GFI has turned a corner! I will be in Dresden next week for DATE 2012 and will also visit the GFI fab there. 28nm and 20nm are on track, so expect an aggressive implementation plan from GFI this year.
Continue reading “Designing ARM Powered High Performance SoCs on 28nm and 20nm!”


OpenAccess DB – Productivity and Beyond!
by Pawan Fangaria on 03-05-2012 at 10:00 pm

Watching the developments in the EDA and semiconductor industry, it is apparent that we remain fragmented unless pushed, mostly for business reasons, to adopt a common standard. Foundries dictate the rules that designs must follow, and EDA tools incorporate them. Design companies also need to work with tools from different vendors. As a result, various exchange formats appeared at various levels (layer, device, block, design and so on), first to define and satisfy the rules and second to exchange information between different tools. Yet the industry needed a common database for tighter integration and interoperability between tools, one that eliminates data translation between formats. OpenAccess is that open standard database, owned and distributed by SI2; it finally arrived after a wait of some 8-10 years.

There are multiple advantages to using the OpenAccess database. I will not go into the details of its data exchange, capacity and performance benefits, but rather into how it tremendously improves productivity in design and tool development, and the robustness of the flow. SI2 provides the database along with its access APIs, so any tool developed in-house or by a third party on this database can be tightly integrated, enabling a robust design flow based on multiple tools all working on the same database. This also eliminates multiple data translations through various formats, which in turn eliminates unwanted errors and troubleshooting overhead. While talking to Brady Logan (Director of Business Development at SpringSoft), I was shown a demo of the Laker environment, which reflects the true use and advantage of OpenAccess.

The Laker (from SpringSoft) and Calibre (from Mentor Graphics) real-time use model enables the two tools to run concurrently on the same database, providing users with real-time, signoff-quality feedback on edits and DRC fixing. The use model's elimination of stream in and stream out has shown a proven 10X productivity gain over the older methodology. Brady tells me that this use model has proven value at advanced nodes, where edits and fixes can create numerous violations of complex conditional rules. Obviously, all of the time-consuming streaming cycles during edit-verify-fix iterations go away.

The overall flow in the Laker system based on OpenAccess looks like this –

Profitability – Imagine the amount of resources spent by every company on maintaining every format and its translation to other formats and databases, plus the increase in design cycles due to those formats. If all that expense could be saved, it would definitely improve the bottom line of the whole industry, and the resources could be deployed on other top-end challenges. OpenAccess enables saving that unnecessary cost. It must be noted, though, that Cadence invested significantly in the making of this database.

Further Extensions – In one of my recent articles, I talked about standardizing design-level analog constraints. If the OpenAccess database can be extended to include these constraints, then analog designs, even at the level of schematics and layouts, can be shared with other parts of the design and with tools from different vendors on the same database, easing integration, editing and verification of the complete design. Another good extension would be for 3D-IC, to assimilate designs from various sources, specifically information about TSVs, partitioning across different planes, power distribution networks and stress (thermal and mechanical) analysis, with tools for different uses working on the same database. For power distribution, the CPF (Common Power Format) and UPF (Unified Power Format) formats are available, but they first need to be unified into a single format for greater interoperability.


By Pawan Kumar Fangaria
EDA/Semiconductor professional and Business consultant
Email: Pawan_fangaria@yahoo.com


Not me. Who owns IP quality?
by Paul McLellan on 03-05-2012 at 4:32 pm

Now that the dominant approach to building an SoC is to get IP from a number of sources and assemble it into a chip, the issue of IP quality is more and more critical. A chip won’t work if the IP doesn’t work, but it is quite difficult to verify this because the SoC design team is not intimately familiar with the IP blocks since nobody on the team designed them.

At DATE next week in Dresden there is a panel session on just this topic, moderated by Gary Smith. It takes place from 13:15 to 14:15 (or 1.15pm to 2.15pm for Americans) in the Exhibition Theater.

Participating on the panel are:

  • Fahim Rahim, director of engineering at Atrenta in Grenoble
  • Simon Butler, CEO of Methodics in San Francisco
  • Gabriele Saucier, president of D&R in Grenoble
  • Andreas Bruning, director of the technology office of ZDMI in Dresden
  • Gerd Teepe, director of design enablement for GlobalFoundries in Dresden

While there are many tools available to help verify, debug, assemble and otherwise manipulate IP, there's a distinct lack of a solid design data management system to address the specific needs of SoC designers. As a result, IP often suffers from a bad rap regarding quality. Users blame providers, and tool vendors and CAD managers are often caught in the middle, trying to put together solutions that track changes, use models and offer some degree of version control. Complicating matters is that the term "IP quality" means different things to different people: is it 1) the functional correctness of the IP – does it work the way it is supposed to (i.e. bug free); or 2) is it defined by its ability to do what is expected with respect to design parameters – power, timing, area, etc.?

The panel will discuss what needs to be done to improve the design environment from the perspective of all the players.

And if you are at DATE in Dresden, there is an interesting piece of "design re-use" worth a visit: the Frauenkirche, destroyed by bombing in 1945. The first time I went to Dresden it was still a ruin, but it has since been completely rebuilt. The original was built in 1726-43, and it has been rebuilt using the original plans and many of the original stones (you can tell the old from the new because the old ones are charred). In 2003 it was half rebuilt when I took the second photo. It reopened in 2005, 60 years after it collapsed. Wikipedia page here.


Atrenta/TSMC Soft-IP Alliance: 10 companies make the grade
by Paul McLellan on 03-05-2012 at 7:30 am

Last May, Atrenta and TSMC announced the Soft-IP Alliance Program, which uses SpyGlass and a subset of its GuideWare reference methodology to implement TSMC's IP quality assessment program. TSMC requires all soft-IP providers to reach a minimum level of completeness before their IP is listed on TSMC Online. Since TSMC is so dominant in the foundry business right now (Global struggling with process, Intel talking the talk but not yet really walking the walk, UMC…whatever happened to them anyway?), getting approved and listed with TSMC is extremely important.

Atrenta put everything needed to meet TSMC’s requirements in an IP Handoff Kit. Under the hood this uses SpyGlass’s RTL analysis suite to check for syntax and semantic correctness, simulation-synthesis mismatches, connectivity rules, clock domain crossings, test coverage, timing constraints and…lots more.
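
As one illustrative example (mine, not Atrenta's) of the kind of problem these RTL checks catch, consider a simulation-synthesis mismatch caused by an incomplete sensitivity list: simulation only re-evaluates the block when one input changes, while the synthesized hardware responds to both.

```systemverilog
// Classic lint finding: the sensitivity list omits 'b', so RTL simulation
// and the synthesized gate-level netlist can disagree. The fix is
// always @(a or b), or better, always @* / always_comb.
module mismatch_example (
  input  wire a,
  input  wire b,
  output reg  y
);
  always @(a) begin
    y = a & b;
  end
endmodule
```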

Suk Lee of TSMC (my successor at running IC marketing when we were both at Cadence) sees this as measurably improving IP quality. Of course TSMC isn't directly responsible for IP quality, but if IP fails and chips don't go into production, TSMC doesn't make any money. Anyway, ten companies have now jumped through all the hoops and qualified their IP for inclusion in the TSMC 9000 IP library.

The companies in this initial program are a veritable who’s who of the IP world (with the notable exceptions of ARM and Synopsys). In alphabetical order so as not to offend anyone:

  • Arteris (NoC)
  • CEVA (DSP cores)
  • Chips&Media (video IP)
  • Digital Media Professionals (graphics IP)
  • Imagination Technologies (GPU cores)
  • Intrinsic-ID (security IP)
  • MIPS Technologies (CPU cores)
  • Sonics (NoC)
  • Tensilica (reconfigurable processors and cores)
  • Vivante (GPU cores)

Now that the dominant way to build an SoC is through assembling IP, the issue of IP quality is a huge problem, and a mixture of tools, methodologies, standards and certification is for sure the way to address it.


Verdi’s 3rd Symphony
by Paul McLellan on 03-05-2012 at 7:00 am

The first version of the debug platform Verdi (then called Debussy) dates back to 1996, over 15 years ago. The second version was released in 2002. And now SpringSoft is releasing the third version, Verdi³, which is a completely new generation. A tool environment like Verdi seems to need to be completely refreshed about every 5 or 6 years to take account of the changes in the scale of design and the issues that have become important.

The motivation for creating Verdi³ is to fit in with how engineers on today's design teams work. The debug tasks vary widely depending on the company's methodology and flow, the attributes of the design, what needs to be done, and the engineer's job function, experience and style. There are three themes in the changes to the platform:

  • new user-interface and personalization capabilities
  • open platform for interoperability and customization
  • new infrastructure for performance and capacity improvements

Personalization means tailoring the appearance and layout of the tool to the way that you want to work to accomplish the task at hand.


Customization is the capability to change or add to the functions available, such as adding automation to perform a repetitive task, or integrating proprietary tools into the environment. Of course there is also functionality available from third parties, not to mention the VIA website announced last year. The extensibility is built on top of VIA (Verdi Interoperability Apps), making it easy to plug in in-house tools, add items to menus, create hot-keys and so forth. For example, above, the environment has been configured for power analysis.

The new user-interface is more modern and cleaner. Gone are the overlapping windows with information regularly buried out of sight. Instead the interface is now tiled so everything is within sight. Plus there are some big incremental capabilities such as being able to have multiple source-code windows open at the same time (the older version of Verdi only allowed one). A powerful search capability makes finding anything easy. Above is a comparison of the old user interface (on the left) and the new (on the right).

The basic infrastructure has been refreshed. A new parser lifts restrictions from the old one, is multi-threaded and generally faster. The FSDB (simulation data) has been compressed by 30% compared to the previous version, and also reads twice as fast on a typical 4-core configuration.

So, Verdi³ is a rebuild of Verdi from the ground up, with major developments in productivity, a much richer capability for customization, and higher performance and capacity.

Verdi³ is in general beta, which means all existing customers have access today and can go and download the beta release. Official first customer ship is in April.

If you are just interested in finding out more, the Verdi³ product page is here.