Enabling 3D-IC Integration
by Daniel Nenni on 07-10-2012 at 9:00 pm

As 2D device scaling becomes impractical, 3D-IC integration is emerging as the natural evolution of semiconductor technology; it is the convergence of performance, power and functionality. Some of the benefits of 3D-IC, such as increased complexity, improved performance, reduced power consumption and smaller footprints, are proven and readily understood. Other reported benefits, such as improved time-to-market, lower risk and lower cost, still need to be realized before 3D-ICs become a commercially viable alternative to traditional 2D architectures. The availability of Synopsys’ silicon-proven tools and IP is an important contribution to deploying 3D-IC integration technology in the semiconductor industry.

Web event: Enabling 3D-IC Integration
Date: July 18, 2012
Time: 10:00 AM PDT

Duration: 45 minutes + Q&A

In this webinar, a guest speaker from Xilinx will introduce the challenges of designing for large capacity and high performance, and how Xilinx is innovating with Stacked Silicon Interconnect technology to deliver higher levels of integration and flexibility in its FPGA products.

Speakers:


Steve Smith

Senior Director, 3D-IC Strategy and Marketing, Synopsys

Steve Smith is currently responsible for Synopsys’ 3D-IC strategy and marketing. He has been with Synopsys for 15 years, having served in various functional verification and design implementation marketing roles. He has worked in the EDA and computer industries for more than 30 years in a variety of senior positions including marketing, applications engineering and software development.


Shankar Lakka

Director of IC Design, Full-Chip FPGA Integration Group, Xilinx

Shankar Lakka is the Director of IC Design in the Full-chip FPGA Integration Group at Xilinx. He has been at Xilinx for more than 16 years, and has held various positions in the CPLD and FPGA divisions and led multiple projects across multiple sites. Shankar recently led the design and full-chip integration of Xilinx SSI devices. He holds 14 U.S. patents.


SNUG in Asia, US East Coast
by Paul McLellan on 07-10-2012 at 8:05 pm

If you are in Asia, the Synopsys user group SNUG is coming up soon: this week in Japan and next month in China. If you are in India, I’m afraid you already missed it; it was held last month, just after DAC.

SNUG Japan is on July 12th, just a couple of days from now, running from 10am until 8pm in Tokyo.

In China there are three, between August 14th and 21st:

  • Beijing 北京
  • Shanghai 上海
  • Shenzhen 深圳

Details (in Chinese) here.

Also slipped in there is SNUG Singapore on August 17th. Details (English) here.

SNUG Taiwan is August 28th to 29th in Hsinchu. Details (English) here.

Then in September, the East Coast of North America has its turn:

  • Boston on September 6th. Details here.
  • Ottawa on September 10th. Details here.
  • Austin (OK, not really East Coast) on September 28th. Details here.


Formal Going Mainstream
by Paul McLellan on 07-10-2012 at 7:29 pm

In his DAC keynote, Mike Muller called for making formal approaches an integral part of writing RTL. After all, formal captures design intent and then, at least much of the time, can verify whether the RTL as written actually matches that intent. Today, formal is not used that way; it is typically something served “on the side” by specialist formal verification experts. Used this way it is still very valuable and regularly finds issues, especially corner cases, that simulation has missed. However, by letting design engineers get away without documenting their intent, it risks missing problems: the formal experts inevitably have to deduce some of that intent from the very RTL they are verifying, an obviously circular loop with the potential to let bugs escape into the wild.

The big EDA companies all have a formal solution of some sort, but they are unlikely to be the ones spearheading this. They all have simulators and simulation verification environments too, and will quite happily sell you as many licenses as you want. Indeed, if a few dozen formal licenses turn out to substitute for a server farm with a bazillion simulation licenses, it is not necessarily even good business for them to encourage the transition.

Jasper, however, has only a formal product line; if you use a lot of simulation instead, they don’t participate in that business. Of course, simulation is not going away, and Jasper’s approach is to combine metrics from formal verification with metrics from simulation-based verification to accelerate verification closure. In particular, there is never any point in running simulation to partially verify some aspect of a design that formal has already proven; it’s just a waste of computer time.

The metrics for formal verification come in two flavors: first, how complete the coverage is, based on the properties and the stimuli; second, how successful JasperGold (or whatever formal tool you are using) is at proving those properties, each of which may be fully proven, carry only a bounded proof, or yield a counterexample (which obviously doesn’t contribute to verification closure directly, but exposes an issue that must be fixed). This is all then integrated into a coverage database that records unreachable targets that do not need to be verified, along with targets that formal analysis has already covered and that do not need to be re-verified using simulation.

The first way to use this approach is to establish the completeness of the formal testbench. In particular, analyzing:

  • Dead code
  • Branch coverage
  • Statement coverage
  • Expression coverage

After the formal analysis, properties may be completely proven or have only a bounded proof. A bounded proof means that only part of the reachable state space was analyzed and that no violation was detected there; for example, all states within K cycles of reset were examined and no violation was found. Obviously, this means that problems might still occur beyond K cycles.
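To make the bounded-proof distinction concrete, here is a toy sketch in Python (the state machine and function names are invented for illustration; real formal tools such as JasperGold work on RTL with SAT/BDD engines rather than explicit state enumeration). It checks a property over every state reachable within K cycles of reset, and shows how a shallow bound can pass while a deeper one exposes a violation:

```python
# Toy bounded-proof sketch: explore all states reachable within k cycles
# of reset and check a property in each. Passing at depth k is only a
# *bounded* proof; it says nothing about cycles beyond k.

def bounded_check(initial_states, next_states, prop, k):
    """Return (True, None) if prop holds in every state reachable within
    k cycles, or (False, state) with a counterexample state."""
    frontier = set(initial_states)
    seen = set(frontier)
    for _ in range(k + 1):
        for s in frontier:
            if not prop(s):
                return False, s              # counterexample within bound
        frontier = {t for s in frontier for t in next_states(s)} - seen
        seen |= frontier
        if not frontier:                     # fixed point reached: the
            return True, None                # proof is actually complete
    return True, None                        # bounded proof only

# Example: a mod-16 counter with the property "never reaches 12".
step = lambda s: {(s + 1) % 16}
print(bounded_check({0}, step, lambda s: s != 12, 5))    # (True, None)
print(bounded_check({0}, step, lambda s: s != 12, 15))   # (False, 12)
```

The K=5 result is exactly the situation described above: no violation within the bound, yet the property is false deeper in.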

This approach gives confidence that the verification is not over-constrained (relying on an external property that still has to be proved) and allows proven (and perhaps bounded-proven) aspects of the design to be eliminated from the simulation verification plan.

This information can then be combined with information in a simulation coverage database such as UCDB to merge all the coverage metrics into a single verification coverage metric that combines formal and simulation approaches.
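As a sketch of what that merge amounts to (the target names and status labels below are hypothetical; the source does not describe the JasperGold/UCDB integration at this level of detail), formal results reclassify simulation coverage targets so that simulation effort is only spent where it still adds information:

```python
# Hypothetical coverage-merge sketch: targets fully proven or shown
# unreachable by formal are discharged; bounded or unknown targets stay
# in the simulation verification plan.

FORMAL_STATUS = {
    "fifo.overflow_chk":  "proven",        # full proof: drop from sim plan
    "fifo.underflow_chk": "bounded(20)",   # bounded proof: keep simulating
    "dbg.legacy_branch":  "unreachable",   # dead code: exclude everywhere
}

def merge(sim_targets, formal_status):
    """Split simulation targets into those still needing simulation and
    those already discharged by formal analysis."""
    still_to_simulate, discharged = [], []
    for target in sim_targets:
        status = formal_status.get(target, "unknown")
        if status in ("proven", "unreachable"):
            discharged.append((target, status))
        else:
            still_to_simulate.append(target)
    return still_to_simulate, discharged

todo, done = merge(list(FORMAL_STATUS) + ["alu.cov_grp"], FORMAL_STATUS)
print("simulate:", todo)      # ['fifo.underflow_chk', 'alu.cov_grp']
print("discharged:", done)
```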

These features in JasperGold will be released to beta during the second half of this year, with production either late this year or in the first half of next year.


Intel Opens a New Front with ASML
by Ed McKernan on 07-10-2012 at 4:00 pm

Behind great humor often lies irony. In the midst of a struggle by the European Union to extract $1.3B from Intel in an ages-old antitrust case, the latter makes a strategic move to embolden the Dutch firm ASML to accelerate the development of 450mm and EUV, and thus save a continental jewel. What say you now, EU? When dysfunction and bankruptcy abound, beware the need of sovereigns to extract not pints but gallons of blood. Intel sees an end game at hand, not today but in just a couple of years, and it plays into its plans to win all of mobile, including Apple and Samsung. It parries the EU assault with a massive $4B investment and prepares to watch the poker players ante up or fold.

Intel always fights a multi-front war, knowing that it eventually wears down the enemy. Please, please, we don’t speak of enemies unless we are in the realm of politics! However, one should be aware that without TSMC there is no Qualcomm, nVidia, AMD, Broadcom, Marvell, or the rest of the ARM camp (especially ARM). And what of Apple and Samsung, the two leaders of the mobile tsunami, who will have 80%+ of the smartphone and tablet market by the New Year? They have a choice to make, in which the first one to blink will have the opportunity to be years ahead of the other.

It is simple mathematics. Assume, conservatively, that Intel is two years ahead of TSMC in process technology. Now presume Intel, again conservatively, launches 450mm wafers two years ahead of TSMC; that amounts to something like a four-year lead. Now input your die sizes and run the cost models. It is daunting having to stare up at the Matterhorn before the climb begins.
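For a back-of-envelope version of “run the cost models” (every number below is an illustrative assumption, not an Intel or TSMC figure): a node shrink roughly halves die area, and a 450mm wafer has 2.25x the area of a 300mm wafer, so the two advantages compound.

```python
import math

def dies_per_wafer(wafer_mm, die_area_mm2):
    """Classic dies-per-wafer approximation with an edge-loss term."""
    d = wafer_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

die = 100.0                                # assumed 100 mm^2 die
lag = dies_per_wafer(300, die)             # trailing node on 300mm
lead = dies_per_wafer(450, die * 0.5)      # one node ahead on 450mm
print(lag, lead, round(lead / lag, 2))     # 640 3039 4.75
```

Under these assumptions the leader gets nearly 5x the dies per wafer, and the premise of 450mm is that wafer cost grows far more slowly than wafer area; hence the Matterhorn metaphor.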

We have learned in the past six months that smartphones and tablets are demanding leading-edge process technology (Qualcomm sold out this year of its 28nm 4G LTE chips). This was the one doubt I had: whether Qualcomm, nVidia, and the rest of the ARM camp were safe in the foundries at an n-1 node while Intel played catch-up with a true low-power processor and baseband functionality. Intel can now force the game forward, and even Apple will have to consider how wise it is to hang back on older processes. Some portion of Apple’s processors will need to step up to the leading edge for cost and performance reasons.

Yesterday’s news articles stated that ASML was open to additional investments from other foundries (i.e. TSMC and Samsung). I can see Samsung stepping up. TSMC is an extension of Qualcomm, Broadcom, nVidia, and others; it will likely have to devise new long-term agreements with its partners that require them to pony up dollars for the ASML investment. Or, alternatively, does Qualcomm write a check to ASML? Does Apple?

The maneuvers lately point to every survivor going vertical; however, we are now looking at two separate vertical models. There is the device vertical model, with LCD screens, NAND flash, enclosures, etc., that Apple and Samsung are very adept at. In last week’s blog I mentioned how Intel was funding Taiwanese panel makers to guarantee supply for ultrabook manufacturers (likely at the expense of AMD and nVidia). Now we have Intel letting the world know that being a MAN in the semiconductor industry requires owning more than just fabs. Real Men must now invest in the semiconductor R&D tool chain. The question Wall St. should ask is the following: what is the total value that will derive, 4 to 5 years down the line, from an investment in ASML’s R&D?

FULL DISCLOSURE: I am Long AAPL, INTC, QCOM, ALTR


SPICE Timing Correlation for IC Place and Route
by Daniel Payne on 07-10-2012 at 10:35 am

SPICE circuit simulation is used for transistor-level analysis while Place and Route tools are typically used to connect cells and blocks of an SoC, so why would there be a connection between these two EDA tools?

I read a press release today from ATopTech and Berkeley Design Automation that talked about how SPICE and P&R are connected, so I contacted Eric Thune of ATopTech to learn more. Eric has worked at Apache Design Solutions, I2 Technologies, Synchronicity, Synopsys, and TI. Continue reading “SPICE Timing Correlation for IC Place and Route”


High-Productivity Analog Verification and Debug
by Daniel Nenni on 07-08-2012 at 10:40 pm

See how Synopsys’ advanced analog verification solution can dramatically increase your verification productivity with CustomExplorer Ultra, along with CustomSim and CustomSim-VCS. CustomExplorer Ultra is a comprehensive simulation and debug environment for analog and mixed-signal design verification.

Web event: High-Productivity Analog Verification and Debug with CustomSim and CustomExplorer Ultra
Date: July 11, 2012
Time: 10:00 AM PDT

Duration: 45 minutes + Q&A

REGISTRATION

This webinar demonstrates an advanced verification methodology using CustomExplorer Ultra with CustomSim and CustomSim-VCS that enables highly productive verification and debug of analog and mixed-signal designs. CustomSim and CustomSim-VCS provide fast simulation engines, while CustomExplorer Ultra is a complete verification environment offering simulation corner and Monte Carlo setup management, a flexible simulator interface, multiple-testbench support, and interactive cross-probing with popular design environments, such as Galaxy Custom Designer and Virtuoso ADE, for fast circuit debugging.

Speakers:

Duncan McDonald
Product Marketing Manager, Synopsys

Duncan has more than 20 years of experience in EDA, holding positions in engineering, sales, and marketing, all related to analog and mixed-signal design. Duncan is the author of 3 U.S. patents and holds degrees from UC Berkeley and Santa Clara University.


DAC 2012 Cheerleader Controversy!
by Daniel Nenni on 07-08-2012 at 9:00 pm

First, I must say that I’m biased: I like cheerleaders, they are lots of fun, and I even married one. Second, I’m not a fan of Peggy Aycinena. She has been on her EDA feminist rant for years now, and I have been targeted multiple times. My solution has been to ignore her and any publication that supports her, but this time she has gone too far.

It all started when Paul McLellan posted a blog on SemiWiki about the 49er Cheerleaders appearing at DAC 2012. What a great idea! The blog was deleted shortly thereafter, and Paul told me that, as it turns out, the 49er Cheerleaders would not be attending DAC 2012. Bummer, I thought. Even my wife, who attended DAC 2012, was disappointed.

Then I read an article by Mike Demler:

As the industry continues to shrink, can EDA bring sexy back?

Mike is a great guy, I’m a fan of his site, he is very credible:

DAC organizers made some initial attempts to liven up the proceedings, by signing up a few of the San Francisco 49er cheerleaders to wake up attendees before an 8:30 AM keynote address, on the second day of the conference. The cheerleaders, who regularly appear before crowds (including many families) of 70,000 fans at every 49er home game, also are known for their charitable work, and for their careers and education beyond the football field. Nevertheless, according to sources who would only speak off the record, when a female EDA blogger launched a personal protest of the cheerleaders, contacting EDAC Board members and DAC organizers, they cancelled the appearance. Attempts to get a statement from the DAC Executive Committee have gone without a response. Gold Rush management has also declined to comment.

After reading this, I felt sure Peggy was behind it but could not confirm it, and Paul McLellan was not talking. Paul is the official DAC webmaster, so I understand his tight lips. I also understand the DAC people’s decision to cancel in order to avoid controversy.

Next comes John Cooley’s article:

Peggy bans 49er cheerleaders, Gabe wants Denali party cancelled

I do read John and don’t always agree with him, but I certainly respect the work he has done on DeepChip:

I can’t believe this. The DAC Executive Committee caved in to the angry feminazi rants of Granny Peggy Aycinena????? WTF? Just because Peggy wouldn’t have appreciated these cheerleaders, a good 90% of the heterosexual male population at this DAC would have! WTF???

Okay, John is being crude here, but I agree with his point. I don’t like a moral majority of one person making decisions on what is and is not appropriate for an entire crowd. I also don’t appreciate the negative label Peggy attaches to the 49er Cheerleaders. They are athletes, goodwill ambassadors, and they deserve better (I bold this because it is the main point of this blog).

Gabe Moretti also did an article on this (according to John Cooley), but I don’t read his site, Gabe on EDA, and it did not come up on Google. Maybe he thought better of getting on Peggy’s bad boy list and deleted it. Or maybe Gabe’s site is not search-engine friendly. Probably both.

Peggy’s response to all this did come up on Google, and I’m reading it for the first time:

Cooley: Ignore the men behind the curtain by Peggy Aycinena

This rant is so fractured I don’t even know what to cut and paste, so you will just have to read it yourself. She goes “eye for an eye” with John, attacking him personally and with increased venom. Included is a list of people she has pissed off; I’m on it, and this is why.

Last year Paul McLellan did an article on SemiWiki, Semiconductor Virtual Model Platforms, which included a picture of a female model (nothing racy). Peggy posted a rant against Paul, me, and SemiWiki, so we changed the pic to what you see today. That rant was also removed after Paul bought her lunch to smooth things over. Even better, Peggy once called some of the DAC hostesses (booth babes) prostitutes. That article was removed, as most of her other rants have been. This latest one will probably be removed too, so I saved a copy just in case, because it really is quite funny in a disturbing sort of way.

This was my 29th DAC, so I have seen the evolution first hand. In fact, I was pleasantly surprised when DAC allowed alcohol on the show floor, which apparently Peggy is okay with, for now anyway. My opinion: we are adults and can make personal choices as we see fit. It would have been nice to have been given the choice of attending the 49er Cheerleader DAC session or not. Next year, hopefully, an actual majority will prevail and we will see cheerleaders serving beer!


Testing ARM Cores – Mentor and ARM Lunch Seminar
by Beth Martin on 07-08-2012 at 8:29 pm

If you are involved in testing the memory or logic of ARM-based designs, you’ll want to attend this free seminar on July 17, 2012 in Santa Clara. Mentor Graphics and ARM have a long-standing partnership and have optimized the Mentor test products (a.k.a. Tessent) for ARM processors and memory IP.

The lunch seminar runs from 10:30 AM to 1:00 PM at the Santa Clara Marriott. The presenters are Richard Slobodnik of ARM and Stephen Pateras of Mentor Graphics. They will describe the specific test solutions developed to cover memory and logic test for ARM-based designs. A newer feature is the shared bus interface, where MemoryBIST controllers reside outside the ARM core and use the shared bus to test the memories inside the core. Blocks with a shared bus and with memories on the bus (memory clusters) have a functional interface to the bus (see the figure).
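For readers who have not seen what a MemoryBIST controller actually executes, here is a generic March C- sketch in Python (March C- is a textbook memory-test algorithm; this is an illustration with invented function names, not Mentor’s Tessent implementation):

```python
# Generic March C- memory test: six ascending/descending passes of
# read-expected/write operations that expose stuck-at and coupling faults.

def march_c_minus(mem_read, mem_write, size):
    """Run March C- over addresses 0..size-1; return (addr, expected, got)
    tuples for every miscompare."""
    fails = []
    def pass_(order, expect, write_val):
        for a in order:
            if expect is not None and (got := mem_read(a)) != expect:
                fails.append((a, expect, got))
            if write_val is not None:
                mem_write(a, write_val)
    up, down = range(size), range(size - 1, -1, -1)
    pass_(up,   None, 0)     # M0: ascending,  write 0
    pass_(up,   0,    1)     # M1: ascending,  read 0, write 1
    pass_(up,   1,    0)     # M2: ascending,  read 1, write 0
    pass_(down, 0,    1)     # M3: descending, read 0, write 1
    pass_(down, 1,    0)     # M4: descending, read 1, write 0
    pass_(up,   0,    None)  # M5: ascending,  read 0
    return fails

# Usage against a simple model of a healthy 16-word memory:
mem = [None] * 16
print(march_c_minus(lambda a: mem[a], lambda a, v: mem.__setitem__(a, v), 16))
# [] -- no failures; a faulty cell would appear as (addr, expected, got)
```

In the shared-bus scheme described above, the controller would issue these reads and writes over the block’s functional bus interface rather than through dedicated test collars.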

Sign up for this free ARM / Mentor Graphics Lunch Seminar now.

If you want to study up before, here are two relevant whitepapers from Mentor:
Memory Test and Repair Solution for ARM Processor Cores
High Quality Test of ARM® Cortex™-A15 Processor Using Tessent® TestKompress®


NVM IP: Novocell Semiconductor has announced an expansion of their product line
by Eric Esteve on 07-08-2012 at 3:52 am

“Novocell Semiconductor’s core antifuse-based OTP Smartbit™ technology was first patented in 2001 and 2002, and created a solid foundation for the first ten years,” stated Walt Novosel, President and CTO. “Since then, our customer-driven focus has led to numerous innovations in our original high-reliability Smartbit-based NVM IP to best serve specific system-on-chip (SoC) market segments. Our announcement today unveils our full line of NVM products to fully serve our customers’ needs, from 8-bit register OTP, to specialty trimming and calibration OTP, to 4Mbit ultra-high-density code storage and configuration OTP, to 1000x multi-time-write hybrid OTP/MTP.”
Continue reading “NVM IP: Novocell Semiconductor has announced an expansion of their product line”


Intel Goes Vertical to Guarantee PC Growth
by Ed McKernan on 07-07-2012 at 8:30 pm

A Bloomberg article from early July caught my eye, as it portends further changes in the competitive mobile market landscape. Intel is now in the business of paying Taiwanese panel suppliers to ensure the supply of touch-screen panels for PC ultrabooks. In essence, to win in the PC market Intel has to mimic Apple and go more and more vertical in the supply chain. Apple’s stellar growth makes it difficult for PC manufacturers to forecast true demand out 3 to 6 months, and given their minuscule profit margins they have to veer toward the conservative or face the risk of going out of business on excess inventory. Intel, like Microsoft, is faced with having to control its destiny versus the laissez-faire Wintel model that has existed for 30 years.

In a previous blog, I mentioned how Microsoft may have started a thermonuclear war with its customers (e.g. HP and Dell) when it introduced its Windows 8 tablet, or should we say pre-announced Win 8 tablets. Microsoft and Intel are showing signs that the combined profits they derive from the PC market are too high for their customers to price at a suitable discount against the growing Apple Empire. Apple buries its O/S cost and, in the case of iPads and iPhones, its CPU cost. These costs are lower than what Microsoft and Intel charge their PC OEMs, and neither one wants to give in as tablets and ultrabooks roll out this fall. Given the strength of iPad growth and the now almost assured rollout of a smaller iPad in September at $299, OEMs are concerned about what the true demand for PCs is, especially in the US and Europe.

Furthermore, Intel is in a short-term mode of keeping Ivy Bridge ULV prices high in order to force OEMs to abandon the idea of including an nVidia or AMD graphics chip in ultrabooks, because the additional cost pushes system prices out of the range of what the market will pay. I expect Intel, however, to drive prices lower to capture the market in Q4 before AMD responds with a competitive solution. Currently the lowest-cost Intel ULV part is over $100, which is way too high if ultrabooks are to reach the $499 price point for high-volume consumers. Over the long term, though, Microsoft and Intel face unique challenges due to Apple’s growth. Both rely heavily on corporate and government purchases of PCs. Microsoft faces the immediate prospect that Apple will make inroads with MacBook notebooks and iPads, a direct hit on Microsoft’s O/S and Office revenue stream. Microsoft has to have an immediate answer this fall with a Windows 8 tablet, but it appears that HP and Dell cannot deliver at a price below Apple’s iPad.

Microsoft needs to step in and plug the hole with what will effectively be a discount on its software stack: essentially, give away the hardware to sell the software (the razor and razor-blade model we are all accustomed to). Intel has a different scenario playing out and appears to be in a stronger position. In the short run it is executing a plan that calls for cannibalizing AMD and nVidia ($10B+ in revenue) with the ultrabook platform, even while PC growth slows at the expense of iPads. The investment in Taiwanese panel manufacturers will likely come with an exclusivity that bars AMD and nVidia silicon from showing up in the end product. From mid-2013 onward, Intel has to win Apple’s business as it attempts to force the whole mobile market to the leading-edge process node.

Qualcomm’s misread of demand for its 28nm 4G solutions is a significant sign that the industry based its smartphone and tablet business models on an (n-1) process technology instead of getting out over the ski tips. By (n-1) process, I mean that many semiconductor suppliers were counting on 40nm being the volume process for mobile this summer and fall, with 28nm the 2013 volume driver. Longer term, when Intel gets its baseband capabilities closer to Qualcomm’s, the leading edge will be determined by Intel’s latest process. Intel’s PC business model from the 1990s through today has been all about delivering processors on the leading edge. The trek it is taking to 14nm with mobile processors and Atoms with a robust communications platform speaks to the opportunity to cannibalize Qualcomm and Broadcom. However, en route to this scenario, it looks like Intel will need to take a greater role in propping up the PC system supply chain.

FULL DISCLOSURE: I am Long AAPL, INTC, QCOM, ALTR