
Blogging for Consultants

by Daniel Payne on 03-05-2015 at 9:00 pm

Paul McLellan wrote about how he stumbled into blogging and it inspired me to share my story as well. I grew up in Minnesota and attended the U of Minnesota earning a bachelor’s degree in Electrical Engineering so that I could design computer chips. After interviewing in 1978 with HP, IBM, Intel and Motorola I decided to join Intel in Oregon and design DRAM chips. It was amazing how little design automation there was, and correspondingly how much grunt work and manual design was required to design NMOS circuits at the transistor level. I kept asking management, “Where is the software to automate DRC, LVS and other tasks?”

Their reply was, “We hired you to manually do that, so get back to work.”

Clearly, short-term thinking was in operation here, and after 8 years of full-custom, transistor-level IC design work, I joined my first EDA company, Silicon Compilers, in 1986. The president was Phil Kaufman, another ex-Intel guy who knew how to run a company. At SCI I learned all about being an Applications Engineer, AE Manager, Technical Marketing Engineer and Product Marketing Manager. Along the way SCI acquired many companies, and then got acquired by Mentor Graphics.

I stayed at EDA companies until 2004 helping teams with a wide range of EDA tools (SPICE to HLS), and then became a freelance EDA consultant, offering technical and product marketing services. My networking on LinkedIn and keeping in touch with former co-workers continued to grow my consulting business. I discovered a monthly networking group that had lunch in the Portland, Oregon area and there met John Blyler.

In 2008 John started talking about his blogging at Chip Design Magazine, and asked if I would consider blogging. It sounded fun, and I knew that it would raise my consultant profile to prospective EDA companies, so I blogged for the next three years at Chip Design Magazine. I always wondered if anyone was reading my blogs and then one year I attended a DAC conference and was on the escalator when two engineers from downstairs pointed up at me and shouted, “Hey, aren’t you Daniel Payne, the blogger?”

Wow, who knew that bloggers could be popular?

At the 2011 DAC I met Daniel Nenni and he soon asked me to blog for his start-up site, SemiWiki. I had been reading Dan’s blogs and heard his vision for crowd-sourcing, and being an open platform for discussion on all things semiconductor, IP and EDA. It sounded like a novel concept, and there was a revenue model, so I accepted and began blogging at SemiWiki.

Every morning I read each new blog article to continue my education about our industry and keep up to date. Recently we’ve also begun focusing on the Forums; I’m mostly over in the EDA Software forum. I also view what’s happening on LinkedIn, review my RSS feed of interesting content, and search for tweets of interest by using a filter:
#SemiEDA OR #SemiIP OR #52DAC OR FinFET


At first on Twitter we were using #EDA but quickly found out that EDA also stands for Economic Development Agency and Electronic Directory Assistance, and is a popular search phrase in Japan and Brazil, none of which relate to Electronic Design Automation. Now we’ve been encouraging Twitter users to use #SemiEDA, along with #SemiIP, to be more focused on our industry. So I’m spending about an hour per day reading and learning about the ever-changing news. Follow me @Daniel_J_Payne.

One of the most visited Wiki pages is the one showing every EDA merger and acquisition since our industry began, thanks mostly to the efforts of Ian Getreu, another consultant in the Portland area.

Blogging at SemiWiki is a wonderful way for consultants to contribute to our industry, make new contacts and get new clients. I’ve enjoyed blogging about over a dozen different companies in the EDA and IP space, and with SemiWiki I learn something new every day.

Also Read: CDN is Live in Silicon Valley!

When I’m not blogging or consulting in EDA, you can find me out riding a road bike for fitness and a bit of competition by posting on Strava.com. If you ever visit the Portland, Oregon area, then look me up and I’ll meet you for a cup of coffee.



Intel NOT Inside Sematech?

by Robert Maire on 03-05-2015 at 1:00 pm

Rumor that Intel has quit Sematech confirmed by website?
Potential industry impact
Is this a consolidation by-product?
Do we need a trade group?
Does it benefit Intel?

We have heard from several sources over the last few days that Intel has quit Sematech, the semiconductor industry trade group focused on technology development and advancement. This does not appear to be confirmed by any press announcement from either party and could easily be wrong, but Intel is noticeably absent from the membership roster on the Sematech website, where it had previously been listed. We are dubious that such a glaring mistake could have been made by accident.

Sematech history…
Sematech was formed as a trade group focused on furthering technology and promoting and protecting the US semiconductor industry. There is a similar group in Europe, IMEC, performing the same function for European companies. If the measure of success were protecting the US semiconductor industry, we would probably give it a failing grade based on the current state of the US semiconductor industry compared to when Sematech was founded. It’s obvious that most of the industry has moved overseas.

Limited technology successes…
There has probably been more success on the technology front; however, the vast majority of innovation is done by semiconductor and semiconductor equipment companies on their own. Sematech does fund and oversee a number of technology initiatives that have no singular home or main sponsor in the industry, or that don’t have the ROI that participants require. It could also be viewed as a central R&D function, or a replacement for the early version of Bell Labs that did basic pioneering research.

Has consolidation eliminated the need?
When the semiconductor industry was fragmented, with 50-100 chip companies, pooling R&D resources made a lot of sense: no single company could support many of the long-range projects, and there was communal benefit. Now, with 3 or 4 behemoths in the industry, each of which can drop hundreds of millions on a project, there may be less motivation to pool resources.

With the consolidation of the equipment industry, each of the surviving players is so big that they too have the resources to do R&D on their own, and they clearly don’t want to share any breakthroughs. The only project that has seen significant pooled resources was ASML’s passing the hat for EUV development, which was done outside the auspices of Sematech anyway.

Intel passing the baton to Asia?
Intel has been the standard bearer and flag-carrying advocate of Sematech, as well as its leading financial supporter, since dues are based on a financial model. Given that Intel has now fallen to number three in capital spending behind TSMC and Samsung, that role seems to make less sense. It was unclear to us whether Intel got an appropriate return on its Sematech investment, and as Intel is watching its expenditures ever more closely, it may have done that calculus and decided the resources could be better invested elsewhere (like buying into the Chinese market).

Can Sematech survive?
In our view it’s unclear whether Sematech can survive the loss of Intel, or whether this will start a rush for the exit doors. IMEC could see a similar fate. Maybe Sematech could move to Asia?

What does it mean to equipment companies?

This is clearly a positive for the big four equipment companies, ASML, Eteris, LRCX and KLAC, and a negative for smaller equipment companies. It is likely a negative for EUV, as Sematech has been a supporter; probably neutral to most semiconductor companies; and likely a positive for Intel.

Robert Maire
Semiconductor Advisors LLC

Also Read: CDN is Live in Silicon Valley!


Blogging for Dummies

by Paul McLellan on 03-05-2015 at 7:00 am

I am often asked how I became a blogger (or a journalist if you want to make it sound more professional). I think people assume that I planned it in some way but I never did. Life is what happens while you are making other plans. To see how unlikely it is, you need to know a bit of my background.

I have a PhD in computer science so I’m actually a total geek, not at all the obvious qualification for being a writer (although all PhDs have at least managed to write one extensive document, their thesis). I started my career as a programmer and then moved into management.

In a roundabout way I ended up as CEO of Compass for just under the last year of its existence. My big claim to fame was that having had 5 quarters of sequentially declining revenue I managed to produce 3 quarters of sequentially increasing revenue, which was enough to put together a roadshow and we ended up selling Compass to Avant!

I then went and ran engineering at Ambit and, when it was acquired by Cadence, I moved into marketing. I discovered that I was an unusual mixture, very technical but good at writing, and creating and giving presentations.

After some stints in system companies, I was a marketing consultant. One of my gigs was working a couple of days a week for a power-reduction EDA company called Envis. One day the board fired the CEO and they asked me to run the company, so I got my second CEO gig. The technology turned out not to be much good and I told the investors they should wind it up, but that didn’t fit their plans so I helped them bring in a new CEO (and he flew it into the ground).

But it was the start of 2008, and the downturn was in full swing. There was no consulting business to be had. Big companies terminate all their consultants before they start layoffs. Startups terminate all their consultants since they realize they will not get any more cash for a long time. For over 6 months I was literally on unemployment, collecting my $1800/month from the state.

In the meantime I talked to Ron Wilson (then at EDN) and agreed to blog unpaid for them. I started the EDAgraffiti blog, and for the best part of a year I produced a blog every weekday on EDA or something related. It turned out that being able to write reasonably well and having a strong technical background is a good combination. Also a fairly rare one: there are many good writers and many good technologists, but not many who are competent at both. The best of those blogs got put together in the EDAgraffiti book, which Wally Rhines told me was “the best book on EDA,” although I pointed out that since it was the only book on EDA that was a pretty low bar. It is still available.


Dan Nenni was starting SemiWiki at that point. He asked me if I wanted to join and I agreed. One of the perks turned out to be that I am truly “press” and so get free passes to pretty much any conference or symposium I want to attend, provided I write about it, which makes it fairly easy to keep up with the industry. I still put out a blog pretty much every day, just like I did in EDAgraffiti days (actually more, since I usually do one on Sunday for the weekend newsletter too).

Also Read: CDN is Live in Silicon Valley!

So that is how I became a blogger/journalist. I cover everything from embedded software at the high level all the way down to lithography and process, along with everything in between: FinFETs, SADP, TSV, EUV, DSA, SMP, FPGA, mobile, µP, 10nm, HLS, UVM, RET, SP&R, DRC, iOS, FD-SOI. Find an acronym and I’ve probably covered it!

Buy the books: EDAgraffiti or Fabless. Or read the blog…wait, you already are!


Three Colorful Bytes from the NXP History

by Majeed Ahmad on 03-04-2015 at 7:00 pm

The proposed merger of NXP and Freescale, which creates a bigger semiconductor outfit, also brings forth some fascinating history bytes from the technology heritage that these two spin-offs carry from their respective corporate parents. In 2006, Philips Electronics sold its chip business division Philips Semiconductors to a consortium of private equity investors. The name NXP stood for the consumer’s “next experience.”

Likewise, Motorola Inc. made the Motorola Semiconductor Products Sector autonomous in 2004 and renamed the new silicon-focused outfit as Freescale Semiconductor. This blog traces some parts of the NXP heritage that spans over the past four decades.

NXP: A Fabless Model Pioneer

The SemiWiki Forum user hist78 has chronicled how TSMC’s Morris Chang found a small audience among semiconductor companies for his revolutionary idea of a pure-play fab back in the mid-1980s. Intel, TI, and Philips gave him a chance to make a presentation, but eventually both Intel and TI said no.


Philips’ early investment and technology transfer were vital to TSMC’s success

It was Philips Electronics that agreed to invest and do the technology transfer to help jumpstart TSMC, owning a 28 percent stake in TSMC during its formative years. Later on, Philips gradually sold all of its shares in TSMC at a huge profit, but that’s another story. In retrospect, it was Philips’ decision to invest in TSMC during the mid-1980s that kick-started the fabless revolution, which in turn changed the semiconductor landscape forever.

Buy VLSI, Buy SoC

San Jose, California–based VLSI Technology was a pioneer in ASIC, SoC and semiconductor process technologies. It became an early vendor of standard cell ASICs during the early 1980s and dominated the PC chipsets business in the next decade. In 1999, Philips Semiconductors—the precursor of NXP Semiconductors—made a hostile bid for VLSI Technology and eventually acquired the ASIC pioneer for around a billion dollars.


VLSI was an ASIC and SoC pioneer

Apparently, Philips faced difficulties in custom designs quickly moving to new process generations, an area where VLSI excelled with its broad array of chip design libraries and tools. Moreover, the purchase seemed to be stimulated by Philips’ growth and success in the mobile handset chips business. The VLSI buy went a long way for the Dutch company in the unfolding SoC era that followed in the years after this acquisition.

NFC: First Invent, Then Rescue

NXP is a pioneer in near-field communication (NFC) technology; its parent company Philips Electronics developed and launched the contactless access technology in collaboration with Sony back in the early 2000s. The NFC technology was originally developed for the transport and convenience store segments in large Asian cities like Hong Kong and Tokyo. The NFC-based Octopus Card for Hong Kong’s subway service has been a smashing success.


Hong Kong’s Octopus Card: An Early NFC Success Story
(Image: MLP Forums)

But it was the tap-to-pay mobile service where NFC was going to make it big. Mobile commerce advocates said that cash would become a thing of the past and that the future of digital money was inside the NFC chip residing in smartphones. However, the promise of mobile payments remained in the doldrums until 2014, when Apple and NXP joined hands to develop the first viable tap-to-pay service on the iPhone 6.

Also Read: CDN is Live in Silicon Valley!

Apple Pay—a hugely successful mobile payment service that provided a seal of approval to the NFC technology—used an NXP SoC device that combined the Secure Element (SE) microcontroller with an NFC radio. The SE-centric hardware in the iPhone 6 allowed over-the-air provisioning by the banks and credit card companies and kept mobile operators out of the payment ecosystem.

Majeed Ahmad is the author of the books Smartphone: Mobile Revolution at the Crossroads of Communications, Computing and Consumer Electronics and Mobile Commerce 2.0: Where Payments, Location and Advertising Converge.


Synflow and Cx

by Paul McLellan on 03-04-2015 at 9:00 am

When hardware designers hear about a new language, their hearts sink. We already have Verilog, SystemVerilog and VHDL. And if you go up a level, we have C, C++ and SystemC. Isn’t that enough? Tell a software engineer about a new language, however, and they are interested: there are hundreds of programming languages and hundreds of thousands of “little languages” that a particular piece of software reads and executes. In SoC design, it turns out that moving up a level from RTL generally does not mean going to one of the C-like languages but doing IP assembly instead.

But there is a paradigm shift going on in design, especially for FPGA but in ASIC too. When the era of schematic-based design came to an end, to be replaced with RTL synthesis, by and large it was not the people who had used schematics all their lives who suddenly learned RTL; they had enough of a problem with their schematics being on a screen rather than paper. Instead, new young Turks who had learned Verilog in school drove the transition. In the same way, software engineers want to be able to design chips with all the hardware details taken care of by the tools. In fact this is a trend: whereas in the past hardware engineers didn’t write embedded software and embedded software engineers didn’t do hardware, increasingly there are engineers who do both hardware and software design. A survey by EE Times a couple of years ago showed the figures clearly.


But there is a big problem: the languages the hardware designers use are no good for software engineers. The RTL-level languages are too close to the hardware. The C-level languages either are not inherently concurrent or get concurrency through unwieldy libraries; they are just unnatural to anyone outside the EDA microcosm. Since neither RTL nor C/C++ is really acceptable to a software engineer, that leaves creating a new language.

Synflow is a company based in France that has done just that. They have created a language Cx targeted at bringing hardware design within reach of any software developer.

The goals of the language are:

  • C-like syntax and structures
  • Cycle-accurate behavior
  • Strong bit-accurate typing
  • Fast learning curve
  • Design and verification

Of course a language on its own is not much use. It needs a development environment and, for code that is ultimately going to be implemented in hardware, the code generation (RTL generation) needs to be really first rate. Ease of use is never a substitute for optimal design. The development environment, ngDesign, contains a lot of technology:

  • Cx editor
  • FSM view and design view
  • Git version control system
  • Project manager
  • Sanity checkers
  • Clock domain crossing (CDC) checkers
  • HDL code-generators
  • Simulator

The various sanity checks, such as CDC checking or unconnected-port checking, are done on the fly as the code is created. The output HDL is truly portable and can be used with any tool flow (EDA or FPGA). It is vendor-neutral, readable VHDL and Verilog that honors the usual coding-style rules. What does the language look like? Well, you can’t tell much about a language by just looking at it rather than using it, but here is a sample:


Recently Synflow put some information about Cx up on Hacker News (if you are not a software engineer you have probably never heard of it, but it is run by Y Combinator, which you probably have heard of). The hits on their website went up by a couple of orders of magnitude, from a handful a day to hundreds. As I said earlier, software engineers are open to a powerful new language in a way that hardware design engineers are not. They know that learning a new language is trivial, and if it helps them get a job done better than another language then that is the way to go.

So is anybody using it? They have users in all parts of the world: USA, Europe, Canada, Japan, China. They are doing all sorts of designs, both control- and datapath-oriented, and getting results comparable to hand-crafted RTL.

Here is a comment from an engineer at one of the biggest semiconductor companies who has been evaluating Cx: “I am a strong believer in constrained language definition over the use of general purpose languages with target libraries.”

The tools are available for evaluation from the company’s website and are now shipping to paying customers.

Synflow’s website is here.

Also Read: CDN is Live in Silicon Valley!


CDN is Live in Silicon Valley!

by Daniel Nenni on 03-03-2015 at 10:00 pm

As big of a fan as I am of Social Media there is still nothing like getting up close and personal when collaborating with the fabless semiconductor ecosystem. After 30+ years in Silicon Valley if there is one thing I have learned it’s that “showing up” is the #1 key to success, absolutely.

Speaking of showing up, each year there are three big user group meetings here in Silicon Valley. The first one is CDNLive (Cadence, March 10-11), then SNUG (Synopsys, March 23-24), and User2User (Mentor, April 21st). Thousands of people will attend, executives will keynote, fellow users will present, marketeers will dance, and of course there will be a free lunch and parting gifts. Not a bad way to spend a day. The best part of course is meeting fellow semiconductor professionals and networking. Remember, it’s not so much what you know but who you know in this ultra-fast-paced, world-changing industry. Where would we be without this level of collaboration? Still using flip phones…

The best parts for me are the customer presentations, which are the meat of any conference. For CDNLive that means Cisco, TI, Freescale, Broadcom, AMD, PMC-Sierra, Oracle, ARM, Xilinx, Samsung, SanDisk, Altera, and a whole host of others. Several of the SemiWiki bloggers will be there blogging live, and you can even use the promo code “SemiWiki” for $50 off, such a deal.

What’s Happening at CDNLive Silicon Valley 2015

Each year, CDNLive Silicon Valley brings together a record number of Cadence® technology users, developers, and industry experts to network, share best practices on critical design and verification issues, and discover new techniques for designing advanced silicon, SoCs, and systems.

Papers: Choose from a wide variety of user-authored papers addressing all aspects of design and IP creation, integration, and verification. Discover how others are using Cadence technologies and techniques to realize silicon, SoCs, and systems—efficiently and profitably.

Techtorials: Participate in a variety of interactive techtorials to get a more in-depth look at specific Cadence products, new solutions, and feature enhancements.

Keynote speakers: Hear from industry leaders who influence the global electronics marketplace. They will discuss industry trends in silicon, SoC, and system realization and share their thoughts on the most pressing design challenges.

Designer Expo:
Learn more about the collaborative ecosystem available to support you. Cadence and our partners will showcase the latest results of our joint efforts. Explore new products and services from our many exhibitors.

Networking opportunities:
Engage in stimulating technology discussions with your peers and stay connected after the conference.

Join us at CDNLive Silicon Valley 2015!
March 10-11, 2015
Santa Clara Convention Center


Where a New IP Company Could Invest

by barun on 03-03-2015 at 7:00 pm

Which IP will give a substantial return in the next 3-5 years, where a company should invest, and how it will differentiate itself from others, particularly the big ones, are key discussion topics for any start-up entering the semiconductor IP business.

In the last few decades we have seen tremendous growth in interface IPs due to the standardization of chip-to-chip and system-to-system communication. A lot of IP companies have invested heavily in developing interface IPs and have benefited from those investments by licensing those IPs to many SoC companies.

But the market is slowly getting saturated, and the entry barrier for new start-ups is rising. One reason is that few new protocols are emerging. In most cases it is either an upgrade of an existing protocol (like PCIe or USB) or a fusion of two existing protocols (like M-PCIe). An existing IP vendor will always incur a much smaller investment to upgrade its IP than a company that is completely new to that domain. Also, customers who have licensed an older version of an IP will most likely go with the existing vendor for IP based on newer versions. Moreover, these IPs are highly standardized, need little customization, and hence a big company with a much larger sales network benefits more by investing in them. That is why the two big IP vendors, Synopsys and Cadence, have focused on the interface IP segment and have even acquired small IP companies in those domains.

The question, then, is where a new IP company can invest. The traditional IPs, like data converters, power management and clocking circuit modules, with a focus on application areas like automotive, medical and industrial, can be potential areas for new IP companies. Let us look at why:

  • Automotive, medical and industrial SoCs have seen 7% to 9% growth, well above the semiconductor industry average (some reports have even predicted 10%+ growth in automotive ICs), and these segments command more than 40% market share of the analog IC market
  • All of these sectors make significant use of analog IPs; per several reports, analog ICs constitute more than 40% of total semiconductor devices sold in those segments
  • IPs in areas like data converters and power management need significant customization in every SoC depending on the application environment: power, environmental noise, operating voltage, and so on. A start-up can take on those customization requirements more effectively than the big guys; indeed, none of the big IP companies has significant offerings in these areas
  • In many cases customers request a complete SoC design, with some embedded software development, around these IPs; often the IP plus some peripheral logic forms an SoC by itself. As a result a more adaptable start-up can provide a solution comprising both IP (product) and services
  • Also, OEMs sometimes become direct customers for these kinds of projects, and in those situations even post-GDSII manufacturing is involved

But there are also significant challenges in developing and marketing IPs targeted at specific application segments:

  • The most important is that it needs a significant amount of domain-specific knowledge: application environment, certifications/compliance, etc.
  • Areas like automotive, medical and industrial typically have more stringent certification and qualification requirements, so the chance of the IP succeeding on the first tape-out is lower and it takes longer to see a return on the investment
  • The processes used in these areas are not always standard CMOS; bipolar, BiCMOS, HV-CMOS and other processes are used significantly. Hence, besides design knowledge, process knowledge also becomes critical to developing the right IP in these areas
  • SoCs used in automotive, medical and industrial applications have lifetimes of 10+ years, so customers expect long-term support from the IP vendor, and they may be reluctant to rely on a start-up for this

Regards,
Barun Kumar De


Exensio: Big Data in the Fab

by Paul McLellan on 03-03-2015 at 7:00 am

For 20 years PDF Solutions have been working with fabs on yield enhancement. Today, they announced their Exensio Platform for big data manufacturing environments. They haven’t really been keeping it a secret and have been talking about it at events since late last year, but it has basically been in stealth mode for the last 3 years. The primary focus of Exensio is to help fabs ramp to production volumes and then keep yield up once they are there.

There are 700-800 steps in a typical production process today, but inspection is only done every 40 or 50 processing steps. This makes it hard to identify exactly what caused a problem. What is really wanted is to catch the problem when it happens and identify the tool (or material) causing the issue: a huge fault detection and classification (FDC) system. But there are tens of millions of datapoints per second, so doing this requires hooking up every tool in the fab to a huge central database, capturing data within milliseconds so as to catch even small excursions, uploading terabytes of data every day, and analyzing it all in real time. Ideally the system would be able to react before the wafer even finished processing.

And that is just what Exensio does. For example, every 300mm tool at TSMC is hooked up to Exensio. They started with beta customers such as Sony in 2011.
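The excursion-catching idea described above (derive control limits from baseline data, then flag any new reading that falls outside them) can be sketched in a few lines. This is a hypothetical illustration of the FDC concept, not Exensio's actual API; the function name, window size and threshold are invented for the example:

```python
# Minimal fault-detection sketch: flag sensor readings that fall outside
# mean +/- k*sigma control limits computed from an initial baseline window.
from statistics import mean, stdev

def detect_excursions(readings, baseline_n=20, k=3.0):
    """Return indices of readings outside the baseline control limits."""
    base = readings[:baseline_n]
    mu, sigma = mean(base), stdev(base)
    return [i for i, x in enumerate(readings[baseline_n:], start=baseline_n)
            if abs(x - mu) > k * sigma]

# Example: a stable tool signal with one excursion at index 25
signal = [10.0 + 0.1 * (i % 3) for i in range(30)]
signal[25] = 14.0
print(detect_excursions(signal))  # -> [25]
```

A production FDC system does this per tool, per sensor, at millisecond rates against a central database, but the underlying statistical test is the same.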


I talked last week to John Kibarian, CEO of PDF, about the Exensio announcement. He said that a lot of this is being driven by the explosive growth in data availability across the manufacturing flow (not just in the fab but in packaging, assembly and test too). There are more types of data collected, including data that has not really been used before. Larger factories and larger wafers generate more data. Plus there is a need for fast decisions so that corrective action can be taken immediately, as well as the slower analysis of longer term trends such as the day of the week, the specific operator, the specific tool and so on.

Most customers can and do conduct some of this analysis themselves. But typically this takes days or weeks to complete as opposed to minutes or hours. At some level, it is all about driving variability reduction which is the key to ramping a process to high volume manufacturing (HVM) and gradually driving yields even higher once that is achieved. With yield now sometimes driven by single layer atomic variation, the analysis can be very complex.


Exensio is built on top of the Cassandra database for scalability (as, by the way, is Facebook, which has perhaps the biggest scalability problem of anybody). On top of this are:

  • Exensio-yield (dataPOWER)
  • Exensio-control (maestra)
  • Exensio Yield Analysis Services


PDF Solutions has ramped more processes to HVM than any other company, with over 60 below 90nm. They are the established leader, with connections to all the major foundries such as TSMC, GF, ST, ON, TowerJazz and more, along with relationships with major fabless companies such as Qualcomm, NXP, Silicon Image, CSR and many others who work closely with their foundries as “virtual IDMs”. Exensio is the next step in the evolution of PDF, successfully leveraging big-data architecture and technology. 80% of their business is leading edge, with 28/22/20nm now just a niche and most work going on at 16/14/10nm.

PDF Solutions website is here.


All things being unequal for NXP and Freescale

by Don Dingee on 03-02-2015 at 4:00 pm

When I read the news that NXP was buying Freescale, it felt like a part of me – and a big part of the history of high tech industry in Arizona – died. There was a time not that long ago where Motorola was the biggest employer in this state, way before Freescale and ON Semi separated from the mothership. Somehow, even with moving headquarters to Austin and downsizing and changes in leadership, Freescale still felt like an old friend who lived just down the street.


Simple Analog ASIC Solves Thermal Analysis Problems

by admin on 03-02-2015 at 1:00 pm

In a world where Application Specific Integrated Circuits (ASICs) and Application Specific Standard Products (ASSPs) are dominating every conceivable application, greater attention is being paid to their long-term reliability. These chips are being built on smaller lithographies, running at higher speeds, dissipating more power and, to make things worse, they are being encapsulated in ever-decreasing package sizes.

Higher device performance comes at a price: higher temperatures. And with higher temperatures comes lower reliability if thermal considerations aren’t carefully controlled. Semiconductor manufacturers have long been aware of the problems associated with heat. Most have application notes and white papers plastered across their web sites espousing the benefits of careful power management calculations using their values of θJA and θJC (junction-to-ambient and junction-to-case thermal resistance, respectively), often with sidebars suggesting various heat sinks to use in marginal situations. This puts the burden of solving temperature-related problems on the backs of users.
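The calculation those application notes walk through is simple: junction temperature is ambient temperature plus power dissipation times θJA. A minimal sketch, with illustrative numbers rather than values from any particular datasheet:

```python
def junction_temp(t_ambient_c, power_w, theta_ja_c_per_w):
    """Tj = Ta + P * theta_JA: junction temperature (C) from ambient
    temperature (C), power dissipation (W) and junction-to-ambient
    thermal resistance (C/W)."""
    return t_ambient_c + power_w * theta_ja_c_per_w

# A 2 W device with theta_JA of 45 C/W in a 25 C ambient:
tj = junction_temp(25.0, 2.0, 45.0)
print(tj)  # -> 115.0
```

If the result exceeds the datasheet's maximum junction temperature, the designer must lower θJA with a heat sink, airflow or a larger package, which is exactly the burden-shifting the paragraph above describes.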

Recent technology advances and the proliferation of Thermal Test Chips (TTCs), like those developed by JVD, Inc. for Thermal Engineering Associates of Santa Clara, CA, are allowing semiconductor manufacturers and companies designing their own ASIC/ASSP devices to get ahead of the curve by thermally engineering their silicon before going to production. These TTCs allow system designers to fully model, measure and modify their designs before committing to costly silicon. They are special analog ASICs used to model and measure the thermal performance of your chip design in situ, before you commit those tooling dollars for masks and wafers.

Modeling allows you to create multiple individual heat sources on the TTC die, identical to the heat sources that will occur on your final IC. Temperature sensors strategically located throughout the TTC give you precise measurements of the die temperature at multiple locations simultaneously. The heat sources can be modulated to replicate various portions of your IC being powered on, off or in an intermediate mode. By tracking the absolute temperature, or changes in temperature, at any point on the TTC, you can determine whether one or more heat sources combine to exceed the safe operating temperatures of the intended IC design. If temperatures are problematic, you can go back to your IC design and modify the chip’s layout to isolate the heat sources and alleviate the potential problem.

FULL WHITE PAPER HERE

Bob Frostholm – JVD, Inc.