
My Experience with the Ultra Thin 2015 MacBook

by Tom Simon on 08-23-2015 at 4:00 pm

Before we left for our 5-week trip to Europe I decided that I would need a real laptop computer on the road. I knew it would be a near necessity for booking hotels and making train reservations. Also, I would need to write emails and maybe even pay some online bills. I already have an iPad but really wanted to be able to run all my applications – Excel, Word, Picasa, etc. In looking for the lightest machine that would do this I came across the new MacBook and bought one.

When people see the new MacBook, they think it is a MacBook Air. Apple surprisingly built the new MacBook in a smaller and lighter configuration than the Air. When I first read about it I had to do a double take. The MacBook has a 12-inch screen, in between the 11 and 13-inch Air, but at 2.03 lbs it weighs less and is thinner than either Air. It uses the new Intel Core M dual-core processor. The processor is made on Intel’s 14nm FinFET process and runs at under 5W. For comparison, the i7 and i5 typically run at around 15W. The base clock rate is only 1.1 GHz, but it can burst up to 2.4 GHz, putting it in a respectable performance range when needed. It also supports Intel’s Hyper-Threading, which further improves processor utilization.

Unlike the MacBook Air, it comes with a Retina display. The big advantage for me is that I almost always require a second display connected to my laptop to be productive. But with the Retina I can size two ‘full width’ windows side by side and work more efficiently. This means I can really be mobile and productive. Of course this is not a full on production machine like the MacBook Pro, but I really mostly use it for when I am on the road, or maybe when I want to do my writing or check email on my front porch.

The big departure for the MacBook is its choice of a single USB Type-C connector. There is no separate power socket or network port – just the one USB Type-C. (There is a headphone jack too.) The USB Type-C port is used for charging, driving external displays and/or connecting to external USB devices. It supports USB 3.1 at 5 Gb/s. To access the video output via HDMI or to connect traditional USB devices using the more common Type-A connector Apple sells a dongle, but it costs almost $80. Even this does not get you an Ethernet port.

USB Type-C is becoming more common these days. A quick scan on Amazon shows a multitude of devices for it. Surprisingly Apple has left off Thunderbolt. One report attributed this to the power overhead of Thunderbolt. It would grow the main board size and power requirements significantly. Traveling with the MacBook I learned that you can actually charge its battery using a USB battery pack or car USB adapter if needed – a blessing for mobile use. For charging it comes with a 29W wall adapter and a separate standard reversible USB Type-C cable. Those of you who have had to throw away a perfectly good adapter because of a frayed cable will appreciate this.
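
The charging math is easy to sanity-check. Here is a minimal sketch; the battery capacity and charging efficiency are assumptions for illustration (Apple's published capacity for this model is around 39.7 Wh), not figures from this article:

```python
# Rough charge-time estimate for the MacBook from its 29 W adapter.
# Assumed values (not from this article): ~39.7 Wh battery capacity
# and ~85% charging efficiency.
def charge_hours(capacity_wh, adapter_w, efficiency=0.85):
    """Hours to fill an empty battery at the adapter's full rating."""
    return capacity_wh / (adapter_w * efficiency)

print(f"{charge_hours(39.7, 29):.1f} h")  # wall adapter -> 1.6 h
print(f"{charge_hours(39.7, 10):.1f} h")  # a 10 W USB battery pack -> 4.7 h
```

The second line is why charging from a car adapter or battery pack works in a pinch but takes noticeably longer than the wall adapter.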

Apple is betting on 802.11ac for data transfers in and out. Several months ago I bought the Airport Extreme base station with 802.11ac, and it is fast and reliable. So far I have not felt the need to plug a network cable into the MacBook. I have even done a backup of the MacBook to a drive on a networked Mac Mini using the Time Machine app and MacOS Server software.

One caveat is that the MacBook is not upgradable. The SSD and the RAM are soldered in. I have mixed feelings about this. It would be nice to be able to upgrade later. Apple always charges a premium for pre-configured disk and memory, and this leaves buyers with no options. But it comes with 8 GB of RAM, so the main choice you need to make besides color is the storage size. I opted for the less expensive 256GB SSD. This means my large library of pictures stays home, but I have space for lots of PowerPoint presentations, PDFs and Word docs. I use Dropbox to sync the folders I need so I do not need to think about moving files when I go out.

So what is the verdict? I’m pretty happy with the MacBook. The SSD means it boots amazingly fast and the overall system performance is excellent. My previous HP laptop has an i7 quad core with a traditional hard drive, circa 2011, and for the things I use the MacBook for I have not noticed a performance hit. The illuminated keyboard that Mac users love means I can use it in many situations that would not have worked for my old laptop.

Even the track pad works well for me, and I am a die-hard mouse user – or at least was. The track pad is pressure-sensitive glass with haptic feedback for the click action. Pressing harder on the track pad gives a ‘second’ click, enabling more functions in applications and the Finder. In some apps the track pad even allows pressure-sensitive drawing.

One of the reasons they made the system non-upgradable was to minimize the main circuit board size. The Core M processor, with its integrated Intel HD Graphics 5300, has a low profile and small footprint. The bulk of the chassis is filled with batteries. However, this does not buy the user more battery life than, say, the MacBook Air, because of the power-hungry Retina display. Without the Core M this machine would have a shorter battery life than the Air.

Older Macs are infamous for getting hot, really hot. The MacBook has a fan-less design, helped by its solid aluminum chassis acting as a heat sink. It can get warm, but heat has never been a problem for me.

For occasional photo editing, writing, posting blogs, web research and spreadsheets the MacBook works excellently. Mind you my home machine has a lot more juice, but I do not really miss it when I’m on the road. The MacBook slides into my backpack pretty easily and it was not hard to schlep across Europe. Overall I am quite happy with the MacBook. I am writing this now on it in a car on I-5 heading up to Mount Shasta, so you can see it is getting a lot of use.


A Complete Simulation Platform for Mobile Systems

by Pawan Fangaria on 08-23-2015 at 7:00 am

If we look into the semiconductor industry, we can easily see that mobile systems are its main drivers. The Smartphone business has remained at the top for a good number of years. Although Smartphone sales growth has started to show signs of stagnation, it still contributes a solid base to overall semiconductor revenue. What is the other emerging area? I do not need to state the obvious: the IoT. Now everything needs connectivity, and an IoT device is half baked if it cannot connect to a Smartphone. No wonder smart mobile devices seem to be a business everyone wants to get into. And there are enabling companies providing tools to design the best, most robust mobile devices.

What are the key components that go into a modern mobile device? There are many – antenna, battery, different types of sensors, modems, MEMS, wireless charging circuitry, processors, memories, camera, imaging, display, wi-fi, GPS, and so on. The device is complex and has stringent requirements of high performance, low power, low TDP (Thermal Design Power), high reliability under different environments, and so on. Designing such devices or components requires various kinds of simulations to estimate and verify various parameters of a design at different levels including chip, package, electronic board, and even mechanical chassis. It feels great when a single company provides a suite of highly capable simulation tools which can provide a complete integrated solution for simulation of designs for mobile devices.

During the 52nd DAC, ANSYS, along with their customers and partners, presented solutions for several design requirements that demonstrated their simulation tools’ calibre in providing the right solution for designing high performance, high reliability mobile devices.

It was great to hear from SMDH, a company in Brazil, that by using an RTL-driven power integrity flow with PowerArtist for their UHF RFID digital baseband design they were able to reduce power consumption by 82%.

Similarly ANSYS has state-of-the-art tools for power delivery optimizations from PCB to package and chip level. They provide an integrated solution for the complete system; system-aware chip as well as chip-aware system solutions. The power noise can be accurately predicted at the PCB level as well as on-chip. A guided methodology can be used to reduce the noise as much as possible.
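
The core quantity in power-delivery analysis is Ohm's-law IR drop; signoff tools like RedHawk solve it over a full-chip grid with millions of nodes. A toy single-rail sketch, with made-up numbers, illustrates what is being predicted:

```python
# Toy IR-drop estimate along one power-rail path. Real signoff tools
# solve this over a full-chip resistive grid; all numbers here are
# illustrative only.
def ir_drop(current_a, segment_resistance_ohm, n_segments):
    """Worst-case drop if the full load current crosses every segment."""
    return current_a * segment_resistance_ohm * n_segments

vdd = 1.0                      # nominal supply (V)
drop = ir_drop(0.5, 0.01, 10)  # 0.5 A through ten 10-milliohm segments
print(f"drop = {drop * 1000:.0f} mV, rail sags to {vdd - drop:.2f} V")
```

A 50 mV sag on a 1 V rail is a 5% loss of noise margin, which is why the noise has to be predicted at both the PCB and chip level and guided down before signoff.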

For mobile devices, battery modeling is an interesting proposition. What could be better than modeling the battery together with the device in which it will be used?


ANSYS Simplorer is a multi-domain, multi-technology simulation system that enables simulation of complex power electronic and electrically controlled systems. ANSYS HFSS has a gold-standard full wave EM field solver. It can automatically generate an efficient and accurate mesh. ANSYS CFD provides fluidic flow analysis capability for designing appropriate structure of a device. To extend the battery life and reliability, the power and thermal budgeting can be done through a set of tools. Power regression can be obtained through the use of PowerArtist and power, noise and reliability signoff can be done by the industry standard RedHawk. There is Sentinel-TI for thermal simulation of 3DIC stacked die.


Power and signal integrity are key requirements for mobile devices. Noise margins need to be evaluated accurately. ANSYS Q3D provides 3D and 2D EM field simulations for electronic structures. Q3D and HFSS can be used to evaluate the flow of signals. Sentinel-SSO is used for I/O DDR power, noise and timing analysis. RedHawk can be used to provide accurate IR drop analysis.

A variety of simulation tools on the ANSYS simulation platform provide a complete solution for simulation driven design of mobile systems. Different aspects take prime roles at different stages of the system design. For example, at the initial micro-architecture stage, power budgeting and reduction are the prime considerations; at the IP validation stage, power delivery and model generation come first; at the SoC integration stage, IR drop, reliability and ESD are critical. System integration deals with power and signal integrity and thermal analysis at the system level. Then the system design needs antenna and wireless power transfer circuits integrated.

ANSYS’ multi-physics simulation platform is quite powerful and caters to a wide range of industry segments. It includes a number of tools for electronic, structural and fluidic simulations involving electrical as well as mechanical aspects of a variety of products for various applications including electrical, electronics, automotive, aerospace, consumer, and more.

Pawan Kumar Fangaria
Founder & President at www.fangarias.com


A Paradigm Shift in Microelectronic System Design

by limingxiu on 08-23-2015 at 7:00 am

A Paradigm Shift
The word “paradigm” is defined in the dictionary as “a framework containing the basic assumptions, ways of thinking, and methodology that are commonly accepted by members of a scientific community”. In his influential book “The Structure of Scientific Revolutions”, published in 1962, Thomas Kuhn used the term “paradigm shift” to indicate a change in the basic assumptions (the paradigms) within the ruling theory of science. Today, the term “paradigm shift” is used widely, in both scientific and non-scientific communities, to describe a profound change in a fundamental model or perception of events.

Ever since the clock concept was introduced into microelectronic system design, it was assumed that all the cycles in a clock pulse train have to be equal in length (a rigorous clock signal). One reason that this form of clock signal has dominated microelectronic system design for a long time is that, in the past, the requirement for IC clocking was mostly straightforward. A clock signal with a fixed rate was sufficient for most systems. However, the complexity of future systems changes the game. Low power operation, low electromagnetic radiation, synchronization among networked devices (e.g. Internet of Things), complex data communication schemes, etc., all require a clock signal that is flexible.

Another reason behind the dominance of this style of rigorous clock is that time, which shows its existence and its flow indirectly through the use of a clock pulse train, is not a physical entity that can be controlled and observed directly. Thus, creating a flexible clock is an inherently difficult task. It demands effort beyond simply playing with various techniques at the circuit level. Philosophically, it requires an adjustment, at a fundamental level, in our thinking about the way of clocking microelectronic systems. The “anomaly” in this case is a new perspective on the concept of clock frequency. In this line of argument, the materials presented in this book induce a paradigm shift in the field of microelectronic system design.

Clock is an enabler for system level innovation
Viewed from a high level, there are four fundamental technologies supporting the entire IC design business: processor technology, memory technology, analog/RF technology and clock technology. In the past several decades, a tremendous amount of effort has been spent on the development of the first three. Clock technology falls behind in this race. One of the key reasons is that clock technology deals with a special entity: time. It is neither directly observable nor directly controllable. The circuit designer can only play with it indirectly, through voltage and/or current. This lag, however, provides us an opportunity to make significant progress. It is a battleground for new ideas. It is a potential birthplace for great inventions. It is one of the enablers for system level innovation.

What is new on clock? Flexibility versus spectrum purity

When the term “flexible clock” is used, it refers to a clock signal: 1) whose frequency can be arbitrarily set; and 2) whose frequency can be changed quickly. Preferably, these two features shall be achieved simultaneously and be available to the clock user at a reasonable cost. A rigorous clock has the characteristic of high spectrum purity, which is beneficial to certain applications. There are, however, many more applications where spectrum purity is not of high concern. Instead, a clock signal offering small frequency granularity and fast frequency switching is more useful. Therefore, there is a crucial trade-off to be made when an IC design problem is investigated. In the past, a clock of high spectrum purity was the undeniable winner. However, for future microelectronic system design, this is not necessarily always the case.

“Jittery” clock is not necessarily a bad thing
The essence of a clock pulse train is to create a series of “moments in flow-of-time” by utilizing the mechanism of “voltage-level-crossing-a-threshold”. The resulting moments are used as the reference points for other events happening inside the microelectronic system. Therefore, the requirement on those moments is that their location-in-time must be predictable and precise. Jitter is a parameter measuring this quality. Thus, a jittery clock is undesirable since it reduces the effectiveness of the clock in coordinating other events. However, jitter is not without any use. An obvious example of its applicability is that jitter in a clock signal can help reduce its electromagnetic radiation since it spreads the clock energy. Another not-so-obvious, and more valuable, use of a “jittery” clock is to trade the irregularity-in-moment for flexibility. The flexibility associated with a clock signal refers to its capability of fine frequency resolution and fast frequency switching. When this irregularity-in-moment is used with care, a clock signal can be made flexible by intentionally introducing “controlled jitter” into it. This capability is important for certain applications. Indeed, it outweighs the requirement on the clock’s spectrum purity in such applications. Hence, a jittery clock is not necessarily a bad thing.
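
This trade can be written down in a few lines. The sketch below is a software illustration (not a circuit) in the spirit of the time-average-frequency idea: interleaving just two cycle lengths gives fine average-frequency resolution, at the price of deterministic, controlled jitter.

```python
# Sketch of a "flexible" clock built from two period lengths.
# Mixing a short period ta and a long period tb, with fraction r of
# the cycles using tb, yields an average period ta + r * (tb - ta):
# fine frequency granularity from only two building blocks, at the
# price of a deterministic "jitter" in individual edge positions.
def taf_edges(ta, tb, r, n):
    """Return n clock-edge times mixing periods ta and tb with ratio r."""
    edges, t, acc = [], 0.0, 0.0
    for _ in range(n):
        acc += r              # fraction of cycles that should use tb
        if acc >= 1.0:
            acc -= 1.0
            t += tb           # long cycle
        else:
            t += ta           # short cycle
        edges.append(t)
    return edges

edges = taf_edges(ta=1.0, tb=1.5, r=0.25, n=1000)
print(edges[-1] / len(edges))  # average period: 1.125 = 1.0 + 0.25 * 0.5
```

Only two cycle lengths exist here, yet the average period can be set anywhere between them simply by changing r, and r can change on the very next cycle — exactly the fine granularity and fast switching described above.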

The power of idea
Many times in human history, the power of an idea has changed the landscape of our civilization. Such ideas include liberty, romanticism, Marxism, Zionism, among others. Each of these great ideas led to a profound movement that changed the way we live. In science and technology, the latest example of such an idea would be Einstein’s theory of relativity. It links space and time together, resulting in what we call space-time. This breakthrough idea, which was regarded as ridiculous by most people when it made its debut, has proven to be one of the greatest in human history. This idea is an “anomaly” that later led to a great paradigm shift in science.

In the book “Nanometer Frequency Synthesis Beyond the Phase-Locked Loop”, a new perspective on clock frequency was introduced. While the materials presented in that book focus on building the circuit at the component level, this book answers the question of how to use it at the upper levels to create better systems. This book is the continuation of this route toward a new microelectronic system design methodology. To quote Steve Jobs: think different.

From Frequency to Time-Average-Frequency: A Paradigm Shift in the Design of Electronic System (IEEE Press Series…


Four Things a New Semiconductor Technology Must Have to be Disruptive

by Alex Lidow on 08-21-2015 at 12:00 pm

This post discusses attributes of gallium nitride (#GaN) that make it a disruptive technology and identifies the four factors required for GaN technology to displace silicon as the technology of choice.

Displacing Silicon with GaN

38 years ago, when I first entered the semiconductor business as a freshly minted Stanford Ph.D., my first project was to develop a transistor that would be better than the aging silicon-based bipolar transistor invented in 1947 at Bell Labs by Brattain, Bardeen, and Shockley (they won the 1956 Nobel Prize for this development). My colleague, Tom Herman, and I set out to disrupt this 30-year-old technology by using the latest techniques developed for integrated circuits. From this effort, and an incredible team of contributors, came the power MOSFET (we branded ours the HEXFET). It was a disruptive technology, and it did largely displace the bipolar transistor over a period of about 15 years. The dynamics of this transition taught us that there were four key factors controlling the adoption rate of a new semiconductor technology:


  • Does it enable significant new applications?
  • Is it easy to use?
  • Is it VERY cost effective to the user?
  • Is it reliable?

    Let’s now address each of these questions individually for the next generation of technology – GaN compared with silicon in the field of power conversion.

    Does it enable significant new applications?
    Some examples of large new applications that are made possible primarily because of the higher switching speed of GaN transistors include:

    • Envelope Tracking: This is a power supply technique that can double the energy efficiency of RF power amplifiers used to transmit all of our voice and data through satellites, base stations, and cell phones. Envelope tracking is accomplished by tracking the power demand precisely and providing the power to exactly fit the amplifier’s signal modulation needs. Today, RF power amplifiers operate at a fixed power level, delivering maximum power whether or not the transmitter needs it. Excitingly enough, GaN transistors are the first transistors capable of tracking power demands at the high data transmission rates used in 4G LTE network base stations!
    • Wireless Power: Cut the cord! Wireless power transfer enables cell phone, game controllers, laptop computers, tablets, and even electric vehicles to re-charge without being plugged in. A high frequency standard (6.78 MHz) for power transmission is currently being adopted by an industry consortium(A4WP). Silicon power devices (power MOSFETs) do not perform well at speeds this fast, whereas GaN transistors and integrated circuits offer an alternative that switches fast enough to be ideal.
    • LiDAR (Light Detection and Ranging): LiDAR uses pulsed lasers to rapidly create a three-dimensional image of a surrounding area. This technique is widely used for geographic mapping functions and is the technology driving (so to speak) “driverless” cars. The higher switching speed of GaN transistors drives the superior resolution and response time that enable LiDAR applications beyond mapping, such as augmented reality and fully autonomous vehicles.
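
The envelope-tracking idea in the first bullet can be sketched numerically. This is a toy model: the signal shape, headroom, and the use of the envelope as a proxy for load current are all assumptions for illustration, not measurements from any real power amplifier.

```python
import math

# Toy comparison of amplifier supply losses with a fixed rail versus an
# envelope-tracking rail that follows the signal with a small headroom.
def supply_energy(envelope, fixed_v=None, headroom=0.2):
    """Sum of instantaneous supply voltage * (envelope as load current)."""
    total = 0.0
    for e in envelope:
        v = fixed_v if fixed_v is not None else e + headroom
        total += v * e            # crude proxy for power drawn
    return total

# Amplitude-modulated envelope, normalized to a 1.0 peak
env = [0.5 + 0.5 * abs(math.sin(2 * math.pi * k / 100)) for k in range(1000)]

fixed = supply_energy(env, fixed_v=1.2)   # rail parked at peak + headroom
tracked = supply_energy(env)              # rail follows the envelope
print(f"tracked/fixed energy ratio: {tracked / fixed:.2f}")  # -> 0.87
```

Even this crude model shows the supply drawing measurably less energy when it tracks the envelope instead of sitting at the peak; a real envelope tracker, switching at LTE modulation rates, is where the claimed doubling of efficiency comes from.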

    Is it easy to use?
    At EPC we designed our GaN transistors (eGaN FETs) to be very similar in behavior to the aging power MOSFETs, so power systems engineers can use their design experience with minimal additional training. To assist design engineers up the learning curve, EPC has established itself as the leader in educating the industry about gallium nitride devices and their applications. In addition to publishing over 100 articles and presentations, in 2011 EPC published the industry’s first GaN transistor textbook (in English and Chinese) – GaN Transistors for Efficient Power Conversion. The second edition was published in 2015 by J. Wiley and is available through Amazon as well as textbook retailers. EPC is working with more than 60 universities around the world to lay the groundwork for the next generation of highly skilled power system designers trained in getting the most out of GaN technology.

    Is it VERY cost effective?

    GaN transistors and integrated circuits from EPC are produced using processes similar to silicon power MOSFETs, and actually have many fewer processing steps than MOSFETs. In addition, GaN transistors do not require the costly packaging needed to protect their silicon predecessors. This packaging advantage alone can cut the cost of manufacture in half and, combined with high manufacturing yields, has resulted in a GaN transistor from EPC being lower in cost than a comparable (but lower performance) silicon power MOSFET. Today the designer does not even need to take advantage of the higher performance of GaN to realize cost savings in the system!

    Is it reliable?

    To date, tens of millions of hours of stress testing from several manufacturers, and tens of billions of device hours in demanding applications such as truck headlamps, drones, and base stations suggest this technology is capable of performing at acceptable levels of reliability in commercial applications today.

    Summary

    Thus, fast switching speed, small size, competitive cost, and high reliability give the GaN transistor the attributes needed to displace the silicon MOSFET in power conversion applications. Similar analyses show that soon the same will be true for analog integrated circuits. Perhaps in 3-5 years the same will be true for digital integrated circuits. GaN is a relatively new technology and has just begun its journey up the learning curve!

    Also read: GaN Technology for the Connected Car


Mentor 2Q Results

    by Paul McLellan on 08-21-2015 at 7:00 am

    So it was Mentor’s turn yesterday, after Synopsys on Wednesday. And yes, it really is the end of their second quarter. They produced some very good results. As Wally opened: The second quarter of fiscal 2016 was a record for a Q2. We substantially exceeded our own expectations with revenue of $281.1 million and non-GAAP earnings per share of $0.36. This is largely the result of booking fiscal year 2016 expiring contracts earlier than planned. The majority of this upside was the result of users requiring more software than anticipated in their prior contracts.

    Design-to-Silicon, which includes Calibre, grew 105%. One thing I hadn’t thought of: in the questions Wally was asked whether they were worried about piracy in China for software and hardware. Firstly, he pointed out that it would be pretty difficult to knock off their hardware in any reasonable time (and by then there would be a new generation). But he also pointed out that Calibre is used so late in the design process that people are just not going to use an unlicensed copy and risk their chip failing.

    During the SEMI Gartner seminar I wrote about China. It seems that the big investment program there is having some effect. Wally, again: In addition, my meetings with executives at established China-based manufacturing companies as well as principals at China IC investment funds make clear that the $20 billion China IC development program is stimulating increased R&D and other investment that will benefit the EDA industry.

    Another area of growth has been emulation. This is a relatively new area in which all of Mentor, Cadence and Synopsys (and, as far as I know, no startups any more) have the technology, and it has doubled over the last 5 years. It is increasingly seen as essential to getting a chip design done. In the questions Wally said he thought it was on track to being a $1B business on its own (not for Mentor alone, but for all 3).

    Both Lip-Bu and Aart have commented on consolidation in the customer base as a possible headwind. Wally is more sanguine (and he has some data…Wally always does): One other topic that comes up in many of my meetings is concern about the increased amount of M&A activity in the semiconductor industry. While the actual number of mergers and acquisitions is up only modestly in 2015 compared to prior history, the magnitude of announced deals is up dramatically. Historically, these changes simply add to the strength of the standard leader in EDA in each tool category, so there are some wins and some losses for every supplier. But the other concern is that the total R&D spending of the semiconductor industry could be reduced. That is of course possible, but for more than 30 years semiconductor R&D has averaged a constant 14% of revenue despite lots of structural changes.

    In the questions he gave some more color, pointing out that semiconductor R&D is much less volatile than revenue, which can be affected by all sorts of things. But semiconductor companies don’t lightly lay off their design engineers; they will need them when the next upturn comes, and they will need product already in the pipeline.

    There were some surprises, at least to me, in the geographies. Greg, during his part of the call, said: By geography, bookings were up in two of our four reporting regions. Europe was up 40%, driven largely by semiconductor customers and products. Pac Rim was up 15% on the strength of foundry and emulation business, the Americas were flat, and Japan was down 50% as its consumer and IC electronics business continues to shrink.

    Say what? Europe up 40%, Japan down 50%? OK, Japan has been weak forever, but Mentor has been reducing headcount in European sales due to weakness. They have closed some major European auto deals, so I’m guessing the upside might be there. After all, automotive is 15% to 20% of Mentor’s business these days. But those percentages mean Mentor’s business is now 40% North America, 25% Europe, 30% Asia-Pac and just 5% Japan (at least this quarter). Mentor has been reducing its salesforce especially in Japan, and you can see why.

    I didn’t manage to find as good a picture as the one of Aart playing guitar at the San Jose Jazz festival a couple of weeks ago. Oh wait, what is the second image that Google Image Search finds for “Wally Rhines.” OK, that’ll do.

    SeekingAlpha transcript of the call is here. There is a huge error in transcription. Mentor announced a dividend of $0.055 per share, but SA has $5.05. That would be nice on a $25 share.
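
A one-liner shows just how implausible the transcribed figure would be as an annualized yield on a $25 share:

```python
# Sanity check on the transcription error noted above: annualized
# dividend yield at the correct $0.055 quarterly dividend versus the
# mis-transcribed $5.05, both on a $25 share.
def annual_yield(quarterly_dividend, share_price):
    """Annualized dividend yield from a quarterly dividend."""
    return 4 * quarterly_dividend / share_price

print(f"{annual_yield(0.055, 25.0):.2%}")  # correct figure -> 0.88%
print(f"{annual_yield(5.05, 25.0):.1%}")   # SeekingAlpha's typo -> 80.8%
```

An 80% yield would indeed be nice; 0.88% is rather more typical of an EDA stock.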


    Older Nodes Get New Life With Ultra Low Power Variants for IoT

    by Tom Simon on 08-20-2015 at 12:00 pm

    Ever since I can remember, and I’ve been in EDA since the early 80’s, new process development has largely focused on the latest nodes. Trailing nodes were quickly put into support mode. New nodes benefited the most from static and dynamic voltage reduction efforts, as well as improvements in flows and performance. Only a small number of niche processes, usually produced by smaller captive fabs, were tuned over time for improvements. But the IoT has changed this.

    With projected volumes for IoT chips in the billions, foundries, EDA and IP vendors have put a new emphasis on revisiting their offerings for larger nodes. The biggest motivation for this is the need for lower power and the proliferation of wireless. When we say lower power, it’s not about needing fewer cooling fans, it’s about running for months on solar power, or making a wearable last for weeks before it needs to be recharged.

    For wearables, using a larger battery is not an option. A typical wearable LiPo battery might have less than 20 milliamp-hours of capacity at 3.7V. Sleep modes need to be in the microamp range, not the milliamp range. Every trick in the book is needed: voltage islands, power islands, low-leakage libraries, sub-threshold operating voltages.
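
The arithmetic behind that requirement is stark. A back-of-the-envelope sketch using the ~20 mAh figure above (the current draws are illustrative):

```python
# Back-of-the-envelope battery life for a wearable on a ~20 mAh cell.
# The average current figures are illustrative assumptions.
def battery_life_days(capacity_mah, avg_current_ua):
    """Days of operation at a constant average current draw (in uA)."""
    hours = (capacity_mah * 1000.0) / avg_current_ua
    return hours / 24.0

print(f"{battery_life_days(20, 5):.0f} days at 5 uA sleep current")     # 167
print(f"{battery_life_days(20, 1000):.1f} days at 1 mA average draw")   # 0.8
```

An average draw in the milliamp range drains the cell in under a day; get the sleep current into single-digit microamps and the same cell lasts for months.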

    Apparently TSMC has been thinking about these issues for a while and concluded that updating processes alone will not solve the power problems faced by new products. So TSMC has announced the development of IoT platforms with several of their OIP partners. For its part TSMC is rolling out ultra low power (ULP) versions of its 0.18u, 90nm, 55nm, 40nm and 28nm processes. Several of them will come with embedded flash and the ability to support radio design.

    TSMC expects the ULP processes to reduce operating voltages by 20% to 30%. That combined with standby power reductions promises to offer 2X to 10X increases in battery life.
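
That 20% to 30% figure goes a long way because dynamic power scales roughly with the square of supply voltage (P ≈ C·V²·f). A quick illustration, ignoring any accompanying frequency or leakage changes:

```python
# Dynamic switching power scales roughly as Vdd squared (P ~ C * V^2 * f),
# so a modest supply-voltage reduction yields an outsized power saving.
# Illustrative first-order calculation only.
def dynamic_power_ratio(voltage_scale):
    """Relative dynamic power after scaling Vdd by voltage_scale."""
    return voltage_scale ** 2

for reduction in (0.20, 0.30):
    ratio = dynamic_power_ratio(1.0 - reduction)
    print(f"{reduction:.0%} lower Vdd -> {1 - ratio:.0%} less dynamic power")
```

So a 30% lower supply alone roughly halves dynamic power; combined with standby-power reductions, the claimed 2X to 10X battery-life improvements become plausible.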

    TSMC has announced that the following partners are participating:

    ARM – IoT subsystems for the Cortex-M and Cordio radio IP. Running on the 55nm ULP process, they can run below one volt, saving significant power.

    Cadence – Also targeting 55nm ULP, they offer Tensilica Fusion DSP’s for sensor and peripheral interfaces operating at optimal power levels. Tensilica cores are available for WiFi/IoT connectivity for wearables and other IoT applications. 40ULP and 28ULP are also available.

    Dolphin Integration – Bringing ultra low power methodologies and flows for designs that include voltage and power islands. They are providing tools to effectively reduce dynamic and static power for designs targeted by the TSMC partnership.

    Imagination – IP for ultra low power designs. They are providing processor cores, wireless and other ancillary functions implemented as reference IoT subsystems. Imagination offers comprehensive IP for building a large number of IoT applications.

    Synopsys – Working on an integrated IoT platform on TSMC’s 40nm ULP process. This will include a broad range of DesignWare IP. The highlights are the ultra low power ARC EM5D processor core, power and area optimized libraries, memory compilers, NVM as well as a number of IO and sensing blocks.

    All of this represents a large commitment on the part of TSMC and their partners to create the processes and flow enablers necessary to fulfill the projected design and volume demands from the explosive growth of ultra low power connected designs for the IoT.

    For more information on applications for different process nodes look on their site.


    Synopsys Q3 Results

    by Paul McLellan on 08-20-2015 at 7:00 am

Synopsys announced their quarterly results this afternoon. It is the end of their Q3 (yes, they are not on the regular calendar year; neither, for that matter, is Mentor, who announce tomorrow). On the earnings call Aart started off:

Good afternoon. I’m happy to report that our third quarter results were very strong, as we achieved revenue of $556 million, non-GAAP earnings per share of $0.63 and $275 million in cash flow from operations. In addition, we closed several key acquisitions, as we continue to strengthen and evolve the company for long-term growth. As a result, we are again raising our annual revenue guidance.

That is up 6.5% year on year, and in their guidance they said they expect 10% growth for the whole year. Some of that is Atrenta; in the questions someone asked whether that was in this quarter and not last year’s compare. But Aart sees at least signs of clouds on the horizon, which isn’t surprising given what looks like increasing weakness in China. He continued:

Characterizing the customer environment around us, the semiconductor and systems industry results and outlook remain mixed. Customer growth rates appear more challenged than three months ago as some customers thrive while others struggle.

    Like Lip-Bu on the Cadence call, he also pointed out that consolidations are continuing in the semiconductor (aka customer) space, which tends to be a headwind for EDA. Somehow an acquisition like Avago and Broadcom never seems to spend as much as the two companies did separately before. But at least they continue to invest in advanced designs.

    Aart claimed that the number of active FinFET designs and tapeouts to-date is now nearly 240 and that Synopsys is “relied on” for 95% of these. That makes it sound like everyone is using DC and ICC2. I’m sure that is true in that some Synopsys tools are used but I don’t think Cadence would be close to making its numbers if it was only in 12 of those 240 designs. For ICC2 in particular Aart said they have 38 customer logos with over 100 production designs and tapeouts, way up from last quarter.

    See also Antun Domic, on Synopsys’ Secret Sauce in Design

As I talked about when I interviewed John Koeter, Synopsys have been making a big push into automotive electronics. Aart talked a bit about that:

In June, we rolled-out a broad set of IP optimized for automotive chip development. The portfolio now meets key safety, reliability and quality requirements while continually being enhanced to address new emerging standards. We have also worked with industry leaders such as Freescale, Infineon and Renesas to create Automotive Centers of Excellence with our virtual prototyping products, enabling our mutual customers to accelerate software development.

    See also John Koeter: How To Be #1 in Interface IP

    I spent a lot of time especially at VaST (which Synopsys acquired) trying to get virtual prototyping products into automotive companies and their suppliers, with some success. But automotive moves so slowly, working on model years that are so far out that as a startup you run out of money before the products proliferate (which, to be fair, is not only a problem with automotive. More startups fail by being too early than by being too late, it seems to me).

    See also DAC: Self-driving Cars

Aart is proud of Synopsys’s timing in entering the security space, first with the acquisition of Coverity and subsequently several smaller security companies and technologies. I think he is right and it will turn out to be significant. Apparently at the Black Hat conference a couple of weeks ago there were speakers from Underwriters Laboratories and the Department of Homeland Security. UL spoke about its developing Cybersecurity Assurance Program and the collaboration with Synopsys to drive it forward. Security makes for strange bedfellows. Gartner picked them out too, putting Synopsys in the magic top-right quadrant for application security testing. There are hundreds of companies on the matrix but fewer than 20 in the top-right box.

    See also Synopsys’ Andreas Kuehlmann on Software Development

    Transcript of the call at SeekingAlpha is here.


My Tryst with Semiconductors and EDA
by Pawan Fangaria on 08-19-2015 at 4:00 pm

Yes, today it really does feel like a tryst with semiconductors. In the literal sense it wasn’t a love affair with semiconductors, but I must say the greatest thing the field taught me is how it approaches perfection. And that became the guiding principle in my life: how can I do something better? Of course nothing is perfect in life or in science, as far as I know, but things can always move closer to perfection. Semiconductor manufacturing professionals can tell you how they strive for perfection in moving to ever newer technology processes.


When I was in primary school, I was told that light moves in a straight line; in secondary school it still moved in a straight line, but it could refract and deflect. In college I learnt that light travels as waves and its path is not really a straight line, though of course we cannot see those waves. Now, when I consider the real implications of those waves in semiconductor manufacturing at the nanometer scale, it is inspiring; that is closer to perfection. And the pursuit of perfection is not over yet: now we are talking of EUV and the 7nm and 5nm processes.

Let me reflect a little on my encounter with semiconductors and EDA. During my initial schooling I was fascinated by aeronautical engineering; computers were unknown to me at that time. At the undergraduate level I chose physics as my major and was attracted to solid state physics. Computers (PCs) arrived during my engineering studies at the Indian Institute of Science, and with them chips for CPUs, memories, and more. Seeing VLSI as a new, emerging field with designs and processes close to perfection, I chose it as my specialization.

My first job, again, was in the VLSI (CAD) division of an Indian public sector company, ITI. Being a fresh grad out of college, I remember how I had to strive to get myself assigned to that division. One can imagine what kind of salary one could expect from an Indian public sector company at that time, 1990; in fact my father ridiculed me about my salary! But, in my personal opinion, life at ITI was great, a golden period when I wrote several tools for gate array, full custom and standard cell based designs and learnt a lot. We had a 3 micron foundry at ITI then, and I remember how secluded it was and how perfectly the dust-proof environment, chemicals and equipment had to be maintained. So ITI was the true catalyst in my professional life for my ‘tryst’ with EDA and semiconductors :). We used to review at least 10 of the best papers from IEEE journals, DAC proceedings and so on before implementing anything in our tools, and that was real fun. That is when my admiration grew for these global technical institutions and forums.

At ITI, in my roughly five years there, we had learnt the concepts of HLS, but only at the conceptual level. Implementation of logic synthesis was just starting, and we saw Synopsys taking the worldwide lead in that area. I moved to Duet Technology (long ago acquired by Motorola, now Freescale) and then to Cadence in 1997. Cadence was a real eye opener for me to the worldwide semiconductor and EDA industry. There were many things to learn: technology, tools, business, management, customers, strategic relations, partnerships, and so on. During my time at Cadence I saw how we gradually moved the level of abstraction up at several stages of design to address manufacturing issues such as DFM and DFY. That is when I realized we could do lithography awareness at the floorplanning level, which earned me a patent at the time!

Today, at sub-20nm, we are seeing designs become highly vulnerable to variability effects. The level of perfection that sufficed at 90nm or 45nm no longer works at 20nm. So the scale of perfection in designs has to improve further to get a working chip despite variability. And we are now seeing tools that account for variability effects; specialized formats have evolved to specify variability.

Semiconductors and EDA have been an interesting discipline in my life, one that tells me nothing is perfect, but you can make something work perfectly within a set of parameters, and that can go down to the level of electrons and protons. That’s physics, that’s engineering! Nothing is absolute, absolutely!

    Pawan Kumar Fangaria
    Founder & President at www.fangarias.com


Seeing Firsthand How the Internet has Changed Traveling
by Tom Simon on 08-19-2015 at 12:00 pm

We hear a lot of talk about the internet improving our lives, but most of the time this translates into time spent on Facebook, shopping on Amazon or other distractions. However, on our just completed trip to Europe I discovered how mobile internet connectivity can transform the experience of traveling.

    At home when I drive places I use Waze. It is extremely useful for getting directions and also avoiding traffic on routes you know well. If you are not familiar with Waze, it is a crowd sourced traffic app with real time route planning to avoid traffic jams. Also I occasionally use Google Maps to find shops, restaurants, etc. At home these things are nice to have. Believe me they help, but they are not what I would call essential.

Now let’s go back to the early eighties, when I traveled in Europe as a young backpacker. There was simply no finding a wifi hotspot and checking in on Facebook so your friends would know where you were. I remember looking for the Bundespost so I could wait in line to make a phone call back to the US. I did this about once a week so my parents would know I was OK.

    Even though my present day mobile carrier wanted a fortune for calls from Europe back home, I was able to use the Google Hangouts Dialer to make calls for free to the US as long as I had decent wifi. I use Google Voice for all my calls and I was able to just forward my incoming calls to Hangouts to avoid cell minutes – all on my phone.

But the big win was finding and getting to places. It used to be that if you needed to take a subway somewhere it was some work but totally doable; buses, and sometimes even trams, were a bit of a problem. Is this the right bus? Is this my stop? With GPS and directions in Google Maps all of this became child’s play. Even parsing the Paris metro no longer requires assuming the self-identifying tourist posture in front of the system map, which is an important tip for avoiding becoming a pickpocket victim. Everyone else is looking at their phones too, so it is easy to “fit in.”

One of the best examples of how empowering it is to have all the train schedules and transit info in the palm of your hand, while traveling in a country where you do not speak the language, is when the unexpected happens. Three times on our most recent trip we had trains stop mid-route because of air conditioning failures in the extreme summer heat.

Invariably we were told to get off in some small town, where we were left to figure out a new route. At one point we were in a crowd of about 100 people forced off a train and told to wait in 90+ degree temperatures for a bus to take us to our destination. However, it seemed no bus was coming for us. Then a local bus came by and people rushed it, only to realize it was not the ‘replacement’ bus. Using Google I determined that we could use this local bus and then a train to get to Linz in time to catch our connection to Hallstatt.

    The best part was that I had to convince the German speaking bus driver that our plan to get to Linz would work. Our uncrowded and air conditioned bus pulled away from the milling crowd, with us safely on our way to our next destination.

    Without the mobile phone, internet connection and Google Maps, we would have been left in the sweltering heat wondering if and when a bus might come, and then have to jam onto it with dozens of other stranded travelers.

This only scratches the surface of just how useful connectivity was on the trip. In a subsequent post I will talk about navigating to hotels, restaurants and watering holes. One thing that is certain is that we did more, saw more and fretted less about navigation on this trip than on any previous trip I have taken overseas, because of the information at my fingertips.


Only One Type of OEM Seems to Work in EDA
by Paul McLellan on 08-19-2015 at 7:00 am

OEM agreements don’t seem to work in EDA. Sometimes they are signed, but usually they turn out to be closer to joint marketing agreements. The reason seems to be that EDA software is complex and requires high-touch support, especially when the product is first installed and introduced into a production flow. The effect is that if a big EDA company has an OEM agreement with a small company, the customer knows they will get better support by working directly with the small company, which is more knowledgeable about the product and more focused.

Another issue is that big EDA companies typically do large deals on a whole bundle of software for a period of time, typically three years. Nobody really knows how much revenue should be attributed to which product, not even the customer or the large EDA company. The methodology I’m familiar with is to use a uniform discount: take the list prices for all the products, add them up, and work out the discount percentage implied by the deal size. Then apply that discount to each individual product to allocate the money to the different product lines. However, doing business that way doesn’t really work with OEM products that have some sort of royalty or license fee associated with them. A few copies of an OEMed product just complicate the deal. EDA salespeople are notoriously risk-averse, since they make most of their money rolling up big deals to sell companies software they already had in the previous deal. So the last thing they want is something that slows down the deal closing. In fact, it is often hard to get them to put their own company’s new products into a deal unless the customer is begging for them.
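The uniform-discount allocation is simple arithmetic; the sketch below shows how it works. The product names and dollar figures are invented for illustration, not drawn from any real deal.

```python
# Hypothetical sketch of uniform-discount revenue allocation:
# one discount rate, implied by the deal size, applied to every
# product's list price to split the bundled deal across product lines.

def allocate(deal_size: float, list_prices: dict) -> dict:
    """Allocate a bundled deal back to product lines using a uniform discount."""
    total_list = sum(list_prices.values())
    ratio = deal_size / total_list  # e.g. 0.8 means a uniform 20% discount
    return {name: price * ratio for name, price in list_prices.items()}

# Invented example: $1.5M of list price sold as a $1.2M bundle.
prices = {"synthesis": 500_000.0, "place_and_route": 800_000.0, "signoff": 200_000.0}
alloc = allocate(1_200_000.0, prices)
# Every product is booked at 80% of list, and the allocations sum to the deal size.
```

Note how an OEMed product with a fixed royalty breaks this scheme: its allocated revenue floats with the overall discount, while the royalty owed does not.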

    A further issue is at the strategic level. Let’s say a big EDA company has a hole in its offering that another company fills and signs an OEM agreement. Either the big company doesn’t sell much in which case why bother, or they sell a lot meaning the hole is really important and they should probably do an acquisition. But now they have to buy back their own revenue, or worse, another big EDA company pays even more and they are back to having a hole again.

The one type of OEM agreement that does seem to work is for components. The one I am most familiar with is Concept Engineering. Many EDA companies need a schematic viewer that takes a netlist (or the netlist data structure) and produces an attractive graphical representation, with the cells neatly placed and the wires neatly routed. Every EDA company (except Synopsys, who I guess had to build their own just to develop Design Compiler) uses the viewer from Concept Engineering. We used it at Ambit years ago, and then continued after we were acquired by Cadence. I’ll be surprised if Cadence’s new Genus synthesis tool doesn’t still use it. Concept Engineering has been delivering this technology for 25 years now.

    The reason this type of OEM deal works and product ones do not is that it is a component of the product. The customer buys the product and gets the component included. The salesperson doesn’t worry about the component slowing down their deal. The customer doesn’t worry about trying to get special support from the original creators.

    Concept Engineering also sells tools in the StarVision suite based on the same underlying technology. These tools are primarily used by companies importing IP to look at the blocks and gates and transistors to understand the IP quickly to be able to use it in a design or, perhaps, make modifications. But this is a product that those design groups want to get directly from Concept Engineering and its distributors, not through a large EDA company. And so that is what happens. It is almost the perfect exception to prove the rule that OEM agreements only work for components.