
TI’s Way of Strategies – Formation & Execution

by Pawan Fangaria on 02-26-2014 at 8:30 am

For a company to stand firm and continue to prosper after facing several downturns over its 80+ year history, and still move swiftly with strong commitment and confidence, its strategy has to be right and rock solid, possessing sustainable competitive advantage; and of course it has to be an early mover in everything it does, with a determined point of view. That is exactly what my close observation tells me about Texas Instruments. What matters most in the formation of a strategy is how early it is spotted and how swiftly it is acted upon. Even more important are its implementation and fast execution. At times strategies can go wrong, but if executed fast enough, one can quickly sense the results and change course as appropriate; however, one needs to have the knack of sensing that direction.

In my analysis of TI from the days of its incorporation, I have found that it is very apt at anticipating future movements and directions early, and at moulding itself (before anyone else does) to deliver on them. Its strategies are motivated by external opportunities and fuelled by internal resources and capabilities built up over time; hence TI generally has mixed strategies. Look at its 41,000+ patents and the timing of some of the prominent ones among them, then at some of the key developments based on those patents, and even the developments based on external licenses. Credit goes to the visionaries in this company.


[TI’s R&D Kilby Center]

After replacing vacuum tubes with transistors, TI was quick to spot the opportunity for ICs to re-shape the electronics industry. While serving the U.S. defense establishment’s electronics needs, it also built up semiconductor IC manufacturing capability to seize the massive opportunity in consumer electronics. Today, microelectronic manufacturing is one of TI’s strong core competences, fuelling many of its businesses. TI spends about 12.5% of its revenue on R&D, which includes basic research, new product ideas and far-reaching areas such as smart grid and healthcare, to keep an eye on major future innovations and developments.

Let’s look at some of the strategic moves TI has taken over a long period of time; it is interesting to watch the speed of their execution. While working for the U.S. Defense Department, TI tasted its first fruits of electronic success with silicon transistors, the portable radio, ICs, the first computer for the U.S. Air Force, the calculator, and the like. Seeing a bigger opportunity in semiconductor electronics, TI negotiated in parallel with IBM for the latter to start using TI’s electronic components in IBM computers. In 1970, the microprocessor chip was developed. Within roughly 10-15 years of the IC’s birth, the semiconductor industry had grown into a multi-billion-dollar industry, and TI, as an early mover in this business, created a prime spot for itself. TI entered all sorts of businesses involving microprocessor-controlled devices, spanning industrial applications as well as consumer electronics. It even introduced the one-time favourite digital watch (at $20) in 1976, and came up with speech synthesizer devices as educational aids. It opened semiconductor manufacturing facilities across the world, starting in 1957 with its first facility outside the U.S., in Bedford, U.K., to supply semiconductor-based electronics to Western Europe, and then in other parts of the world.

By around 1975, competition had moved in and TI faced a price war, first from Bowmar Instruments in calculators and then from Asian manufacturers of consumer electronics. It started losing market share in calculators, watches, LCDs and so on. While it was losing these businesses, it also lagged in fulfilling orders for semiconductor chip manufacturing (the cash cow). Some of TI’s strategies did not work as desired, mainly on pricing; the price-skimming strategy no longer worked. In late 1979, TI introduced a home computer (at $1,400) but soon lost that business too in a price war, and in 1983 TI posted its first loss. The semiconductor slump of the early 1980s added to TI’s misery; it had to reduce its workforce by ~10,000, and financial losses continued until 1985. This is when TI wasted no more time in realizing that it needed to focus on high-margin businesses attached to its technological edge and core competences.

Under the leadership of Jerry Junkins, from 1986 TI took a major strategic turn. It refocused its innovation drive and manufacturing competence on high-margin custom microprocessors and DSP (Digital Signal Processing) cores instead of low-margin chips. TI also began earning royalties on licenses to its patents, initially by filing suits against DRAM manufacturers who were selling DRAMs without obtaining licenses from TI. Junkins also initiated collaborative (B2B) businesses with major players such as Hitachi in Japan, Sun Microsystems, General Motors, L.M. Ericsson and Sony, and alliances with Acer, HP, Canon and the governments of Singapore, Italy and others. By 1993, TI’s revenue had swelled again, with almost a 60% increase in revenue per employee, led by the custom and speciality segment of electronic components and also by software. During this time, TI had spotted a major lucrative business in the DSP area and invested heavily in building this capability. In 1994, it came out with the first single-chip multimedia video processor (which can be termed an SoC, combining multiple DSP and RISC cores). DSP became another core competence of TI, gaining major market share and expansion. While TI acquired many DSP companies, it sold its low-margin memory business to Micron. Smart move!!

TI kept its top position in the analog products market and entered the new millennium with solid revenue of slightly less than $12 billion and a ~22% profit margin. The company weathered the worst downturn of the semiconductor industry in 2001-3, albeit with falling share prices. In the new millennium, its focus turned towards wireless handsets and DLP (Digital Light Processing) technology. That is when TI convinced Nokia to use its DSP chip as the core of its cell phones, before Nokia became the champion of cell phones. Rich Templeton led TI into the cell phone revolution through its wireless phone chips. In a crude sense, we can say that when Intel pushed TI out of the computer chip business, TI spotted the cell phone opportunity, which later started pushing down the PC market in general; that is, it stayed ahead in the game! Sometime later, I will talk more about how TI reaped rich benefits from the wireless and cell phone (smartphone) business until it recognized, well in advance, that the smartphone business was maturing and exited it, as usual, well ahead of anyone else.

In the wireless area, TI continued with an increased focus on embedded processors for sustainable growth, and a continued focus on its core analog semiconductor business. It will be interesting to watch TI’s moves in the MEMS business, the next big thing for the IoT (Internet of Things). TI is the top revenue maker in the MEMS business, with ST as its main rival. At CES 2014, TI’s new DLP chipset technology turned heads. This DLP chipset makes it possible to develop pocketable projectors with impressive brightness and resolution; Sekonix is expected to come out with products this year. The DLP chip, invented at TI, can have an array of up to 8 million microscopic mirrors. Used in office projectors, cinema projection, IMAX, TVs, mobile displays and several other industrial, automotive, medical and security applications, DLP technology is one of TI’s major strategies of the current era. I will talk more about DLP technology later, and also about power management, as TI keeps a close watch on developments such as smart grid and energy harvesting in the near future.

Okay, I would like to wrap up this article with some closing remarks. Most of the time, TI has been successful in microelectronics semiconductor manufacturing (its core competence) and related diversification. That has led TI to build other core competences as well, such as analog and embedded processing, DSP, DLP and so on. As TI’s sharp focus on upcoming big things, and its speedy build-up of expertise in those areas, has always crowned it with victory, it will be interesting to watch how it plays in the IoT revolution, which will infiltrate most verticals: home & consumer, automotive, aerospace, healthcare, industrial, military, retail, security & surveillance, entertainment and so on. And TI already holds a dominant position in the MEMS business. Rich Templeton is a sharp, forward-looking leader; it appears that he has his strategy set for the new era of “number of devices per person” instead of the outdated “number of persons per device”.

More articles by Pawan Fangaria…



A Brief History of Chip Design at Apple Computer

by Daniel Payne on 02-25-2014 at 9:36 pm

In 1976, Steve Wozniak designed the Apple 1 while working at HP during the daytime, and he used standard parts to keep costs low, like:

  • 6502 CPU from MOS Technology
  • 8K of DRAM
  • TTL logic for driving video and random logic
  • PROM to hold the BASIC language and primitive OS

Continue reading “A Brief History of Chip Design at Apple Computer”


Getting an MPW Quote on My iPhone

by Paul McLellan on 02-25-2014 at 12:00 pm

As I blogged about recently, eSilicon have completely automated the quote process for their MPW shuttle service. You can use an online interface that runs in the browser but there is also an app that you can download from the App Store.

So I decided I had a few million dollars to burn and I’d get myself my very own TSMC 20nm parts. So I start by filling in the process. MPWs are somewhat granular: the foundry has a “block size” which represents both the smallest die you can purchase and the unit used to round up the die size. I have an 8x8mm die, so 64mm². I’m going to put it in flip-chip BGA packaging, so I need to pick that, and the die needs to be bumped.
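That block-size rounding is easy to sketch. Here is a toy calculation of the billable MPW area; the 4 mm block grid is a made-up illustrative number, not eSilicon’s or TSMC’s actual granularity:

```python
# Illustrative sketch (not eSilicon's actual pricing logic): MPW die sizes
# are rounded up to multiples of the foundry's "block size" before quoting.
import math

def mpw_billable_area(die_w_mm: float, die_h_mm: float, block_mm: float) -> float:
    """Round each die dimension up to the block grid and return billable area in mm^2."""
    w = math.ceil(die_w_mm / block_mm) * block_mm
    h = math.ceil(die_h_mm / block_mm) * block_mm
    return w * h

# An 8x8 mm die on a hypothetical 4 mm block grid bills as exactly 64 mm^2,
# while a 9x9 mm die would round up to 12x12 = 144 mm^2 on the same grid.
print(mpw_billable_area(8, 8, 4))
print(mpw_billable_area(9, 9, 4))
```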


Since I’m going to do extensive characterization, let’s get wafers from 5 process corners (typical, fast-fast, fast-slow, slow-fast and slow-slow). I seem to get 100 of each.


That’s it. Request the quote and a few seconds later there it is.


Looks like it is going to cost me around $3.5M. I think I might hold off on signing this signature page for now since I haven’t started the design. Not to mention not having $3.5M lying around.

And here’s what I have to supply eSilicon so that they can have it manufactured.

Wow. That was pretty simple. I can now iterate and see what it would be if I used 28nm instead of 20nm. The die will be 10mm on a side now. Let’s not package the die. And not bother with anything other than the typical corner.


Just $1.7M now. Half the price.

You can play this game at home. Just sign up on the eSilicon website here.

More articles by Paul McLellan…


Skate to where the mobile puck is headed, Intel

by Don Dingee on 02-24-2014 at 5:00 pm

Mobile World Congress 2014 has already showcased two very different mobile SoC machines in high gear. After watching one big US moment and Canada otherwise dominate everything involving ice and a stick at the Sochi Olympics, I’m reaching into the Wayne Gretzky pile of quotes for a metaphor to examine Intel’s move – and why they are still doomed in phones unless Qualcomm slips and falls.

Intel has reached into their bag of tricks and targeted Qualcomm dominance in two ways: announcing a category 6 LTE modem, and the 64-bit mobile application processor roadmap. In both cases, their problem is clear: they appear to be skating to where the puck is right now, not where it is headed.

AP photo: Bruce Bennett

The XMM 7260 is Intel’s attempt to get into the LTE chipset game, but long story short: they are already a node behind at debut, on a TSMC 28nm process compared to Qualcomm’s Gobi 9×35 which will be on 20nm at about the same time. You read that correctly: this key piece of Intel strategy is fabbed at TSMC. Intel’s only play here, surprisingly, is price; however, as soon as they wander into stand-alone LTE modem space they will find difficulty against single chip SoCs with LTE integrated.

The other news is Intel’s worst-kept secret: Merrifield, today launched as the Intel Atom Processor Z34. Weighing in with dual 64-bit Silvermont cores (stolen at least in concept from Bay Trail) at up to 2.13 GHz, with an Imagination PowerVR G6400 GPU at 533 MHz, Merrifield gets onto 22nm, enabling Intel to bring its fab power to bear. Close behind, Moorefield – Atom Z35 – ups the ante to 4 Silvermont cores with doubled cache at up to 2.33 GHz, and the PowerVR G6430 GPU. Details are in the excellent AnandTech post on Merrifield.

In a stunning change of behavior, Intel is now directly and publicly comparing themselves to Apple, showing Merrifield benchmarks 16% better than Apple A7 – but 9 months behind in availability. This is interesting because it is pretty clear Intel isn’t going to win over Apple anytime soon, and the best they can hope to do is win some faceoffs with Qualcomm.

Which brings us to the Qualcomm news: the Snapdragon 610 and 615, driving 64-bit into their mid-range offering. The 610 is a quad-core ARM Cortex-A53, an Adreno 405 GPU, and an integrated 9×25 cat 5 LTE modem. The 615 is a curiosity in and of itself: “octa” in name, but really two clusters of quad ARM Cortex-A53 cores. While Qualcomm works on a homegrown 64-bit Krait likely destined for their high end parts, they are dragging some pretty advanced features into the mid-range. A bit more info is in this ArsTechnica post on the announcement.

Intel keeps using the word “competitive” and mobile in the same sentence, but in fact Qualcomm has them surrounded and contained into the mid-range at best. The shoe is now on the other foot: the same strategy Intel used to engulf PowerPC and drive it into PC extinction is now being used against them in mobile. It must be very easy for Qualcomm teams to show their entire roadmap to OEMs and win right now, and I don’t think Intel will be very competitive when the rest of the high-end Qualcomm roadmap comes to light later this year.

Undeterred, Intel keeps skating – there’s a 14nm Broadwell on the vision map, but they have to be very careful not to Osborne themselves by holding back details on how that projects to mobile. Unfortunately, if they keep coming out with essentially one mobile chipset a year (bumps not counted) and very few design wins (and there weren’t any announced today), they are going to find themselves shaking hands at the mobile red line very soon. They have to skate to where the puck is headed, not where it is right now, to have a chance.



SoC Functional Verification Planning and Management Goes Big

by Daniel Payne on 02-24-2014 at 10:01 am

Big SoC designs typically break existing EDA tools and old methodologies, which then give rise to new EDA tools and methodologies out of necessity. Such is the case with the daunting task of verification planning and management where terabytes of data have simply swamped older EDA tools, making them unpleasant and ineffective to use.

Last week I spoke by phone with John Brennan of Cadence to learn about their decision to develop a totally new EDA tool for SoC verification planning and management. This is a product area familiar to Cadence users through the 10-year history of the Incisive Enterprise Manager (IEM) tool. The new tool is called Incisive vManager and it was designed to handle the biggest SoC verification tasks by using:

  • A client-server approach
  • Sophisticated verification management
  • A scalable, database-driven technique

With the old way you had to comb through reams of data to see whether you could optimize your verification; with the new way you collaborate with all team members throughout the design and verification process, and everyone has easy access to the progress. Benefits of the new approach include about a 2X improvement in functional verification effort, once you are fully trained.

Reducing verification time and improving coverage are a big deal because a typical SoC at 40nm carried a $38M verification cost, according to 2013 data from IBS. For projects at 20nm, that verification cost can reach $100M.

A project manager really wants to know a few things:

  • What is my schedule until DV is complete?
  • What are my costs to reach tape out?
  • Are there any functional bugs that will cause a re-spin or recall?

With vManager you can improve schedule predictability, verification productivity and design quality. Here’s a closer look at how this happens. By having all of your functional verification metrics visible in a GUI you can work on the most critical failures first:

If there’s a block on your design with a low test grade, then the manager can shift verification resources to get caught up:

Failure analysis determines which failures are the same, and helps identify only the most critical, thereby eliminating redundant cycles:

Finally, for optimizing your verification plan there is benefit to finding precisely where your coverage holes are, so that you are quickly aware and can take early action:

Verification productivity improves by:

  • Up to 30% using reporting automation and closure automation
  • About 25% better compute farm utilization with MDV (Metric Driven Verification) versus directed testing
  • Up to 10X improvement in bug discovery with MDV versus directed testing
  • A 60% reduction in verification time with MDV

The vManager tool flow has the following components:

If the vManager approach looks interesting, you can learn more by attending a two-day workshop, followed by your own evaluation.

In summary, you should consider this new generation tool if your current generation tools are limiting the number of runs and coverage nodes required:

If you already use Cadence tools for functional simulation, formal and hardware acceleration, then give vManager a look. This new tool has been used by ST and others over the past year, so it sounds field-tested to me. If you visit DVCon in March, consider attending a Cadence tutorial or a customer paper presentation. Should you be tempted to write your own functional verification management environment, expect to spend about 50 man-years of development effort and ~2 million lines of code to catch up to vManager.



More things on the DSP frontier at MWC14

by Don Dingee on 02-23-2014 at 12:00 pm

With a well-chronicled share inside cellular baseband interfaces for mobile devices, one might think that is the entire CEVA story, especially going into Mobile World Congress 2014 this week. MWC is still a phone show, but is becoming more and more about the Internet of Things and wearables, and CEVA and its ecosystem are showing solutions for these spaces.

One of the unique features in the Samsung Galaxy S4 was “smart pause” – pausing content playback when the user is distracted and looks away from the screen. This offers convenience so nothing is missed; it also serves as a power-saving feature. That same concept goes into facial activation, pioneered by Visidon and enabled with the CEVA-TeakLite-4 sensor fusion capability. Offloading the facial activation algorithm to a low-power DSP core means considerable power savings in a device, allowing the capability to be always-on in background waiting for the user. Visidon AppLock (available for Android in Google Play) also serves as a biometric security mechanism, looking for a particular user’s face before allowing access to an app.

Similarly, the popularity of Nuance Dragon, Apple Siri, Google Now, and Xbox One Kinect voice commands is driving user expectations for that kind of speech recognition capability in everything – even small, low-power devices. Again, the ability to remain always-on listening for voice commands calls for a low-power DSP core, and the CEVA-TeakLite-4 comes into play in the latest implementation of Sensory TrulyHandsfree Version 3.0. Sensory has some unique algorithms that allow users to be as far away as 20 feet while delivering commands, plus the ability to filter out background noise, claiming 95% accuracy without false fires. Sensory has traditionally offered its own processing silicon, but teaming with CEVA allows the capability to be offered directly in CEVA-enabled SoCs.

Even more powerful solutions are using DSP in devices combined with the cloud to provide emotion recognition. The basic use case is to gauge reaction to content, while the user watches on a mobile device, game console, or digital signage platform with a front-facing camera, as the stream plays. Advertisers, content producers, political pollsters, and others can determine not only if their message was viewed, but how the viewer feels about what they see and hear without need for the user to respond to an overt poll request. CEVA has partnered with nViso to bring facial micro-expression recognition software to the CEVA-MM3101 vision platform, again with the implementation taking a fraction of the power otherwise needed. This embedded vision platform is integrated with CEVA’s Android Multimedia Framework – our own Eric Esteve provided background on AMF previously.

CEVA has even more on display in Barcelona at #MWC14; visit http://events.ceva-dsp.com/mwc14 for videos of demos of these and other DSP-enabled applications for mobile, IoT, and wearable devices, and follow @CEVADSP on Twitter.

The theme here is consistent: optimized DSP cores and algorithms can provide a huge power savings, even running complex voice and imaging algorithms, and enable more natural inputs for devices. This capability is going to get a lot more important for wearables, which will not have the luxury of virtual keyboards and larger touchscreens just due to their reduced size, and will need to be very power efficient for operation on small batteries. CEVA and their ecosystem are rising to the challenge and creating new solutions for designers working in these tight spaces.



The Future of Money is Digital – Part 2

by Sam Beal on 02-23-2014 at 11:30 am

BitCoin Algorithm
Bitcoin was invented by a mysterious person or group using the alias “Satoshi Nakamoto”. [You can read a consolidation of the paper here]. The essential elements are:

  • Peer-to-peer network with self-validation
  • Exponentially increasing compute cost
  • Finite supply with exponential conversion
  • Hidden-in-plain-view anonymity
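The “exponentially increasing compute cost” comes from proof-of-work: a miner must find a nonce whose hash falls below a target, and each extra bit of difficulty doubles the expected number of hash attempts. A minimal sketch, for illustration only (real Bitcoin mining uses double SHA-256 over a structured block header, not this simplified form):

```python
# Toy proof-of-work: find a nonce giving a hash with N leading zero bits.
# Each extra difficulty bit doubles the expected work -- the exponential cost.
import hashlib

def mine(data: bytes, difficulty_bits: int) -> int:
    """Search for a nonce so sha256(data + nonce) has `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)   # hash values below this qualify
    nonce = 0
    while True:
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# 16 bits of difficulty: ~2**16 hashes expected; 17 bits would double that.
nonce = mine(b"toy block header", 16)
digest = hashlib.sha256(b"toy block header" + nonce.to_bytes(8, "big")).digest()
assert digest[:2] == b"\x00\x00"   # first 16 bits of the hash are zero
```

Verifying a solution takes one hash, while finding one takes exponentially many; that asymmetry is what secures the peer-to-peer, self-validating network.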

Continue reading “The Future of Money is Digital – Part 2”


Glasses Refocus Mobile Power Design

by Bill Boldt on 02-23-2014 at 11:00 am

Contextual awareness likely will emerge as one of the most exciting new user experiences enabled by wearable products. It happens when a mobile device, carried or worn, senses the user’s surroundings and presents information, offers advice, or controls itself and/or other devices according to that specific environment. This new experience will influence how users see the world, affecting their interactions with other types of screens, some of which have not even been invented. New styles of personal screens such as smart glasses and smart watches are popping up like weeds now.

A screen that you wear is more natural and personal because it becomes an extension of you, unlike a smartphone you carry. So look into glasses platforms and you just might see the future, at least the future of the mobile platform. With glasses an image can be projected right into users’ eyes, overlaid on their view of the real world. This merging of real and virtual realities will create a bit of an Alice in Wonderland world where reality is altered like never before. The first likely major use of augmented reality will be for location-based services. Descriptions and information about the user’s current neighborhood, a museum exhibit, the building, the product the user is looking at, or the person the user is talking to (which is sort of creepy) will be available at the literal blink of an eye.



Connected glasses can provide context for people you meet.

Major mobile handset makers are already patenting contextual awareness methodologies. The implementations are rudimentary right now, but point to a future of augmented reality. Augmented reality is all about creating a more natural interaction with the digital world while living an analog life. Simple contextual awareness applications could include a phone that recognizes what the user is doing and presents options and services that make sense at the time. Just imagine a smarter Siri that knows where you are and suggests what you might want to do. Now imagine a phone that has learned your particular gait when walking, senses whether someone else has taken your phone, and locks it down until that person is authenticated. There is a smartphone already on the market that keeps the screen lit while the user has eye contact with it.

These new platforms present many new opportunities to pursue innovation in miniaturization, power management, connectivity, sensing, and control. Their requirements play right into the hands of semiconductor innovators. But innovation is more than just a fashionable slogan or something that magically appears after taking a design class at some prestigious university. Innovation requires the ability to pick up signals of the future out of the noise of the present and, most importantly, not be deluded by one’s own biases. A dirty little secret about shaping the future is that the look of the future is literally created by industrial designers. So, it is a great idea to start there for clues. Perfect examples are smart glasses, smart watches, fitness bands, and other wearables. These new form factors are on their way and put a lot of stress on a product’s physical design. Such stress leads directly to the need for innovative ways to route power inside a much more complicated and constrained physical structure. The old ways will simply not work as well as they used to. Future happens. Look at smartphone power management for a case in point.

Distributed power to follow distributed ICs

In smartphones the trend until now has been to gather power blocks all in one place — the PMIC. However, the PMIC’s monopoly is being challenged as power functions are being disintegrated and spread around in novel configurations, because physical design constraints demand it. So the PMIC trust is being busted, with disintegrated, specialized, and distributed PMICs, called micro-PMICs, poised to take over. Micro-PMICs are already appearing in modular phone concepts from companies like ZTE and Motorola, and there will be others. This is tangible evidence that should get the attention of doubters. While these examples are still somewhat exotic and experimental concepts, they do point to a more distributed, decentralized architecture well suited for tomorrow’s wearable, mobile, and remote sensing products. It is probably a good time to abandon old-school biases and pick up the signals of the future.

Bill Boldt, Market Research Guy



A Brief History of Kandou Bus

by Daniel Nenni on 02-23-2014 at 10:45 am

Kandou Logo

Kandou Bus uses a novel form of spatial coding to transmit data between wired chips. The main idea is to introduce correlations between the signals sent on the interface, and choose the correlations judiciously to lower the power consumption, increase the speed, and lower the footprint. It is a generalization of differential signaling (which sends correlated signals on two wires). The company is a spinoff of Dr. Shokrollahi’s lab at the Swiss Federal Technical Institute in Lausanne.

The company originated in a casual conversation about using channel coding to improve the throughput of DSL lines. The conversation abounded with “differential signals over twisted pairs.” “What is differential signaling?” asked Dr. Shokrollahi, whose background is in mathematics, algorithm design, and channel coding rather than in electronics. Once he heard what it was, he asked, “So, how many wires do you use to send, say, 10 bits?” When he heard that the solution would be to use one pair for every bit sent, he immediately saw the inefficiency of the system and used his coding background to come up with a new solution in which signals were “smeared” across multiple wires. Thus the idea of “Chord Signaling” and the company Kandou Bus were born! The name of the company is the Farsi word for beehive. Just as a hive’s output relies on collaboration between the bees, the superior properties of chord signaling are obtained through collaboration between the signals on the wires.

Over the course of the following 10 months, Dr. Shokrollahi assembled a team of electronics and communication engineers, developed the new theory of chord signaling, and produced the first proofs of concept. The team focused on a typical mobile memory link and taped out the very first instantiation of chord signaling, capable of transmitting and receiving signals at 6.25 Gbps per wire over 10 cm of PCB trace. In this instantiation, 8 bits were dispersed over 8 wires, but in a way that makes the transmission as noise-resistant as differential signaling (which would require 16 wires). Soon after that prototype came back from the fab, the team started developing a second full transceiver capable of transmitting and receiving up to 16 Gbps per wire over a challenging channel. The results of this chip were presented at ISSCC 2014.
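The wire-count arithmetic behind that comparison is easy to sketch. Differential signaling spends a wire pair per bit; a balanced code (equal numbers of high and low levels in each codeword) shares wires while keeping differential-style common-mode rejection. This toy two-level version is only a crude stand-in for Kandou’s actual Chord codes, which also use more than two signal levels per wire to reach 8 bits on 8 wires:

```python
# Toy wire-efficiency comparison (illustrative; not Kandou's actual codes).
import math

def differential_wires(bits: int) -> int:
    """Differential signaling: one correlated wire pair per bit."""
    return 2 * bits

def balanced_code_bits(wires: int) -> int:
    """Bits per transmission using binary balanced codewords
    (half the wires driven high, half low) on `wires` wires."""
    return math.floor(math.log2(math.comb(wires, wires // 2)))

print(differential_wires(8))   # 16 wires to carry 8 bits differentially
print(balanced_code_bits(8))   # 6 bits on only 8 wires, C(8,4)=70 codewords
```

Even this crude binary code carries 6 bits on 8 wires versus 4 bits for differential pairs, which is the basic win from spreading correlated signals across many wires instead of one pair per bit.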

In the meantime, Kandou Bus has secured a Series A financing round and has developed a product strategy to attack applications as varied as high speed networking, memory links, short chip-to-chip links via interposers, and low-power, high speed communication over TSVs. The company is also active in the OIF-CEI and IEEE 802.3bj standards bodies and is proposing its technologies as solutions to the interconnect problems of various industries.

About Kandou Bus S.A.
Headquartered in Lausanne, Switzerland and founded in 2011, Kandou Bus is an innovative interface technology company specializing in the invention, design, licensing and implementation of unmatched chip-to-chip link solutions. Kandou’s Chord™ technology lowers power consumption and improves overall performance of semiconductors, unlocking new capabilities in electronic devices and systems. http://www.kandou.com.


SEMICON China Shanghai 上海

by Paul McLellan on 02-23-2014 at 10:30 am

SEMICON is not just the event in San Francisco every July; there are other SEMICONs around the world. Coming up next: Shanghai, China. In fact there are four colocated events:

  • SEMICON China 2014, March 18th-20th
  • The 8th PV fab managers’ forum, March 17th-18th (all things PhotoVoltaic)
  • FPD China 2014, March 18th-20th (all things Flat Panel Display)
  • Solarcon China 2014, March 18th-20th (that would be solar)

All events are in the Shanghai New International Exhibition Center (SNIEC). There are all sorts of theme pavilions for technologies such as LED, TSV, Smart Life, MEMS and more.

Keynotes are from:

  • Tetsuro (Terry) Higashi of Tokyo Electron (SolarCon keynote)
  • Dr Tzu-Yin Qiu, CEO of SMIC
  • Dr Walden Rhines, CEO of Mentor. Expect lots of data from Wally, it’s his thing
  • Bill McLean of IC Insights (FPD keynote)

The keynotes are all in the Kerry Hotel, Pudong, Shanghai in the Shanghai Ballroom from 1-6pm on Tuesday March 18th. Simultaneous translation in English and Chinese will be provided.

Coming up next in the SEMICON world tour, SEMICON Singapore in the Marina Bay Sands Hotel on April 23rd-25th. Last time I was in Singapore I got a cheap room there on hotels.com. If you visit you have to go see the swimming pool on the top floor. This is the hotel that has become an iconic building in Singapore with three towers and a long swimming pool, bar, beach that stretches like a boat across all three. I’m as baffled about how they built it as I’m sure civil engineers are about how we make chips.

The focus of SEMICON Singapore is Enabling Mobility for IoT with Advanced Semiconductor Technology Innovations: A Southeast Asia Perspective. If you are part of the highly influential Southeast Asia microelectronics manufacturing ecosystem involved in semiconductors, LED, MEMs, printed/flexible electronics and other adjacent markets—this is a must-attend event.

A few more for your diary:

  • SEMICON Russia May 14th-15th in Moscow. I didn’t even know semiconductors were a thing in Russia
  • SEMICON West, July 8th-10th in San Francisco as always. I’ll see you there.
  • SEMICON Taiwan, September 3rd-5th, Taipei
  • SEMICON Europa, October 7th-9th in Grenoble
  • SEMICON Japan, December 3rd-5th in Tokyo

Details for Shanghai (in Chinese and English) here. Details for Singapore here. The SEMI page for all the other conferences is here.


More articles by Paul McLellan…