Let me clarify that by “IoT” I mean the IoT device market, made up of hundreds of applications, from wearable gadgets to medical devices, home automation, and so on. One direct consequence of the IoT (device) market explosion will be strong growth in the server (cloud) market, needed to transfer, compute and store the information generated by billions of IoT devices. Intel is certainly the chip manufacturer best positioned to enjoy this server market explosion. But that is not today’s topic. I am clearly referring to the IoT device market, where system cost and power consumption are the key parameters, along with security, wireless communication efficiency, etc., but let’s focus on cost and power. This post is also an answer to the Seeking Alpha article “Intel Is Mispriced And Positioned For IoT Growth”, written by someone who is probably very good at discussing the stock market, but who lacks a proper understanding of the semiconductor industry, especially when it comes to low power consumption and design for optimum (low cost) SoC integration…
Another useful clarification: when I say “never”, I mean “not within 5 years”. If you look at recent semiconductor history, it took Texas Instruments 5 years (between 1999 and 2004) to grow its wireless application processor revenues from less than $1 billion to more than $5 billion. It also took Qualcomm 5 years (between 2005 and 2010) to push companies like TI, ST or Broadcom out of this wireless business. In our industry, 5 years is equivalent to infinity, and when dealing with IoT, 5 years means 2020…
I see three reasons why Intel will not be ready by 2020 to generate high revenue and high profit from IoT devices:
- Intel’s wafer fabs and silicon technology are built for high performance processes (and processors) rather than for low power processes.
- Intel’s design culture is oriented toward building high performance devices (server processors, for example), not toward designing for power efficiency (as Qualcomm’s is).
- Intel’s top management has grown up in this “always higher performance” culture and has managed the whole company in line with it.
The first two can’t be changed without first changing the last one. I am not saying that Intel’s top executives should be fired… but they will have to revolutionize their way of thinking and admit that power efficiency matters more than pure performance, at least if Intel wants to be successful in the IoT device market (and in wireless mobile, by the way). Let’s imagine that this revolution occurs and that Intel hires capable managers with the right culture, like people who have been successful in developing application processors for smartphones.
I am sure that Intel could find ex-TIers or former Broadcom or ST employees with the right profile. How long will it take 1) to select and hire these people and 2) to give them enough authority to effectively influence the process people? Intel is not exactly a start-up, rather a kind of mammoth, so 1 year is the bare minimum.
Let’s assume that the right set of decisions is taken to build new processes (I did not say new advanced processes like 10nm, but rather processes based on 55nm, or maybe 40nm or 28nm), no longer for pure performance but for power efficiency. Developing and proving these processes would probably take another 18 to 24 months (don’t forget that low power design is not in Intel’s culture, so Intel will have to reach a point it took TSMC or TI 10 years to reach, process node after process node). Say 2 years.
Then, in parallel, Intel will have to build completely new design teams (in my opinion), managed by new gurus who know how to design a complete SoC for low power. Clock gating, power islands and probably other techniques will have to be used. At this point, a clever designer will say: Oops! We don’t have the right processor core! Let’s buy a RISC core (probably not ARM, but who knows…). Back in 2009, when TI was completing OMAP5 validation, I had numerous discussions with one of the TI program managers. I remember very well that he told me OMAP5 validation was extremely complex, precisely because of the low power techniques. If I remember correctly, it took more than one year after the prototype release simply to validate the chip. How long could a complete design phase take, starting with a new RISC core? I would say: 18 to 24 months (RTL design + P&R) + 3 months (prototype fabbed) + 12 months for prototype validation and S/W integration. That is 33 to 39 months, say 3 years.
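To see why techniques like clock gating are worth all this design and validation pain, consider the classic CMOS dynamic power equation, P = α·C·V²·f. The sketch below (with made-up, round numbers chosen purely for illustration, not measurements of any real Intel or TI chip) shows how gating the clock cuts the effective activity factor, and how voltage scaling adds a quadratic saving on top:

```python
# Illustrative sketch of why clock gating and voltage scaling matter.
# Dynamic power of CMOS logic follows P = alpha * C * V^2 * f, where
# alpha is the switching activity factor, C the switched capacitance,
# V the supply voltage and f the clock frequency.
# All figures below are assumed, round numbers for illustration only.

def dynamic_power(alpha, cap_farads, volts, freq_hz):
    """Classic CMOS dynamic power estimate, in watts."""
    return alpha * cap_farads * volts ** 2 * freq_hz

# Hypothetical always-clocked block: 1 nF switched capacitance, 1.2 V, 100 MHz
baseline = dynamic_power(alpha=0.5, cap_farads=1e-9, volts=1.2, freq_hz=100e6)

# Clock gating: the clock toggles only when the block is active,
# effectively cutting the activity factor (assumed 10x here).
gated = dynamic_power(alpha=0.05, cap_farads=1e-9, volts=1.2, freq_hz=100e6)

# Voltage scaling on top: dropping 1.2 V to 0.9 V gives a quadratic win.
gated_scaled = dynamic_power(alpha=0.05, cap_farads=1e-9, volts=0.9, freq_hz=100e6)

print(f"baseline:           {baseline * 1e3:.1f} mW")   # 72.0 mW
print(f"clock gated:        {gated * 1e3:.2f} mW")      # 7.20 mW
print(f"gated + V scaling:  {gated_scaled * 1e3:.2f} mW")  # 4.05 mW
```

A more than 15x reduction on paper, but as the OMAP5 story above shows, verifying that every gated clock and power island wakes up correctly is where the real schedule cost lies.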
That makes a minimum of 4 years (1 + 3) before Intel can release a low cost, low power (new) RISC-based SoC running in Intel’s own fabs (don’t forget that the goal of an IDM like Intel is to fill its own wafer fabs, right?). Obviously, using an ARM-based SoC targeting TSMC (or Samsung) would be much more time effective, but does it really make sense from a business point of view?
One last point: I think I understand why these stock market analysts are confused. They know that Intel is good at embedded. But in that case we are talking about embedded x86 integrated onto boards, like in this picture:
…when an IoT device will look more like this system on a board (the WiPy, built around a TI chip, but the web is full of similar systems):
Why such confusion? Is it because an IoT system is by definition based on an embedded processor (here an ARM Cortex-M4 core)? But an embedded board from Intel has nothing to do with an IoT system; it is just a high performance, but power hungry, processor mounted on a board, needing an expensive cooling system (see the picture). Such a board is not cost effective (the cooling system alone probably costs as much as the complete TI-based IoT solution!) and, even more crucially, it is not power efficient. An IoT device, and more precisely a wearable one, should last at least a week, if not a month, between charges.
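That battery-life requirement translates directly into a current budget, and some back-of-the-envelope arithmetic shows how brutal it is. The battery capacity and lifetimes below are assumed, round numbers for a typical small wearable, not figures from any specific product:

```python
# Back-of-the-envelope current budget for a wearable IoT device.
# A battery of capacity Q (mAh) drained over D days allows an
# average current of Q / (D * 24) milliamps.
# The 200 mAh capacity below is an assumed, typical wearable figure.

def average_current_budget_ma(capacity_mah, days):
    """Average current (mA) that fully drains the battery in `days`."""
    return capacity_mah / (days * 24.0)

monthly = average_current_budget_ma(200, 30)  # one month between charges
weekly = average_current_budget_ma(200, 7)    # one week between charges

print(f"one month between charges: {monthly * 1000:.0f} microamps average")
print(f"one week between charges:  {weekly:.2f} mA average")
```

A month of life on 200 mAh leaves an average budget of roughly 278 microamps, radio included; even a week only allows about 1.2 mA. A fan-cooled x86 board drawing tens of watts is off by four to five orders of magnitude, which is the whole point.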
Maybe by 2020 Intel will be in a position to launch such systems, but many barriers will have to be broken in the meantime…