
5 Things Chipmakers Are Missing on the IoT
by Don Dingee on 06-07-2015 at 7:00 pm

When the RISC movement surfaced in 1982, researchers analyzed UNIX to discover what instructions multi-user code was actually using, and then designed an instruction set and execution pipeline to do that better. Fewer instructions meant fewer transistors, which led to less power consumption – although in the original Berkeley RISC disclosure, the word “watts” never appears. Even during the early development of ARM, lower power consumption was completely serendipitous.

As the mobile SoC began gathering momentum in 1992, the benefits of fewer transistors, smaller dies, and less power were obvious. New developments were necessary. Low-power DSP capability – whether through hardware multipliers and SIMD enhancements or through efficient DSP cores – was a must for GSM signal processing. Code expansion was a BOM killer, since more code meant more memory, giving rise to the denser ARM Thumb instruction set. Efficient Java execution gave rise to ARM Jazelle.

In 2002, smartphone efforts ramped up. Faster processors such as ARM11 appeared. Graphics needed to improve, leading to the development of mobile GPU cores such as the Imagination PowerVR MBX Lite, and later to OpenGL ES. Operating systems started to change, signaling a shift from Symbian, Palm, and Microsoft to newer ideas. Android was just a twinkle in Andy Rubin’s eye, and Apple was playing with the beginnings of Project Purple and multi-touch.

Each of these phases blew up everything we thought we knew about chipmaking. Running in parallel was a constant push for more, smaller transistors, driven by the economics of the PC and later by consumer devices. This led to bigger wafers and smaller geometries and FinFETs and FD-SOI and gigantic FPGAs and manycore processors.

It’s 2015, and the Internet of Things is here. We should be talking about a fundamental shift in the way chips are designed and made specifically for the IoT – but we’re not, because it hasn’t really happened yet.

True, we have dozens of microcontroller architectures and billions of chips out there. These were designed to put intelligence on a point. Control some buttons. Light some LEDs. Spin a motor. Read a sensor. Automotive and industrial types discovered they could be put on a simple bus, like CAN. Some really brave folks started putting radios on chip, like 802.15.4 or ISM band, and protocol stacks like ZigBee and Bluetooth and Thread found homes. That led to substantial IoT progress from the likes of Atmel, Microchip, NXP (née Freescale), Silicon Labs, TI and others – but not a breakthrough like those we saw in earlier phases, at least so far.

DAC52 is offering a Management Day on June 9th with two sessions discussing “big data”: one from the perspective of behavioral analysis and design closure in EDA, the other on possible trade-offs in connectivity. At least we are talking. When it comes to the IoT, we know we have way too many connectivity standards and not enough data-level interoperability.


But we still don’t have the right chips, or the right discussion. What we have is what I call “the IoT paint job”, where everyone lists IoT on their website and booth signage to draw traffic. Just watch how many press releases come out of DAC with the term IoT somewhere. Not to disparage anyone in particular – there is some good stuff happening, and there’s some fluff. ARM is making great strides with its focus on the IoT. Mentor understands embedded software versus SoC design, and Synopsys has its ARC core and virtual prototyping.

What I’m saying is we need more actual IoT progress. At least 5 things are missing:

1) Processes. Somewhere in between 14nm FinFET and 130nm BCD lies a sweet spot for the IoT. We know mixed signal and embedded flash get difficult below 28nm. MEMS also presents some economic challenges. Talk of trillions of chips and 2-cent parts makes most chip firms yawn – it hasn’t happened, and frankly it isn’t a sustainable model for most companies, especially the ones tied up at the 14nm end of the spectrum needing bigger ASPs to offset billions of capex dollars. Where is a true, dedicated IoT process that can handle both the technology and the business model? (Hint: ARM recently announced 55ULP initiatives with TSMC and UMC.)
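
A quick back-of-the-envelope in Python shows why the 2-cent math makes chip firms yawn. The volume, market share, and capex figures below are purely illustrative assumptions, not anyone’s actuals:

# Why "trillions of chips at 2 cents" is a hard business.
# All numbers are illustrative assumptions, not actual company figures.

units_per_year = 1e12                   # the oft-quoted "trillion chips" scenario
asp = 0.02                              # 2 cents per part
industry_pool = units_per_year * asp    # $20B/year, split across the whole industry

share = 0.05                            # assume one vendor wins 5% of all units
vendor_revenue = industry_pool * share  # $1B/year for that vendor
fab_capex = 5e9                         # assumed cost of one leading-edge fab

print(f"Total pool: ${industry_pool/1e9:.0f}B/yr")
print(f"One vendor's slice: ${vendor_revenue/1e9:.1f}B/yr")
print(f"One advanced fab: ${fab_capex/1e9:.0f}B - the 14nm crowd needs bigger ASPs")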

2) Subthreshold. The MCU firms all understand ultra-low power, and are quick to point out metrics like µA/MHz and various modes from catnapping to comatose. Super. That’s business as usual, little changed since the 1980s except that the power figures have gotten smaller. The fundamental change that has to happen is subthreshold logic, or something akin to it, that redefines the equation. Companies like Ambiq and PsiKick are out there. Sunrise Micro Devices, incubated by ARM and recently reacquired, is now the technology inside the ARM Cordio radio.
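
To see what the µA/MHz metric actually buys, here’s a rough coin-cell lifetime sketch in Python. The current figures are made up for illustration, not pulled from any datasheet:

# Rough coin-cell lifetime estimate for a duty-cycled MCU.
# All figures are illustrative assumptions, not from any datasheet.

battery_mah   = 225.0    # typical CR2032 capacity
active_ua_mhz = 30.0     # conventional low-power MCU, µA/MHz
freq_mhz      = 10.0
sleep_ua      = 1.0      # deep-sleep current, µA
duty          = 0.01     # active 1% of the time

avg_ua = duty * active_ua_mhz * freq_mhz + (1 - duty) * sleep_ua
years  = battery_mah * 1000.0 / avg_ua / 24 / 365
print(f"Average draw: {avg_ua:.1f} µA -> ~{years:.1f} years")

# Subthreshold-style logic aims to cut the active term several-fold -
# that is what "redefining the equation" means in practice.
avg_ua_sub = duty * (active_ua_mhz / 5) * freq_mhz + (1 - duty) * sleep_ua
print(f"With ~5x lower active current: ~{battery_mah*1000/avg_ua_sub/24/365:.1f} years")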

3) Mixed signal. I cut my teeth making drones fly (we didn’t call them that then; they were RPVs) with a lot of LM148s and Siliconix analog switches way back when. Mixed signal is near and dear to my heart. We integrated mixed signal on MCUs, great. I get to choose from a thousand parts using a parametric search, hoping I can find the exact combination of resolution, channels, and pinouts I need. There is Cypress PSoC, and Triad’s VCA, and the MAX11300 from Maxim Integrated, and not much more in configurable mixed signal. The counterargument is to just put a dedicated IP block in a dedicated SoC design, and that works if you have a few million dollars. When mixed signal gets as easy to create with as CPLDs, we’ll have something.
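
The parametric-search pain is easy to picture as code. A toy sketch with a fabricated parts table shows why exact-match catalogs fail where a configurable device would simply be programmed:

# Toy illustration of parametric search over a fixed catalog.
# The parts table is fabricated; real catalogs run to thousands of entries.

catalog = [
    {"part": "ADC-A", "bits": 12, "channels": 4, "package": "QFN32"},
    {"part": "ADC-B", "bits": 16, "channels": 2, "package": "TSSOP16"},
    {"part": "ADC-C", "bits": 12, "channels": 8, "package": "QFN48"},
]

# The designer's exact requirements - any mismatch means no part.
need = {"bits": 16, "channels": 8, "package": "QFN32"}

hits = [p for p in catalog
        if p["bits"] >= need["bits"]
        and p["channels"] >= need["channels"]
        and p["package"] == need["package"]]

print(hits or "No part fits - a configurable device would just be programmed.")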

4) Optimization. If all the rage in server design is workload-optimized processors, why isn’t that true for the IoT? A lot of the focus on the IoT is on one tier: the edge. But there is so much opportunity to optimize at the gateway and infrastructure levels. Network-on-chip is a big help in making MCU architecture more SoC-like. We need to start looking at IoT traffic not as a bunch of packets, but in thread form, and figure out what makes it go faster. “Meh, IoT is low bandwidth.” I hear that all the time, and for a particular sensor at the edge that may be true – but toss 10,000 sensors together in real time with predictive analytics engaged and tell me how bandwidth looks then. It worked for RISC; workload optimization is needed for IoT parts.
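
Here’s the aggregate arithmetic behind that, sketched in Python with assumed payload sizes and report rates:

# "IoT is low bandwidth" - true per sensor, not in aggregate.
# Payload sizes and report rates are assumptions for illustration.

sensors        = 10_000
payload_bytes  = 64       # one reading plus headers
reports_per_s  = 1.0      # each sensor reports once per second

per_sensor_bps = payload_bytes * 8 * reports_per_s   # 512 bps - trivial
aggregate_bps  = per_sensor_bps * sensors            # what the gateway sees

print(f"Per sensor: {per_sensor_bps:.0f} bps")
print(f"At the gateway: {aggregate_bps/1e6:.1f} Mbps, before any "
      f"predictive-analytics traffic flows back the other way")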

5) Programming. ARM is rallying around its vision, mbed OS, with optimized Cortex-M IP. Check. How about optimizing for Google Brillo? Or maybe something that runs MQTT or DDS better? This may be the biggest opportunity yet: really understanding IoT software. Another change chipmakers need to be aware of: not everything is C, or Java. Those are two of the most popular languages in the world, and C was especially great when we worked with UNIX and programmed hardware down to the bit level. But on the IoT, many other languages are emerging (and yes, some sit on top of C). Coders today are learning in Python – embedded purists need to stop barfing on it for being interpreted. For distributed data analysis there is Lua. For safe, concurrent threads there is Rust, which has just put out its first stable release. It’s a new world, and the C compiler and debugger aren’t the only vehicles anymore, or even the right ones.
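
For flavor, here’s a minimal MQTT publish loop in Python using the open-source Eclipse Paho client (pip install paho-mqtt). The broker host, topic, and payload format are placeholders, not a reference design:

# Minimal MQTT publish loop using the Eclipse Paho client.
# Broker host, topic, and payload fields are placeholders for illustration.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)   # hypothetical broker
client.loop_start()                          # handle network I/O in the background

for i in range(10):
    reading = {"sensor": "node-42", "seq": i, "value": 20.0 + i * 0.1}
    # QoS 1: the broker acknowledges receipt - a protocol detail a Python
    # library makes trivial, and a C-centric MCU toolchain does not.
    client.publish("site/floor1/temp", json.dumps(reading), qos=1)
    time.sleep(1.0)

client.loop_stop()
client.disconnect()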

We’re still working very much on the old chip technology base when it comes to IoT design. When Steve Jobs introduced the iPhone in 2007, he quoted Alan Kay: “People who are really serious about software should make their own hardware.” We saw what Apple did, making its own chips to run its own software better.

Well, the IoT is all about software. It’s time we make chips just for it.
