It used to be that if you went to a processor conference, you could expect to spend hours listening to talks about pipelining, cache schemes and processor architecture. Well, I went to the Linley Processor Conference this week in Santa Clara and found the topics pretty compelling. Processors are in just about everything; it is easier to ask what does not contain a processor. So this conference was, in many ways, about just about everything. Chief among the topics were automotive, mobile, networking, IoT, consumer and enterprise.
The keynote was given by Linley Gwennap, principal analyst and founder of the Linley Group. His talk was titled Processor Technology and Market Trends. He covered general embedded trends, processor IP, networking, IoT and advanced automotive. The presentations over the two days drilled into all of these areas.
In 2014 Intel had the lion's share of the embedded market, largely by leveraging the PC ecosystem for ATMs, signage and other appliance-type applications. Next in market share is Freescale, which has a strong lead in the comms sector. Behind them, each with smaller shares, are Broadcom, Cavium, AMD, LSI, Marvell and AppliedMicro.
If you have been following the news, you already recognize many of these companies from the business pages. There is a wave of consolidation: NXP acquires Freescale; Avago acquires LSI and Broadcom. Look for the ripple effects from these changes.
Consumer, IoT and mobile/wearable applications all demand reductions in cost, power and footprint. The most effective way to accomplish this is through integration, and SoC complexity is increasing as a result. Many of the processor-based SoCs shown during the conference carried numerous other functional blocks on die, and in some cases multiple processors, each targeted at a specific sub-function.
Designing these SoCs requires a deep understanding of application needs so the parts can operate as efficiently as possible. The right processors must be chosen, along with specific IP and the software to drive the whole system. Complexity is rising.
Linley sees the benefits of Moore's Law now accruing only to companies with the money to take advantage of it. Rising mask costs due to double patterning are a major contributor. 28nm is a hinge node: for the first time, moving to a smaller node costs more per transistor, which is forcing cost-sensitive products to stay at 28nm.
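The cost-per-transistor argument can be made concrete with some back-of-the-envelope arithmetic. The wafer costs and densities below are illustrative assumptions I've invented for the sketch, not industry figures:

```python
# Back-of-the-envelope cost-per-transistor comparison across nodes.
# All wafer costs and densities are assumed placeholders, not real data.

def cost_per_mtransistor(wafer_cost, dies_per_wafer, mtransistors_per_die):
    """Dollars per million transistors for a given node."""
    return wafer_cost / dies_per_wafer / mtransistors_per_die

# Assume a same-size die: the smaller node packs ~1.75x the transistors,
# but double patterning pushes the wafer cost up even faster.
c28 = cost_per_mtransistor(wafer_cost=3000, dies_per_wafer=500,
                           mtransistors_per_die=400)
c20 = cost_per_mtransistor(wafer_cost=5500, dies_per_wafer=500,
                           mtransistors_per_die=700)

print(f"28nm: ${c28:.4f}/Mtransistor")
print(f"20nm: ${c20:.4f}/Mtransistor")  # higher, despite the shrink
```

With these made-up numbers the shrink loses: the density gain no longer outruns the wafer-cost increase, which is exactly the hinge Linley described.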
There is real movement in the high-end embedded space to ARMv8. AMD, AppliedMicro, Cavium and Freescale are already shipping ARMv8 cores, and Broadcom and Marvell have parts in development. This is a big push into the 64-bit ARM architecture for applications that need the horsepower.
With smartphones containing two to four chips with CPU IP, mobile drives the most shipments. However, the fastest-growing segment is embedded, up 29% in 2014, most of it MCUs. In 2014, CPU IP shipped in 15.3 billion chips, two-thirds of which were mobile and embedded.
Heterogeneous processors are now commonly combined on one chip, allowing each processor type to be optimized for specific tasks. Listening to music might be handled by a DSP, email by a small, slow CPU, while video is shifted to heavier-duty processors that consume more power. Turning off unneeded processors can dramatically extend battery life. The trend has other drivers as well: physically partitioning tasks also offers greater security.
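The routing logic behind this can be sketched in a few lines: send each task to the lowest-power block that can run it, so everything else can stay powered down. The block names and power figures here are invented for illustration:

```python
# Hypothetical heterogeneous SoC: each block lists its active power and
# the workloads it can handle. All figures are assumed, not measured.
BLOCKS = {
    "dsp":       {"power_mw": 50,   "handles": {"audio"}},
    "small_cpu": {"power_mw": 150,  "handles": {"audio", "email"}},
    "big_cpu":   {"power_mw": 1200, "handles": {"audio", "email", "video"}},
}

def pick_block(task):
    """Route a task to the lowest-power block capable of running it."""
    capable = [(spec["power_mw"], name)
               for name, spec in BLOCKS.items() if task in spec["handles"]]
    if not capable:
        raise ValueError(f"no block can handle {task!r}")
    return min(capable)[1]  # min by power, then return the block name

print(pick_block("audio"))  # dsp handles it; the big CPU stays off
print(pick_block("video"))  # only big_cpu has the horsepower
```

A real scheduler also weighs deadlines, migration cost and thermal limits, but the power win comes from the same idea: the expensive block only wakes for work that needs it.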
Automotive applications were a big topic throughout the conference. As cars add safety and convenience systems, a large need for processors is developing. Notable applications include adaptive cruise control, which maintains a safe following distance to the car ahead, along with drowsiness detection and lane detection, two other safety systems that will require significant processing power. The big prize, of course, is the self-driving car. Linley expects fully autonomous technology to be available as an option adding less than $10K to a car's price by 2022.
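To give a feel for what the adaptive-cruise-control workload computes, here is a toy proportional controller that holds a two-second time gap to the lead car. The gains and vehicle model are invented for this sketch; a real system fuses radar and camera data with far more sophisticated control:

```python
# Toy adaptive cruise control: hold a target time gap to the lead car.
# Gains, limits and dynamics are illustrative assumptions only.

def acc_step(speed, gap, lead_speed, dt=0.1,
             time_gap=2.0, kp_gap=0.2, kp_speed=0.5, max_accel=2.0):
    desired_gap = speed * time_gap        # distance we want at this speed
    gap_error = gap - desired_gap         # positive -> too far back
    closing = lead_speed - speed          # positive -> lead pulling away
    accel = kp_gap * gap_error + kp_speed * closing
    accel = max(-max_accel, min(max_accel, accel))  # comfort/brake limits
    speed = max(0.0, speed + accel * dt)
    gap += (lead_speed - speed) * dt      # relative motion updates the gap
    return speed, gap

# Start too close (40 m) at 30 m/s behind a 30 m/s lead car; the
# controller brakes, opens the gap toward 60 m, then matches speed.
speed, gap = 30.0, 40.0
for _ in range(600):                      # simulate 60 seconds
    speed, gap = acc_step(speed, gap, lead_speed=30.0)
```

Even this toy loop runs at a fixed rate against live sensor data, which is why these systems need dedicated, reliable processing rather than borrowed cycles.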
The talk and the conference showed just how much technology is moving to adapt to our needs. During the conference I realized that the idea I had in my mind of what my first robot would look like was wrong. Of course I imagined something that could 'see' its surroundings through sensory input and respond to them. I also imagined that it might have a neural network and process information much like a human brain. It would be aware of its location and able to move from place to place. What I did not realize is that I would sit inside of it, and that it would probably be a car.