Could “Less than Moore” be better to support Mobile segment explosion?
by Eric Esteve on 02-05-2013 at 4:52 am

If you take a look at the explosion of the mobile segment, linked with the fantastic worldwide adoption of smartphones and media tablets, you clearly see that the SC industry's evolution during at least the next five years will be intimately correlated with the mobile segments. Not really a surprise, but the question I would like to raise is: "Will this explosion of SC revenue in the mobile segment only be supported by applying Moore's law (the race for integration, more and more functionalities in a single chip targeting the most advanced technology nodes), or could we imagine that some mobile sub-segments could be served by less integrated solutions, offering much faster TTM and, in the end, a better business model and more profit?"

At first, let's identify today's market drivers, compared with those of the early 2000s. At that time, the PC segment was still the largest, we were living in a monopolistic situation, and Intel was benefiting from Moore's law, technology node after technology node. To schematize, the law says that you can choose between dividing the price by two (for the same complexity) or doubling the complexity (for the same price), and in both cases increasing the frequency. Intel clearly decided to increase the complexity, keep the same die size... and certainly not decrease the price! TTM was not really an issue, for two reasons: Intel was (and still is) the technology leader, always the first to support the most advanced node, and the company was in a quasi-monopolistic situation.
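
As a rough illustration of that choice, here is a tiny Python sketch of the idealized per-node trade-off described above: the same complexity gets roughly twice as cheap, or the same budget buys roughly twice the complexity. The starting die cost, transistor count, and node names are invented for the example, not real foundry figures.

```python
# Hypothetical sketch of the per-node trade-off: at each new node, either the
# same transistor count gets ~2x cheaper, or the same budget buys ~2x more
# transistors. All figures below are illustrative, not real foundry data.

def next_node(die_cost, transistors, strategy="more_complexity"):
    """Apply one idealized Moore's-law shrink to a (cost, complexity) pair."""
    if strategy == "same_complexity":
        return die_cost / 2, transistors       # halve the cost, same chip
    if strategy == "more_complexity":
        return die_cost, transistors * 2       # same cost, double the chip
    raise ValueError(strategy)

cost, xtors = 20.0, 100e6   # assumed starting point: $20 die, 100M transistors
for node in ["40nm", "28nm", "20nm"]:
    cost, xtors = next_node(cost, xtors, strategy="more_complexity")
    print(f"{node}: ~${cost:.0f} die, ~{xtors / 1e6:.0f}M transistors")
```

The "more_complexity" path is the one the paragraph attributes to Intel: same die size, more functionality, and no price decrease.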

The explosion of mobile has changed the deal: performance is still a key factor, but the most important one is power consumption (say, MIPS per Watt). Price has become a much more important factor, even if performance is still key, on such a competitive market, with more than ten companies addressing the Application Processor market. And TTM is also becoming a key factor in such a market.
To summarize, we are moving from a PC market where performance (and, to some extent, TTM) was the key factor to a mobile market where MIPS per Watt, price, and TTM are the keys. Unfortunately, I don't think that following Moore's law in a straight line can efficiently address these three parameters...
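
To make the shift in figure of merit concrete, here is a minimal sketch comparing two invented chips: ranked by raw MIPS (the PC-era criterion) and by MIPS per Watt (the mobile criterion), they come out in a different order. The chip names and numbers are assumptions for illustration only.

```python
# Two hypothetical application processors; all numbers are invented.
chips = {
    "fast_but_hot": {"mips": 20000, "watts": 4.0, "price": 25},
    "mobile_tuned": {"mips": 12000, "watts": 1.2, "price": 18},
}

# PC-era ranking: raw performance.
best_mips = max(chips, key=lambda n: chips[n]["mips"])
# Mobile ranking: performance per Watt.
best_efficiency = max(chips, key=lambda n: chips[n]["mips"] / chips[n]["watts"])

for name, c in chips.items():
    print(f"{name}: {c['mips'] / c['watts']:.0f} MIPS/W at ${c['price']}")
print(f"Best raw MIPS: {best_mips}, best MIPS/W: {best_efficiency}")
```

The point is simply that the metric you optimize for decides the winner, which is why the mobile market rewards different design choices than the PC market did.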

  • Leakage current is becoming such an important issue, as highlighted in this article from Daniel Payne, that moving to a more advanced node will help increase CPU/logic performance but may, in the end, decrease power efficiency (MIPS per Watt)! This has forced design teams to use power management techniques, but the induced complexity has a great impact on IC development and validation lead-time...
  • Price: we talk about the IC Average Selling Price (ASP), but chip makers think in terms of "cost" first, then ASP when they sell the IC. There are two main factors affecting this cost: IC development cost (design resources, EDA, IP budget, masks, validation...) and chip fabrication cost in production (wafer, test, packaging). If you target an IC ASP in the $10 to $20 range, like for example an Application Processor for a smartphone, you quickly realize that, if your development cost is in the $80 million range (for a chip in 20nm), you must sell... quite a lot of chips! More likely, the break-even point is around 40 or 50 million chips sold (see the short break-even sketch after this list)!
  • Time-To-Market (TTM): once again, we are discovering that, for each new technology node, the time between design start and release to production (RTP, when you start to get a return on your investment) is getting longer and longer. It would take a complete article to list all the reasons, from the engineering gap to longer validation lead-times, by way of increased wafer fab lead-times, but you can trust me: strictly following Moore's law directly leads to a longer overall lead-time to RTP!
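
To make the break-even arithmetic in the "Price" bullet explicit, here is a small sketch using the article's $80 million development cost and $10 to $20 ASP. The per-unit gross margin left to repay that investment (after wafer, test, and packaging costs) is my assumption, chosen so the result lands in the 40 to 50 million unit range the article quotes.

```python
# Rough break-even sketch for the numbers quoted above.
development_cost = 80e6                # from the article: ~$80M for a 20nm AP
asp_range = (10, 20)                   # from the article: $10-20 ASP
assumed_margin_per_unit = (1.6, 2.0)   # assumption: margin available to repay the NRE

for margin in assumed_margin_per_unit:
    units = development_cost / margin
    print(f"At ${margin:.2f}/unit margin: break-even at ~{units / 1e6:.0f}M chips")
# -> roughly 40 to 50 million units, consistent with the article's estimate.
```

Note that dividing $80M by the full ASP would suggest only 4 to 8 million units; the 40 to 50 million figure only makes sense because most of the ASP is consumed by fabrication cost, leaving a thin margin to amortize development.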

Does that mean that Qualcomm, for example, is wrong when proposing the above three-step road-map, ending with a single-chip solution? Certainly not... but they are Qualcomm, the emerging leader in the mobile segment (and for a long time to come, in my opinion), offering the best technical solution, with a TTM advantage. But, if you remember, more than two billion systems will ship in the mobile segment by 2016, which means about 20 billion ICs... We can guess that Qualcomm will NOT serve all the mobile sub-segments, leaving the competition some good pieces of business to enjoy! This article is addressed to Qualcomm's (and Samsung's, or even Apple's) followers: "Less than Moore" could be a good strategy too! I realize it will take another post to describe the possible strategies linked to "Less than Moore", so please be patient, I will release this in a later article...

From Eric Esteve from IPNEST
