Why is Intel going inside Altera for Servers?
by Eric Esteve on 06-02-2015 at 12:30 pm

If you expect ever more power to be consumed in datacenters, you should be happy to hear that Intel will buy FPGA challenger Altera! In 2013, the power consumed by server and storage ICs, plus the electricity spent cooling these high-performance chips, reached 91 BILLION kWh (the equivalent of 34 500-MW power plants, or a $9.1 billion electricity bill). Could we see this power consumption stabilize or even decrease in the near future, thanks to Moore's law or anything else? No way! First, because the amount of data exchanged (and stored) in the cloud is growing by 60% per year. This is the natural consequence of the smartphone explosion and of the evolution of our everyday behavior: we want to capture images and sound (and store them), share them through the cloud, watch TV, series or movies on the move, and so on.
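
The plant equivalence is easy to sanity-check. Here is a minimal sketch in Python, where the ~61% capacity factor is my assumption (typical for a coal-fired plant), not a figure from the article:

# Sanity check of the 2013 datacenter power figures quoted above.
annual_kwh = 91e9              # total consumption, kWh/year
plant_mw = 500                 # nameplate capacity of one plant, MW
capacity_factor = 0.61         # assumed average utilization (not from the article)

kwh_per_plant = plant_mw * 1000 * 8760 * capacity_factor
print(f"Equivalent 500 MW plants: {annual_kwh / kwh_per_plant:.0f}")  # ~34

price_per_kwh = 9.1e9 / annual_kwh     # electricity price implied by the $9.1B bill
print(f"Implied price: ${price_per_kwh:.2f}/kWh")                     # ~$0.10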

Why link the Intel/Altera deal with the increase of power consumption in datacenters? As a matter of fact, Intel has the lion's share of datacenter servers, based on the x86 architecture. We know that this CISC architecture was initially designed for performance, at a time (the 1990s) when the need for compute power in PCs and servers was crucial. In the datacenter, the need for ever higher compute power comes along with the need for a certain level of flexibility, in order to quickly adapt an installed system to protocol evolutions or new features. On top of this need for flexibility, it has been shown that the x86 architecture is not well tailored to run search engine algorithms. On x86, the only option is to write software. Designers have tried to improve efficiency by using GPUs: better than x86, but not optimal. A team at Microsoft used FPGAs instead and reported a 95% improvement compared with x86. Not surprising, as an FPGA design offers much better flexibility than x86.
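
To make that 95% figure concrete: nearly doubled throughput per server means roughly half the machines for the same query load. A back-of-the-envelope sketch, where the 10,000-server fleet is purely illustrative:

# Illustrative only: the fleet size is an invented round number;
# the 1.95x speedup is the "95% improvement" reported vs x86.
baseline_servers = 10_000
speedup = 1.95

needed = baseline_servers / speedup
print(f"Servers for equal throughput: {needed:.0f}")            # ~5128
print(f"Fleet reduction: {1 - needed / baseline_servers:.0%}")  # ~49%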

What was surprising was the way Wall Street reacted to this news. Some people who clearly understand nothing about high tech, in particular about the differences between software design, FPGA development and ASIC technology, thought that FPGAs were the panacea for search engine algorithm development in datacenters. Not only did they think it, they wrote it (search for: "What Intel's Buyout Of Altera Means For The FPGA Industry", a superb example of writing about a topic the author absolutely does not understand). And Wall Street decided that Intel should buy Altera to create a synergy, shipping $2,000 FPGAs consuming 50 to 100 W alongside their $500 server chips!

The problem is simple: the same algorithm running on a $2,000 FPGA (consuming several dozen watts) will run, probably faster, on a $20 ASIC consuming 5 to 10 W! I agree that Intel would be happier selling a $2,000 part than a $20 ASIC, but is that enough to build a strategy? By the way, if you don't trust me, just think of the secure networking chips designed by Broadcom (Netlogic), able to screen networking frames on the fly and detect viruses, with the virus database updated daily (thanks to flash memory).
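
To put numbers on that comparison, here is a minimal per-socket sketch, using the mid-range of the figures above, the ~$0.10/kWh implied by the $9.1B / 91B kWh numbers earlier, and an assumed three-year service life:

# Per-socket FPGA vs ASIC lifetime cost, from the figures in this article.
# The three-year continuous service life is an assumption on my part.
price_kwh, hours = 0.10, 3 * 8760

def lifetime_cost(part_price, watts):
    # purchase price plus electricity over the full service life
    return part_price + watts / 1000 * hours * price_kwh

fpga = lifetime_cost(2000, 75)   # mid-range of the 50-100 W quoted above
asic = lifetime_cost(20, 7.5)    # mid-range of the 5-10 W quoted above
print(f"FPGA: ${fpga:.0f}, ASIC: ${asic:.0f}")   # ~$2197 vs ~$40 per socket
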
So, is buying Altera a good deal for Intel? Altera is one of the top 20 semiconductor vendors, sells certain new products at 80% gross margin, and enjoys a strong customer base in Networking, Industrial, Automotive, Consumer and more, so Intel will most probably get a benefit from this investment (when interest rates are close to zero, almost any acquisition is more valuable than leaving the money in the bank!). Should Intel/Altera develop a synergistic solution for the datacenter? Not only do I not think so... I hope not, at least for the planet!

From Eric Esteve, IPNEST
