Microsoft CTO says he wants to swap most AMD and Nvidia GPUs for homemade chips

Intel’s NPU is not a standalone product; rather, it is a component of Intel’s Core Ultra processors, designed to support Intel's AI PC offerings. Intel has high hopes for AI PCs, but the market has not yet shown strong demand.
The good thing about Movidius was that you didn't need a high-performance computer to run AI applications: many people used a Raspberry Pi with Movidius, or simply any laptop or computer with a USB port. By pushing very expensive computers with a CPU capable of doing what a $78 neural stick could, Intel basically shut out the little guy doing AI. Restricting lots of people trying out ideas on their own at little expense was, I think, a bad idea.
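For context on how lightweight that setup was: the stick plugged into any USB port and was driven from Python through Intel's NCSDK or, later, the OpenVINO runtime. Here is a minimal, hedged sketch of detecting the stick; "MYRIAD" is OpenVINO's device id for the Myriad-based sticks, and the model filename is a placeholder, not a real file:

```python
# Hedged sketch: detecting a Movidius Neural Compute Stick from a
# Raspberry Pi (or any computer with a USB port) via the OpenVINO runtime.
try:
    from openvino import Core  # OpenVINO Python API (older releases: openvino.runtime)
except ImportError:
    Core = None

if Core is not None:
    core = Core()
    devices = core.available_devices  # e.g. ['CPU', 'MYRIAD'] with a stick plugged in
    status = f"OpenVINO devices found: {devices}"
    if "MYRIAD" in devices:
        # compiled = core.compile_model("some_model.xml", "MYRIAD")  # placeholder model
        pass
else:
    status = "OpenVINO not installed (pip install openvino)"
print(status)
```

The same few lines run unchanged on a Raspberry Pi or a laptop, which is the point the post makes: no expensive AI PC required.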
 
I visited Microsoft many years ago as a hardware engineer and actually talked with a chip designer in his office. Microsoft (like other big companies I had experience with) has a closed office culture, with the cold atmosphere of a government agency. I declined to work there because I sensed that Microsoft had no business developing chips and barely got by on hardware products. Microsoft had the tablet concept many years before, but effectively gave it away to Apple because it was not capable of producing something to sell; Apple produced the tablet form of Microsoft's idea, and Microsoft only shipped the Surface many years later.

Microsoft has exploited Intel for decades for its own development. I remember, while working at Intel, a weekly meeting in which the logo at the top of the slides read: "Intel is in Microsoft jail". On AI, too, Microsoft tried to squeeze as much cooperation as it could out of Intel in graphics accelerator development, with the same strategy of keeping Intel trapped in its Microsoft jail. In my opinion, having worked in the chip development domain, Microsoft should stick with software, unless it wants to repeat what Intel did when it wasted billions of dollars trying to develop graphics processors.
Your view of Microsoft chip development is years out of date. The current group is managed by Rani Borkar, who was previously a chip development CVP at Intel. I know she took a bunch of people with her when she moved to Microsoft. Rani was just promoted to President, Azure Systems & Infrastructure, as Nadella narrows his scope to focus on AI technical innovation. I think it's a different world than when you interacted with Microsoft.
I worked in the Graphics Accelerator (Movidius & Myriad X) group at Intel (part of AIPG, the Artificial Intelligence Products Group), and the product was very successful; demand was beyond any expectation, so the design was expanded to other uses (e.g., datacenter applications).
Which datacenter applications?
 
I don't know, but I remember being told at the time that they started building boards with 8 Movidius chips for servers, and that they were testing the idea of building a datacenter at the Jones Farm campus. I put together a whole bunch of Raspberry Pis for some sales presentations, but in the end they decided to discontinue Movidius and make a better Myriad X. I could attach some pictures from my work on Movidius and Myriad X for the curious, but apparently LinkedIn complained about the file size, so here is only one showing Movidius side by side with a Myriad X board (first version).
 

Attachments

  • Movidius_Myriadx.jpg (10.4 KB)
How do these Microsoft, Google, Amazon, and Meta ASICs get done? Do they do the architecture/microarchitecture/front end in-house, with the Broadcoms and Marvells providing the IP and back-end services? Or do they (Broadcom/Marvell) get to participate further upstream?
 
How do these Microsoft, Google, Amazon, and Meta ASICs get done? Do they do the architecture/microarchitecture/front end in-house, with the Broadcoms and Marvells providing the IP and back-end services?
Exactly. The back-end shop is also chosen for the IP it has, e.g., for networking and PHYs.
Or do they (Broadcom/Marvell) get to participate further upstream?
I suspect there are blocks they contribute, or parts of the design they participate in. I doubt it for AWS, though. In fact, I have suspected AWS (Annapurna) may have back-end capabilities too, but I don't know for sure.
 

Google started with Avago but they now do their chips in-house. Meta uses ASIC companies, Arm is doing a chip with them now. Amazon does their own chips through what was Annapurna Labs. This is pretty well known inside the ecosystem. This is the same way Apple did it. They used an ASIC service then acquired the capabilities in-house. Some of these companies use commercial IP but bring that in-house as well.

The big benefit of doing your own chip is that you can bring up software during the design process using emulation and prototyping. Apple is a huge emulator customer, as are Nvidia, Intel, AMD, and whoever else has a big software stack.

An ASIC company can do the whole thing or parts. The problem with being an ASIC company is that you are literally training your replacement during the design process.
 