Elon Musk Predicts xAI Alone Will Buy ‘Billions’ of AI Chips Costing As Much as $25 Trillion, With 50 Million Chips Coming Within ‘5 Years’

Daniel Nenni (Admin, Staff member)
© Provided by Barchart

Elon Musk argues that AI progress will be determined chiefly by raw compute capacity and the power to run it. He frames the next five years as a march toward roughly 50 million “H100-equivalent” accelerators, a normalized yardstick for aggregate compute across evolving chips and vendors, with a longer-term path to billions as AI permeates devices and industry.

That scale presumes synchronized growth in fast interconnects, advanced packaging, high-bandwidth memory, cooling, and dependable grid power, because training and inference workloads for LLMs, autonomy, and robotics continue to grow in size and scope. Using today's pricing ($25k–$40k per H100), a literal 50-million-chip buildout implies roughly $1.25–$2 trillion in hardware outlays over five years, though real spend should be lower as future chips deliver more performance per dollar.
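A back-of-the-envelope check of that range, assuming only the article's own figures (50 million chips, $25k–$40k per unit); the short Python sketch below is illustrative, not a forecast:

# Rough cost of a 50-million-chip buildout at today's H100 pricing.
# Chip count and unit prices are the figures quoted above.
CHIPS = 50_000_000
PRICE_LOW, PRICE_HIGH = 25_000, 40_000   # USD per H100-class accelerator

low = CHIPS * PRICE_LOW                  # ~ $1.25 trillion
high = CHIPS * PRICE_HIGH                # ~ $2.0 trillion
print(f"Five-year hardware outlay: ${low/1e12:.2f}T to ${high/1e12:.2f}T")

# The headline's $25 trillion corresponds to the longer-term
# "billions of chips" scenario: 1e9 chips * $25k ≈ $25T.
print(f"1B chips at $25k each: ${1_000_000_000 * 25_000 / 1e12:.0f}T")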

Musk’s view reflects his operations across xAI, Tesla, and SpaceX, where chip procurement, datacenter buildouts, and power constraints are lived realities. The market implications favor semiconductor, memory, networking, packaging, datacenter/REIT, and utility players, while bottlenecks in fabrication, components, or electricity, plus policy choices like export controls and manufacturing incentives, could stretch timelines and returns. Bottom line: AI leadership will track who can assemble, at scale, the compute and power to run it.

 
So, factoring in the dissociative "medication" behind this latest delusion, xAI will buy a couple hundred thousand chips worth some double-digit billions in revenue.

Cool. Next?
 