
OpenAI would have to spend over $1 trillion to deliver its promised computing power. It may not have the cash.

Daniel Nenni

Admin
Staff member
[Image: OpenAI CEO Sam Altman]

OpenAI (OPAI.PVT) would have to spend more than $1 trillion within the next five years to deliver the massive amount of computing power it has promised to deploy through partnerships with chipmakers Nvidia (NVDA), Broadcom (AVGO), and Advanced Micro Devices (AMD), according to Citi analysts.

OpenAI's latest deals with the three companies include an ambitious promise to deliver 26 gigawatts worth of computing capacity using their chips, which is nearly the amount of power required to provide electricity to the entire state of New York during peak summer demand.

Citi estimates that it takes $50 billion in spending on computing hardware, energy infrastructure, and data center construction to bring one gigawatt of compute capacity online.

Using that assumption, Citi analyst Chris Danely said in a note to clients this week that OpenAI's capital expenditures would hit $1.3 trillion by 2030.

OpenAI CEO Sam Altman has reportedly floated bolder promises internally. The Information reported in late September that the executive has suggested the company is looking to deploy 250 gigawatts of computing capacity by 2033, implying a cost of $12.5 trillion.

But there's no guarantee that OpenAI will have the capital to support the costs required to achieve its goals. While OpenAI's costs are set to soar to more than $1 trillion, Citi estimates the company's revenue will climb to a fraction of that figure — $163 billion — by 2030.
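
For readers who want to check those figures, here is a minimal sketch of the arithmetic, assuming Citi's flat $50 billion per gigawatt estimate and its $163 billion 2030 revenue figure from the article; the constant names and everything else are illustrative:

```python
# Back-of-the-envelope check on the Citi figures cited above (illustrative only).
COST_PER_GW_B = 50        # Citi: ~$50B of capex per gigawatt of compute
REVENUE_2030_B = 163      # Citi's 2030 revenue estimate for OpenAI, in $B

def capex_billions(gigawatts: float) -> float:
    """Implied capital expenditure in billions of dollars."""
    return gigawatts * COST_PER_GW_B

print(capex_billions(26))                   # 1300  -> ~$1.3 trillion by 2030
print(capex_billions(250))                  # 12500 -> ~$12.5 trillion by 2033
print(REVENUE_2030_B / capex_billions(26))  # ~0.13 -> revenue covers ~13% of the capex
```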

That disconnect has added to Wall Street concerns over a stock market bubble. Stocks have soared to new records this year largely on investor optimism over artificial intelligence.

OpenAI had already made big commitments to the global AI build-out ahead of its latest deals with chipmakers. The company in September announced a $300 billion deal with Oracle (ORCL) as part of its 10-gigawatt US AI infrastructure project called Stargate. OpenAI has unveiled additional Stargate infrastructure projects abroad in partnership with Nvidia in the United Arab Emirates and Norway. The company also committed $22 billion to purchasing data center capacity from Nvidia-backed AI data center provider CoreWeave (CRWV).

The tangled web of investments among the leading industry players has led to concerns that AI demand could be overstated.

"[OpenAI CEO Sam Altman] has the power to crash the global economy for a decade or take us all to the promised land, and right now we don't know which is in the cards," Bernstein analyst Stacy Rasgon wrote in an Oct. 6 note.

Adding to funding concerns, it's unclear whether US power infrastructure can scale up in time to meet the energy demands of the latest AI projects, which would prevent OpenAI from cashing in on its spending.

If, however, OpenAI meets its goals, chipmakers could see huge gains. Nvidia could see as much as $500 billion in revenue from OpenAI if the total deal amount is fulfilled, according to Bank of America analyst Vivek Arya. And Broadcom could see more than $100 billion in revenue from its own deal with the ChatGPT developer, Bernstein's Rasgon estimated.

 
"OpenAI's latest deals with the three companies include an ambitious promise to deliver 26 gigawatts worth of computing capacity using their chips"

Is the gigawatt the new unit of computing capacity then? Leaving aside the dimensional-analysis fail here, I'm curious exactly who dreamed up this new metric, as it seems almost the opposite of what we should be optimising for. For raw performance, surely some sort of TFLOPS-type measurement applies. But don't we really want to keep our eyes on performance/watt and look for some sort of Moore's Law-type effect where we can scale up performance in the same power envelope?
 
Chips are getting more power-efficient, but it's something like 15% better performance per watt every two years. Given that, you can't really rely much on efficiency to get you the amount of compute required.
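
Taking that rough figure at face value, a quick sketch of how little a ~15%-per-two-years efficiency gain compounds over a five-year build-out window (all numbers illustrative):

```python
# Compound a ~15% performance-per-watt gain every two years over ~5 years
# (the commenter's rough figure; purely illustrative).
GAIN_PER_STEP = 0.15      # assumed perf/watt improvement per two-year step
YEARS = 5                 # roughly the 2025-2030 deployment window

improvement = (1 + GAIN_PER_STEP) ** (YEARS / 2)
print(f"~{improvement:.2f}x perf/watt over {YEARS} years")  # ~1.42x
# Efficiency alone delivers well under 2x in that window, which is why the
# build-out still needs tens of gigawatts of new capacity rather than relying
# on perf/watt scaling.
```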

People are talking about power as the metric for datacenters since that's become the main constraint. It's no longer a matter of getting the chips - there is a four-year (and growing) wait to get a large load connected to the grid. Datacenters are building their own power plants to get around that, but power is still the major constraint. This is why you have seen such a rise in the stock of Bloom Energy, since they offer what's probably the best solution for getting power to a new datacenter quickly - even if it's a bit more expensive (at least today).

I think these data centers will likely end up getting plopped directly on gas fields in Ohio and Texas and powered by fuel cells. There is basically no other way to get the power required at the speed required.
 
"OpenAI's latest deals with the three companies include an ambitious promise to deliver 26 gigawatts worth of computing capacity using their chips"

Is the gigawatt the new unit of computing capacity then? Leaving aside the dimensional-analysis fail here, I'm curious exactly who dreamed up this new metric, as it seems almost the opposite of what we should be optimising for. For raw performance, surely some sort of TFLOPS-type measurement applies. But don't we really want to keep our eyes on performance/watt and look for some sort of Moore's Law-type effect where we can scale up performance in the same power envelope?

I agree, but I do like the gigawatt as a measure of how much power these datacenters will require. I'm not confident we have done the math on how much electricity we will need to power all of these "announced" AI datacenters.
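
As a rough sanity check on that math (assumptions are mine: the article's 26 GW figure, near-continuous operation, and roughly 4,000 TWh of total annual US electricity consumption):

```python
# Rough energy math for the 26 GW figure in the article (illustrative assumptions).
GW = 26                   # promised compute capacity from the article
HOURS_PER_YEAR = 8760
US_ANNUAL_TWH = 4000      # assumed total annual US electricity consumption

annual_twh = GW * HOURS_PER_YEAR / 1000   # GW * h = GWh; /1000 -> TWh
share = annual_twh / US_ANNUAL_TWH
print(f"~{annual_twh:.0f} TWh/year, ~{share:.0%} of assumed US consumption")
# ~228 TWh/year, roughly 6% of US electricity use if run flat out.
```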

As it is, in Northern California we are told not to use our appliances during peak power consumption periods. We also have water issues. There are also massive high-density housing growth initiatives which will make things worse.

Exciting times.........
 