[External] Analysis: AI will require $2T in annual revenue to support $500B in planned CapEx

Xebec

Well-known member
Note: I am not familiar with this firm, but I thought its analysis of future revenue shortfalls for AI companies, and the continued impact on the semiconductor supply chain, would be an interesting topic for this forum.

The article implies that significant breakthroughs in AI energy efficiency will also be required to support currently projected demand and the planned CapEx in this space.


Take-aways:
  - AI’s computational needs are growing more than twice as fast as Moore’s law, pushing toward 100 gigawatts of new demand in the US by 2030.
  - Meeting this demand could require $500 billion in annual spending on new data centers.
The economics become unaffordable. Bain’s research suggests that building the data centers with the computing power needed to meet that anticipated demand would require about $500 billion of capital investment each year, a staggering sum that far exceeds any anticipated or imagined government subsidies. This suggests that the private sector would need to generate enough new revenue to fund the power upgrade. How much is that? Bain’s analysis of sustainable ratios of capex to revenue for cloud service providers suggests that $500 billion of annual capex corresponds to $2 trillion in annual revenue.

What could fund this $2 trillion every year? If companies shifted all of their on-premise IT budgets to cloud and also reinvested the savings anticipated from applying AI in sales, marketing, customer support, and R&D (estimated at about 20% of those budgets) into capital spending on new data centers, the amount would still fall $800 billion short of the revenue needed to fund the full investment (see Figure 2).
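The ratio arithmetic above is easy to sanity-check. A minimal sketch, using the article's own dollar figures; the 25% capex-to-revenue ratio is implied by $500B / $2T rather than stated directly:

```python
# Sanity check of Bain's capex-to-revenue arithmetic, using the
# figures quoted in the article above.
capex = 500e9            # planned annual data-center capex ($500B)
capex_to_revenue = 0.25  # sustainable ratio implied by $500B / $2T

required_revenue = capex / capex_to_revenue
print(f"Required annual revenue: ${required_revenue / 1e12:.1f}T")

# The article estimates that shifting on-prem IT budgets to cloud and
# reinvesting ~20% of AI savings still leaves an $800B gap.
shortfall = 800e9
identified_funding = required_revenue - shortfall
print(f"Funding identified: ${identified_funding / 1e9:.0f}B")
print(f"Shortfall: ${shortfall / 1e9:.0f}B")
```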


[Image: Figure 2, the funding-gap chart referenced above]
 
I love how Bain just assumes 20% of the budget for personnel in sales, marketing, customer support, and R&D can be forked over as AI budget. Not only is head count gone, but the savings--the whole point of AI--are entirely consumed in the bonfire of 4.5x cost scaling up per year. I guess I can make the easiest prediction ever--that won't happen.

This makes China appear to be on the right path, at least with DeepSeek providing a hardware-light path forward.
 
Agree re: DeepSeek

I also get the sense that ChatGPT (and probably others) is focused mainly on token cost/efficiency with recent releases.

This is only anecdotal, but I've seen a lot of different places sharing this experience -- GPT-5 seems to be only slightly better than GPT-4 overall, but it also seems to vary a lot in 'smartness' from day to day.

I suspect OpenAI has more knobs to tune on the back end to keep GPT working "at speed" regardless of demand; i.e., under heavier demand it becomes "dumber" so it can respond just as quickly as it does under lighter demand.
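That hypothesis can be sketched as load-dependent model tiering. Everything here is hypothetical -- the tier names, speeds, and quality scores are made up for illustration, and nothing is known about OpenAI's actual serving stack:

```python
# Hypothetical sketch of the "knob" described above: a serving stack
# that trades answer quality for speed as load rises, so latency stays
# roughly constant. All numbers and tier names are invented.
from dataclasses import dataclass

@dataclass
class ModelTier:
    name: str
    relative_quality: float  # higher = "smarter"
    tokens_per_sec: float    # per-request decode speed at light load

# Assumed tiers, ordered from smartest/slowest to dumbest/fastest.
TIERS = [
    ModelTier("full", 1.00, 40.0),
    ModelTier("distilled", 0.85, 120.0),
    ModelTier("mini", 0.70, 300.0),
]

def pick_tier(load_factor: float, target_tps: float = 40.0) -> ModelTier:
    """Pick the smartest tier whose effective speed under the current
    load still meets the latency target. load_factor is demand relative
    to capacity (1.0 = fully loaded)."""
    for tier in TIERS:
        effective_tps = tier.tokens_per_sec / max(load_factor, 1.0)
        if effective_tps >= target_tps:
            return tier
    return TIERS[-1]  # under extreme load, fall back to the fastest tier

print(pick_tier(0.5).name)   # light load: smartest tier
print(pick_tier(3.0).name)   # heavy load: middle tier
print(pick_tier(10.0).name)  # extreme load: fastest tier
```

Under this toy policy, the same question asked at different times of day would be routed to different tiers, which would produce exactly the day-to-day variation in "smartness" described above.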
 

Bain & Company is one of the top three management consulting firms. Its founder, Bill Bain, was a Boston Consulting Group (BCG) consultant assigned to Texas Instruments in the early 1970s, where he promoted the Learning Curve Theory.

At TI, Bill Bain's main point of contact was a young vice president, Morris Chang. Bill Bain left BCG in 1973 to start his own consulting firm, Bain & Company. Morris Chang applied the Learning Curve Theory at TI’s semiconductor division and, eventually, at TSMC, the company he went on to found.
 
GPUs will get replaced with dedicated AI accelerators, and inference workloads will be the first to move. You can already see Google doing this, but the trend will accelerate.

It is normal for application specific accelerators to be an order of magnitude more efficient than general purpose hardware.
 