Fusion needed for AI

milesgehm

Active member
Sam Altman saying stuff

Altman himself has invested hundreds of millions in fusion and in recent interviews has suggested the futuristic technology, widely seen as the holy grail of clean energy, will eventually provide the enormous amounts of power demanded by next-gen AI.

“There’s no way to get there without a breakthrough, we need fusion,” alongside scaling up other renewable energy sources, Altman said in a January interview. Then in March, when podcaster and computer scientist Lex Fridman asked how to solve AI’s “energy puzzle,” Altman again pointed to fusion.

 
Sam Altman has invested hundreds of millions of dollars in nuclear fusion? And it is likely decades away? Is this a ChatGPT-generated article?

"Altman himself has invested hundreds of millions in fusion and in recent interviews has suggested the futuristic technology, widely seen as the holy grail of clean energy, will eventually provide the enormous amounts of power demanded by next-gen AI."

"Nuclear fusion — the process that powers the sun and other stars — is likely still decades away from being mastered and commercialized on Earth. For some experts, Altman’s emphasis on a future energy breakthrough is illustrative of a wider failure of the AI industry to answer the question of how they are going to satiate AI’s soaring energy needs in the near-term."
 

Kyle Corbitt went on to ask, "Why not concentrate the training clusters in the same area?" The Microsoft engineer replied that they had tried, but they could not place more than 100,000 NVIDIA H100 GPUs in a single state without paralyzing the power grid.
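For a rough sense of the scale behind that claim, here is a back-of-envelope Python sketch of the power such a cluster could draw. The per-GPU wattage, node overhead, and PUE figures are assumptions for illustration, not numbers from the thread or the article.

# Rough estimate of the power draw implied by the 100,000-GPU figure.
# Assumptions (not from the thread): ~700 W per NVIDIA H100 SXM module,
# ~50% extra per node for CPUs, networking, and storage, and a PUE of 1.2.

GPU_COUNT = 100_000      # figure quoted above
GPU_TDP_W = 700          # approximate H100 SXM board power (assumption)
NODE_OVERHEAD = 1.5      # CPUs, NICs, storage, fans (assumption)
PUE = 1.2                # power usage effectiveness (assumption)

it_load_mw = GPU_COUNT * GPU_TDP_W * NODE_OVERHEAD / 1e6
facility_load_mw = it_load_mw * PUE

print(f"IT load:       {it_load_mw:.0f} MW")        # ~105 MW
print(f"Facility load: {facility_load_mw:.0f} MW")  # ~126 MW

On the order of 100+ MW of continuous load is comparable to a small city, which is why concentrating it all on one regional grid is hard.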
 