Over the past year, most discussions around AI infrastructure have focused on GPUs, advanced packaging, and semiconductor supply chains. Yet growing evidence suggests the real bottleneck lies elsewhere: power infrastructure.
Specifically, ultra-high-voltage transformers. Lead times have stretched to three or four years, upstream components remain in chronic shortage, and even billion-dollar projects cannot easily accelerate delivery. Taiwanese manufacturer Fortune Electric has quietly become a key supplier to some of the largest AI data center projects in the United States, including Stargate, backed by OpenAI, SoftBank, and Oracle.
From Elon Musk’s xAI to Mark Zuckerberg’s push for “personal superintelligence,” the future of AI may depend less on who controls the most GPUs—and more on who can secure enough electricity to keep them running.
Not chips—transformers are the Taiwan-made components AI data centers need most. Even Musk and Zuckerberg can’t get around them.
Liang-rong Chen, cwnewsroom.substack.com
