
TSMC’s Next-Gen CoWoS Hits Like a Death Note for Delta and Infineon

karin623

New member
Bernstein Securities predicts that, with exploding demand for the CoWoS used in AI chips, TSMC’s advanced packaging revenue will make up 10% of its total this year, surpassing ASE (Advanced Semiconductor Engineering) to make TSMC the world’s largest packaging provider.

Last month’s TSMC 2025 Technology Symposium unveiled the next evolution of CoWoS technology for the first time, one that will set off a new technological revolution and intense cross-industry competition.

Dr. Yi-Jen Chan, former director of ITRI’s Electronic and Optoelectronic System Research Laboratories and CTO of Cyntec, issued a warning in his personal column: TSMC is already a “one-man show” in advanced processes, and “in the near future, it may also stand alone in advanced packaging.”

In other words, by leveraging its monopoly on CoWoS technology, which is essential for all AI chips, TSMC can absorb ever more functions into the package, such as the integrated voltage regulator.

To power module suppliers such as Delta and Infineon, the “Future CoWoS” slide might as well have been a Death Note, a chilling sign that their standalone products are being written out of existence by integration into CoWoS.

“We hope TSMC leaves us some business,” a power industry insider said helplessly. “If you do everything, what's left for us?”

My view is that the customer’s “Why?” and the first usage are more interesting than the other effects on the industry. That’s a place where Intel might have the chops to shine as well. It’s all about super-integration of AI data-center racks.

Three Challenges of the Integrated Voltage Regulator (IVR)

The IVR comprises three parts: an Nvidia-designed power management IC fabricated on TSMC’s 16nm process, combined with ultra-thin capacitors and inductors. Embedding these into a silicon interposer only 100 micrometers thick is “super challenging” technically, according to the aforementioned power industry insider. He noted that even Nvidia’s own engineers are not fully confident in its feasibility. It is currently expected to debut with Nvidia’s next-generation GPU architecture, “Feynman,” slated for release in 2028.



That said, Jensen Huang is pushing the entire supply chain forward to launch a super-efficient AI computer that can treat 72 GPUs as one computational unit. The industry may be breathless, but it’s understandable, as the AI race, driven by giants like OpenAI and Google, generates an insatiable demand for computing power.



He stated that Nvidia’s next-generation AI server racks will house 576 GPUs, with a single rack's power consumption reaching 1MW. The power demand of a single AI data center could reach 1GW, rivaling the generation capacity of an entire nuclear reactor.
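The scale these figures imply can be sketched with simple arithmetic; the reactor comparison assumes, as the article implies, that a large nuclear reactor produces roughly 1 GW.

```python
# Back-of-the-envelope scale from the figures cited above.
# Assumption (mine): one large nuclear reactor ~ 1 GW.
gpus_per_rack = 576
rack_power_w = 1_000_000            # 1 MW per next-gen rack
datacenter_power_w = 1_000_000_000  # 1 GW per AI data center
racks = datacenter_power_w // rack_power_w
print(racks)                   # 1000 racks per data center
print(racks * gpus_per_rack)   # 576000 GPUs per data center
```

So a single 1GW data center would hold on the order of a thousand such racks.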

However, the current power supply architecture is too inefficient. The conversion efficiency from the public power grid all the way to the chip is about 87.6%. “This means that 12.4% of the energy is lost as heat in the process.”

For the aforementioned 1MW rack, the heat generated could simultaneously boil 124 hot pots, and even the most advanced liquid cooling technology may not be able to keep up.

The only solution is to fundamentally overhaul the entire AI data center’s power supply architecture from the ground up. This is the 800-volt High Voltage Direct Current (HVDC) power system architecture that Nvidia is vigorously promoting.
 