
SK hynix Sees HBM Growing 30% Annually

Daniel Nenni

Admin
Staff member


SK hynix says demand for high-bandwidth memory (HBM) used in AI could grow about 30% a year through 2030, citing what it calls firm end-user appetite and rising cloud capex.

In an interview with Reuters, Choi Joon-yong, who oversees HBM business planning, put it plainly: AI demand from end users is "pretty much very firm and strong." He added that spending by Amazon, Microsoft and Google will likely be revised higher, which would be positive for HBM. SK hynix expects the custom HBM market to reach tens of billions of dollars by 2030.

That upbeat view plays down worries that higher pricing could cool orders in a segment long treated like a commodity. The company is betting that AI accelerators keep pulling HBM into both training and inference, supporting volume, mix and long-term visibility.

A 30% growth path signals durable demand and potential pricing power for leading AI memory suppliers. Moreover, SK hynix sees a long runway for HBM, and investors will watch 2025 capex updates from AMZN, MSFT, and GOOG for confirmation.
This article first appeared on GuruFocus.