
Meta announces 4 new AI chips, raising competitive stakes with Nvidia, AMD

Daniel Nenni

Admin
Staff member
Meta (META) has debuted four new AI chips as part of its Meta Training and Inference Accelerator family of in-house processors. The new chips are part of the social media giant’s strategy of using both commercial GPUs from Nvidia (NVDA) and AMD (AMD) and its own silicon, not only to meet artificial intelligence demands but also to ensure it isn’t overly reliant on any one vendor.

The chips — the MTIA 300, MTIA 400, MTIA 450, and MTIA 500 — are designed to handle different elements of Meta’s AI business, ranging from its ranking and recommendations (R&R) models up to high-end inferencing.

The MTIA 400 is meant for generative AI as well as R&R processes and, according to Meta, can be strung together in a larger server rack with 72 chips. It’s a similar idea to Nvidia’s NVL72 or AMD’s Helios racks.

Meta claims that the MTIA 400 is its first chip that provides cost savings as well as “raw performance competitive with leading commercial products.” The company doesn’t specifically say what products those include, but the only major commercial products similar to the MTIA 400 would be from Nvidia and AMD.

Interestingly, Meta recently signed multiyear, multigenerational deals for chips from both companies.

Nvidia CEO Jensen Huang holds a Rubin GPU during a Nvidia keynote address at CES 2026, an annual consumer electronics trade show, in Las Vegas, Nevada, U.S. January 5, 2026.  REUTERS/Steve Marcus


The MTIA 450 processor takes things further than the MTIA 400, with faster high-bandwidth memory, while the MTIA 500 adds more memory with even faster speeds.

The company says it has already begun using some of the chips and plans to deploy the others in 2026 or 2027. Importantly for Meta, they all share the same basic infrastructure, so the company can swap them out when they need to be upgraded.

Meta isn’t the only company leaning into using its own processors to power its AI capabilities. Google (GOOG, GOOGL) and Amazon (AMZN) have been using their own chips to train and run AI models for years, and Microsoft (MSFT) has its new Maia 200 processor.

Google and Amazon also rent out their chips to Anthropic, which uses them to run its AI models. And more recently, The Information reported that Google and Meta signed a multibillion-dollar deal that will see Meta use Google’s processors.

All of this stands as a potential headwind for both Nvidia and AMD. During its most recent earnings announcement, Nvidia CFO Colette Kress reported that “slightly more than 50%” of the company’s data center revenue was attributable to hyperscalers.

Still, she noted that revenue growth came from its remaining customers.

Hyperscalers don’t show any signs of slowing down their spending on AI data centers or third-party chips, either. In 2026 alone, Amazon, Google, Meta, and Microsoft plan to collectively spend $650 billion in capital expenditures, with the bulk of it going toward AI.

 
The in-house chips will also make the hyperscalers even more competitive against smaller data centers, further straining Nvidia's available customer pool. Not a problem in 2026, but 2030 might be a different story.
 
Arm is helping with this, going big on the custom ASIC business. I do not see any of this denting Nvidia. Look at the deals Jensen is pulling off!

Jensen Huang was at the Synopsys event today. That guy is having too much fun. GTC next week!
 