
Nvidia-backed Enfabrica releases system aimed at easing memory costs

Daniel Nenni

Admin
Staff member


SAN FRANCISCO (Reuters) - Enfabrica, a Silicon Valley-based chip startup working to solve bottlenecks in artificial intelligence data centers, on Tuesday released a chip-and-software system aimed at reining in the cost of memory chips in those centers.

Enfabrica, which has raised $260 million in venture capital to date and is backed by Nvidia, released a system it calls EMFASYS, pronounced like "emphasis."

The system aims to address the fact that a portion of the high cost of flagship AI chips from Nvidia or rivals such as Advanced Micro Devices comes not from the computing chips themselves, but from the expensive high-bandwidth memory (HBM) attached to them, which is required to keep those speedy computing chips supplied with data. Those HBM chips are supplied by makers such as SK Hynix and Micron Technology.

The Enfabrica system uses a special networking chip of its own design to connect the AI computing chips directly to boxes filled with another kind of memory chip, DDR5, which is slower than HBM but much cheaper.

By using special software, also made by Enfabrica, to route data back and forth between the AI chips and large amounts of lower-cost memory, Enfabrica hopes its chip will keep data center speeds up and costs down as tech companies ramp up chatbots and AI agents, said Enfabrica co-founder and CEO Rochan Sankar.
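As a rough illustration of the routing idea described above, here is a minimal Python sketch of a two-tier store that keeps hot data in a small HBM pool and demotes cold entries to a larger pool of networked DDR5. All names here (TieredKVStore, remote_ddr5, and so on) are hypothetical and do not reflect Enfabrica's actual software interface.

    from collections import OrderedDict

    class TieredKVStore:
        """Two-tier store: a small, fast HBM pool backed by a bigger, cheaper DDR5 pool."""

        def __init__(self, hbm_capacity_bytes, remote_ddr5):
            self.hbm_capacity = hbm_capacity_bytes
            self.hbm = OrderedDict()     # LRU-ordered hot tier (on-package HBM)
            self.hbm_used = 0
            self.remote = remote_ddr5    # dict-like cold tier (DDR5 box reached over the network)

        def put(self, key, value):
            if key in self.hbm:          # replacing an entry frees its old space first
                self.hbm_used -= len(self.hbm.pop(key))
            self._evict_until_fits(len(value))
            self.hbm[key] = value
            self.hbm_used += len(value)

        def get(self, key):
            if key in self.hbm:          # hot hit: served at HBM speed
                self.hbm.move_to_end(key)
                return self.hbm[key]
            value = self.remote[key]     # cold hit: microsecond-class remote read
            self.put(key, value)         # promote the entry back into HBM
            return value

        def _evict_until_fits(self, size):
            while self.hbm and self.hbm_used + size > self.hbm_capacity:
                old_key, old_val = self.hbm.popitem(last=False)   # least recently used entry
                self.remote[old_key] = old_val                    # demote it to cheap DDR5
                self.hbm_used -= len(old_val)

With a plain dict standing in for the networked memory box, store = TieredKVStore(8 * 2**30, {}) would keep the most recently used entries in the fast tier and spill everything else to the cheaper one.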

Sankar said Enfabrica has three "large AI cloud" customers using the chip but declined to disclose their names.


 
Tiered distributed DDR5 memory over Ethernet and CXL, with bandwidth multiplied by striping. "Read access times in microseconds." At least it's better than NAND flash. The Nvidia-backing part looks like their VC group. And Ethernet... packet dropping, or pausing Ethernet for distributed memories? I can't wait to see some system testing. :)
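For a sense of what "bandwidth multiplied by striping" could mean in practice, here is a back-of-the-envelope sketch in Python. The link count and per-port rate are placeholder assumptions, not Enfabrica's published figures.

    links = 4                      # hypothetical number of Ethernet/CXL ports a read is striped across
    per_link_gbps = 800            # assumed per-port line rate in Gbit/s
    aggregate_gbps = links * per_link_gbps
    print(f"~{aggregate_gbps / 8:.0f} GB/s of remote DDR5 bandwidth across {links} links")

    # Rough latency classes for comparison (orders of magnitude only):
    #   HBM on-package         : hundreds of nanoseconds
    #   networked DDR5 (claim) : microseconds
    #   NAND flash SSD         : tens to hundreds of microseconds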

 
"Those HBM chips are supplied by makers such as SK Hynix and Micron Technology."

Funny that they did not mention Samsung :LOL:
Dan, you should be so thankful that 99.99% of the mainstream financial press knows practically nothing about the semiconductor industry, even when their job is to write about it. Think about how smart and competent it makes you look.
 
"Dan, you should be so thankful that 99.99% of the mainstream financial press knows practically nothing about the semiconductor industry, even when their job is to write about it. Think about how smart and competent it makes you look."

Which is exactly why I started SemiWiki. I had never guessed that it would be this bad though. Hopefully AI will replace them soon.

I have friends at Enfabrica. I showed them the article and they mentioned that Samsung is an investor! :LOL: They have silicon, so I expect them to be acquired soon for billions of dollars. The CEO and others are from Broadcom.
 