Cerebras to raise IPO price range to $150-$160 as demand surges, sources say

Daniel Nenni

Founder
Staff member

- Cerebras to raise IPO price range, share count amid strong demand, sources say
- Orders exceed 20 times available shares as AI chip demand surges, they say
- IPO would be largest in the world so far in 2026, according to Dealogic

May 10 (Reuters) - Cerebras Systems is set to raise the size and price of its initial public offering as soon as Monday, as demand for the artificial intelligence chipmaker's shares continues to climb, two people familiar with the matter told Reuters on Sunday.

The company is considering a new IPO price range of $150-$160 a share, up from $115-$125 a share, and raising the number of shares marketed to 30 million from 28 million, said the sources, who asked not to be identified because the information isn't public yet.

At the top of the new range, Cerebras would raise roughly $4.8 billion, up from $3.5 billion under its original terms, though the figures remain subject to change before pricing, the people said.
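
For reference, the quoted totals follow directly from shares marketed times the top-of-range price. A quick back-of-the-envelope sketch (gross proceeds before underwriting fees, ignoring any overallotment option):

```python
# Back-of-the-envelope check of the proceeds figures quoted above.
# Assumes gross proceeds = shares marketed * top-of-range price,
# before underwriting fees and ignoring any overallotment option.

def gross_proceeds(shares: int, price_per_share: float) -> float:
    """Gross IPO proceeds in dollars, before fees."""
    return shares * price_per_share

original = gross_proceeds(28_000_000, 125.0)  # original terms
revised = gross_proceeds(30_000_000, 160.0)   # revised terms

print(f"original terms: ${original / 1e9:.1f}B")  # -> $3.5B
print(f"revised terms:  ${revised / 1e9:.1f}B")   # -> $4.8B
```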

The increase follows a broader surge in AI adoption that has driven sharp demand for high-performance chips and turned semiconductors into a key bottleneck in the technology supply chain. Cerebras' IPO has drawn orders for more than 20 times the number of shares available, the people said, as the chipmaker looks to manage surging interest ahead of its May 13 pricing.

Cerebras did not immediately respond to a request for comment.

Bloomberg News previously reported the company planned to raise the price range of the IPO to $125-$135 per share.

Sunnyvale, California-based Cerebras makes specialized chips for running advanced AI models in a market dominated by Nvidia. The company is seeing surging demand for its processors as AI labs shift from training models to deploying them. Its chips are better suited for inference, the computations that allow AI models to respond to user queries, than the GPU chips the industry has long relied on for model training.

The IPO next week would mark Cerebras' second attempt to go public - the company first filed for an IPO in 2024 but pulled that plan last year. Its partnership with G42, a UAE-based AI company that provided more than 80% of its revenue in the first half of 2024, had drawn a national security review by the Committee on Foreign Investment in the United States. The committee eventually cleared the deal.

Since then, Cerebras has secured Amazon and OpenAI, two of the biggest builders of AI infrastructure in the world, as customers.

The listing would be the biggest IPO globally so far this year, according to Dealogic.

The offering is being led by Morgan Stanley, Citigroup, Barclays and UBS Group AG. Cerebras plans for its shares to trade on the Nasdaq Global Select Market under the symbol CBRS.

 
could be the best IPO timing of all time
It is interesting - they have reached the market with an unassailable differentiator (working wafer-scale AI-oriented systems with tons of on-wafer SRAM and I/O bandwidth), precisely as the largest and fastest-growing AI market (data center inference) demands those capabilities. The tough question for them is whether they can optimize the inference hardware and software stack sufficiently to address the whole market as it evolves, or just the premium low-latency segment.
 
Since Cerebras supports multi-user workloads on common hardware, doesn't this capability answer your question?
 
Would love to see where they are on the Pareto frontiers for TCO (cost per million tokens) vs interactivity (TPS and TTFT latency) for a large-scale, many-user request benchmark, like the attached InferenceX charts. I suspect they would show up as high TCO but extremely low latency, and wouldn't have as much of a tradeoff space as NVIDIA or AMD.
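
For concreteness, here is a minimal sketch of how those two benchmark axes are computed; every system name and number in it is a hypothetical placeholder, not measured data:

```python
# Illustrative sketch of the two benchmark axes described above:
# TCO as cost per million output tokens, interactivity as TTFT and
# per-user decode speed. All names and numbers are hypothetical
# placeholders, not measured Cerebras, NVIDIA, or AMD data.

from dataclasses import dataclass


@dataclass
class InferenceSystem:
    name: str
    hourly_cost_usd: float  # assumed fully loaded system cost per hour
    aggregate_tps: float    # total decode tokens/second across all users
    ttft_seconds: float     # time to first token at this load
    per_user_tps: float     # decode speed seen by a single request

    def cost_per_million_tokens(self) -> float:
        # TCO axis: dollars per million output tokens.
        tokens_per_hour = self.aggregate_tps * 3600
        return self.hourly_cost_usd / tokens_per_hour * 1_000_000


systems = [
    InferenceSystem("wafer-scale, low-latency point", 400.0, 40_000, 0.2, 900),
    InferenceSystem("GPU cluster, high-batch point", 300.0, 120_000, 1.5, 60),
]

for s in systems:
    print(
        f"{s.name}: ${s.cost_per_million_tokens():.2f}/M tokens, "
        f"TTFT {s.ttft_seconds:.1f}s, {s.per_user_tps:.0f} tok/s per user"
    )
```

Sweeping batch size (or concurrent users) would trace each system's curve in this plane; the conjecture above is that the wafer-scale curve sits up and to the left - pricier per token, far lower latency - with less room to trade between the two.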
 

Attachments

  • InferenceX_DeepSeek-V4-0528_e2e-1778526341235.png (372 KB)
  • InferenceX_DeepSeek-V4-0528_interactivity-1778526336466.png (377.2 KB)