
Cerebras to raise IPO price range to $150-$160 as demand surges, sources say

Three updates on this one:
1) This article is much better than the S1 for us hardware types.


2) The article delves into the economics, the potential Pareto curve, and the limitations of Cerebras - I'm not going to summarize it all here. The most important point is that Cerebras is good at fast tokens, and fast tokens are about 6x more valuable (at least at OpenAI's current rates) than normal tokens. No actual Pareto frontiers for Cerebras yet, though.

The company is considering a new IPO price range of $150-$160 a share, up from $115-$125 a share, and raising the number of shares marketed to 30 million from 28 million, said the sources, who asked not to be identified because the information isn't public yet.
At the top of the new range, Cerebras would raise roughly $4.8 billion, up from $3.5 billion under its original terms, though the figures remain subject to change before pricing, the people said.
3) Pricing is now $185 / share
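As a quick sanity check on the quoted figures, the raise amounts follow directly from shares marketed times the top-of-range price. This is only a rough sketch: actual proceeds depend on fees, any over-allotment option, and the primary/secondary share mix, none of which are given here.

```python
# Rough sanity check of the quoted IPO figures.
# Gross proceeds here are simply shares marketed x top-of-range price;
# real proceeds depend on fees, over-allotment, and share mix.

def gross_raise(shares: int, price_per_share: float) -> float:
    """Gross proceeds if all marketed shares price at the given level."""
    return shares * price_per_share

original = gross_raise(28_000_000, 125)   # original terms, top of range
revised  = gross_raise(30_000_000, 160)   # revised terms, top of range

print(f"original: ${original / 1e9:.2f}B")  # matches the ~$3.5B figure
print(f"revised:  ${revised / 1e9:.2f}B")   # matches the ~$4.8B figure
```

If the share count stayed at 30 million, the final $185 price would imply roughly $5.55 billion gross, though the thread doesn't confirm the final share count.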
 
I read the section on the WSE-3 I/O networking earlier. I'm still going over it to make sure I understand what the authors are saying, but I'm not convinced the article accurately represents how the Cerebras system uses I/O. Yet.
 
I've often wondered... how much would Cerebras be worth if they achieved the same results without wafer-scale? How big is the wafer-scale premium? Can wafer-scale really be applied more broadly than AI? (I'm currently a skeptic.)
 
how much would Cerebras be worth if they achieved the same results without wafer-scale?
Not sure they could have. What would they have done - another Groq?
How big is the wafer-scale premium?
That's a good question - maybe we'll get a better view on their Pareto cost / interactivity tradeoffs as they grow. And their collaboration with Amazon on disaggregation should be revealing in how well they can build heterogeneous systems that offer a broader set of model operating cost / interactivity points.
Can wafer-scale really be applied more broadly than AI?
Who knows? They made a good bet on AI as the killer app back in 2016 - an app that benefits from huge scale / density / interconnect, plus plenty of high-bandwidth SRAM, with enough value that the added costs (yield, HW and SW R&D, cooling) are secondary. Maybe there will be another app like AI some day. But until then, there's still a lot of running room with AI in the data center.
 