Cerebras Wants $26.6B. Can It Actually Crack Nvidia's Moat?
AI chipmaker Cerebras is going public on Nasdaq, targeting up to $3.5B in its IPO. With an OpenAI deal and real profits, the pitch is compelling—but the risks are hiding in plain sight.
OpenAI runs its latest AI model on Cerebras chips—not Nvidia's. That sentence alone is why Wall Street is paying attention.
Cerebras Systems filed an updated prospectus Monday, pricing its Nasdaq IPO at $115 to $125 per share. At 28 million shares, the offering could raise up to $3.5 billion and value the company at as much as $26.6 billion—up from the $23 billion valuation it commanded just three months ago in a venture round that counted Advanced Micro Devices among its backers.
The Numbers Behind the Hype
Unlike most AI darlings, Cerebras can point to actual profits. Fourth-quarter revenue grew 76% year over year to $510 million, and the company posted $87.9 million in net income for the period. In a sector littered with cash-burning startups, that matters.
But the single most important line in the prospectus isn't the revenue figure—it's a contract. In January, Cerebras announced it would supply OpenAI with up to 750 megawatts of AI computing power through 2028, in a deal valued at over $20 billion. That one agreement effectively pre-sells a significant chunk of Cerebras' capacity for the next three years, transforming what might otherwise look like a speculative chip bet into something closer to a contracted infrastructure play.
CEO and co-founder Andrew Feldman isn't selling a single share in the IPO. His post-IPO stake of 10.3 million shares would be worth up to $1.28 billion at the top of the range. Founders who don't sell at IPO tend to signal one of two things: genuine conviction, or shares already locked up in ways that prevent it. The filing suggests the former.
Why This IPO Is Different—And Why It Isn't
Cerebras tried this once before. It filed for an IPO in 2024, then quietly withdrew the paperwork mid-pivot, as it shifted from selling hardware outright to operating a cloud service built on its own chips. The second attempt arrives with a cleaner story: a revenue-generating cloud service, a marquee customer in OpenAI, and a market that has grown considerably more receptive to AI infrastructure plays.
Competitor CoreWeave—which rents out Nvidia GPUs as a cloud service—raised $1.5 billion in its own IPO last year despite being unprofitable. Cerebras is profitable and has a differentiated chip. On paper, the case for a higher multiple seems straightforward.
The complication is Nvidia. Cerebras' Wafer Scale Engine (WSE) is architecturally distinct—a single chip the size of an entire silicon wafer, designed to handle certain AI inference workloads faster and more efficiently than clusters of GPUs. The technology is real. But Nvidia's advantage isn't just performance; it's the CUDA software ecosystem, the deep integrations with every major cloud provider, and the sheer inertia of enterprise procurement. AMD has been trying to chip away at Nvidia's dominance for years with limited success. Cerebras is a much smaller company with a far less mature software stack.
Who Wins, Who Watches Nervously
For early-stage venture investors, this IPO is a liquidity event. For retail investors eyeing the $115–$125 range, the math requires scrutiny. At $26.6 billion, Cerebras trades at hundreds of times its quarterly net income—a valuation that prices in years of flawless execution and continued AI infrastructure spending. That's not necessarily wrong, but it's not a margin-of-safety investment.
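The back-of-envelope version of that multiple, using the figures cited above (and naively annualizing the single reported quarter, which is an assumption on our part, not a figure from the filing):

```python
# Sanity check of the valuation multiple using the article's figures.
market_cap = 26.6e9       # top-of-range valuation
q4_net_income = 87.9e6    # fourth-quarter net income

quarterly_multiple = market_cap / q4_net_income
# Naive annualization: assume four quarters like Q4 (an assumption, not guidance).
annualized_multiple = market_cap / (q4_net_income * 4)

print(f"{quarterly_multiple:.0f}x quarterly net income")
print(f"{annualized_multiple:.0f}x annualized net income")
```

Roughly 300 times a single quarter's profit, or about 75 times on a crude annualized basis, which is why "hundreds of times" is not an exaggeration.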
The broader market context matters too. IPO windows have been narrow since central banks raised rates in 2022. Cerebras' decision to push forward now reflects a calculated read that the window is open again—AI enthusiasm is running high, and waiting risks losing momentum. The company also has an option to sell an additional 4.2 million shares to underwriters after the IPO, potentially adding another $525 million at the top of the range.
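The offering math above checks out at the top of the range (a simple sketch using only the share counts and price disclosed in the article):

```python
# Offering proceeds at the $125 top of the range, per the prospectus figures cited above.
price_top = 125
base_shares = 28_000_000        # primary offering
greenshoe_shares = 4_200_000    # underwriters' over-allotment option

base_proceeds = price_top * base_shares            # the up-to-$3.5B raise
greenshoe_proceeds = price_top * greenshoe_shares  # the additional $525M if exercised

print(base_proceeds, greenshoe_proceeds)
```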
For the semiconductor industry, the more interesting question is structural. If Cerebras succeeds—not just in going public, but in actually scaling its cloud service—it demonstrates that vertically integrated AI compute (custom chip + proprietary cloud) is a viable business model outside of the hyperscalers. That would have implications for how investors value similar bets, and how enterprises think about their AI infrastructure choices.
This content is AI-generated based on source articles. While we strive for accuracy, errors may occur. We recommend verifying with the original source.