Cerebras Is Going Public at $40 Billion. Here's Why ...

Cerebras Systems — the AI chip startup that builds processors the size of a dinner plate — just filed for the biggest tech IPO of 2026. The numbers are almost hard to believe: a $40 billion valuation, up to $4 billion raised, and a $10.6 billion deal with OpenAI already in hand. Most people are looking at this and seeing another startup cashing out. That's the safe take. But dig a little deeper, and you'll find this IPO tells us something crucial about where the entire AI industry is headed — and why the next twelve months matter more than anything we've seen so far.

The surface story is impressive enough. In 2025, Cerebras pulled in $510 million in revenue — a 76% jump from the year before. More striking: it turned a $481 million loss into $238 million in net income. The company has a $24.6 billion backlog of unfilled orders sitting on its books. These aren't speculative promises either — they're contracts already signed.

But here's what nobody's talking about enough: this isn't just about chips. It's about a fundamental shift in how AI companies think about compute.

The Wafer-Scale Bet That's Finally Paying Off

Cerebras made a controversial bet back in 2019 when it unveiled the Wafer-Scale Engine — a single silicon wafer packed with 1.2 trillion transistors, covering 46,225 square millimeters. The entire industry laughed. The technical challenges were insane: a chip that big had never been manufactured at scale, and early prototypes had alarming failure rates. Conventional wisdom held that networking smaller chips together was the only viable approach.

The bet was simple: when you're running inference at massive scale, the communication between chips becomes the bottleneck, not the chips themselves. By building one continuous piece of silicon, Cerebras claimed it could eliminate that bottleneck entirely.

Fast forward to 2026, and that bet is driving the company's valuation. The OpenAI deal alone is worth over $10 billion — not $20 billion as some headlines claimed, but $10.6 billion according to the updated S-1 filing covering 750 megawatts of inference capacity through 2028. That's enough compute to power a small city, dedicated to serving AI queries.

Why This IPO Matters — Even If You Don't Buy the Stock

Here's the part that's easy to miss: Cerebras isn't trying to replace Nvidia. That's the misconception that could cost investors serious money.

Nvidia dominates training — the phase where AI models learn from data. That's where most of the capital has flowed, and that's where Nvidia's moat is deepest. Cerebras is targeting a different slice: inference, the phase where trained models answer queries in real-time. When you ask ChatGPT a question, that's inference. When you generate an image, that's inference. And here's what most people don't realize: as AI systems scale, inference costs eventually dwarf training costs.

Nvidia's own financial projections acknowledge this. The chip giant has been quietly building inference capacity for years. Cerebras is arguing it can do it more efficiently for specific workloads — particularly the large-scale, high-volume inference that major AI labs need.

This is the real story: it's not Nvidia vs. Cerebras. It's complementary. And whichever company wins the inference race will likely be the more valuable of the two in five years.

The Red Flags Nobody's Mentioning

Let's be honest about the risks. First — customer concentration. In 2025, just two customers generated 86% of Cerebras' revenue. That's customer concentration risk in its most extreme form. Yes, OpenAI is one of them, and yes, the deal is enormous. But if OpenAI pivots its strategy or decides to build in-house inference, Cerebras faces a massive revenue hole.

Second — the valuation math is aggressive. A $40 billion valuation on $510 million of trailing revenue works out to roughly 78x sales. Nvidia trades at roughly 23x trailing sales. Even accounting for growth, that's a steep premium.
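A quick back-of-the-envelope check of the price-to-sales math, using only the figures cited in this article (2025 revenue of $510 million, a $40 billion target valuation, and Nvidia's roughly 23x trailing multiple taken as given):

```python
# Sanity-check the valuation multiples using the article's figures.
cerebras_valuation_usd = 40e9        # reported IPO valuation
cerebras_trailing_revenue_usd = 510e6  # 2025 revenue per the S-1

cerebras_multiple = cerebras_valuation_usd / cerebras_trailing_revenue_usd
print(f"Cerebras price/sales: {cerebras_multiple:.1f}x")  # prints ~78.4x

nvidia_multiple = 23  # trailing price/sales, as cited above
premium = cerebras_multiple / nvidia_multiple
print(f"Premium vs. Nvidia: {premium:.1f}x")  # prints ~3.4x
```

In other words, even before growth assumptions, buyers at the offer price are paying more than three times Nvidia's revenue multiple.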

Third — multi-class share structure. Class B shares carry 20 votes per share. Early investors and insiders retain voting control despite owning less than half the equity. This isn't unusual for tech IPOs, but it matters for minority shareholders.

Finally, the $24.6 billion in backlog sounds impressive, but "remaining performance obligations" isn't revenue. Industry analysts estimate only about 15% of it will be recognized in 2026-2027. The rest is years away, contingent on everything going right.
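To put that recognition estimate in dollar terms, here is the simple arithmetic implied by the article's numbers (the 15% figure is the analyst estimate cited above, not a company disclosure):

```python
# How much of the backlog converts to near-term revenue if the
# ~15% analyst recognition estimate holds.
backlog_usd = 24.6e9       # remaining performance obligations per the S-1
near_term_share = 0.15     # analyst estimate for 2026-2027 recognition

recognized_2026_2027 = backlog_usd * near_term_share
deferred_usd = backlog_usd - recognized_2026_2027
print(f"Recognized 2026-2027: ${recognized_2026_2027/1e9:.1f}B")  # prints $3.7B
print(f"Deferred beyond 2027: ${deferred_usd/1e9:.1f}B")          # prints $20.9B
```

So of the headline backlog, only about $3.7 billion would hit the income statement in the next two years, with roughly $20.9 billion contingent on execution further out.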

What Happens Next?

If the IPO prices successfully in mid-May, expect immediate volatility. The $10 billion in preorders already reported suggests strong early demand, but that's before institutional investors have seen the full S-1 revisions.

Two scenarios: if Cerebras proves it can diversify beyond its top two customers and delivers on the OpenAI buildout, the valuation could sustain. If execution stumbles or AI infrastructure demand softens, we could see a repeat of late-2025 listings that priced below their ranges.

The bigger picture is simpler: this is the first major AI infrastructure IPO of the cycle, and it will set the tone for every listing that follows. Watch Cerebras carefully — it's not just a stock, it's a referendum on how the market values the AI buildout.