Cerebras: From Red to Green?


Using Daloopa’s MCP in Claude to Highlight a Changed Company

When Cerebras (ticker: CBRS) filed its S-1 in September 2024, I was skeptical (see my prior post). The headline numbers told a troubling story: a single customer, UAE-based G42, accounted for 95–97% of hardware sales in 2023 and the first half of 2024, and the CEO had a well-documented history of accounting impropriety. The red flags were hard to ignore.

Today, roughly 18 months later, that same company has signed a $10+ billion compute deal with OpenAI, raised $1 billion at a $23 billion valuation, and is preparing to refile for a public listing in Q2 2026. So what changed, and what still deserves further exploration?

What Has Changed Since Then

1. The G42 Customer Concentration Problem Has Materially Improved

This was the single biggest structural risk in the original filing, and it has meaningfully improved, though not because G42 disappeared from the picture.

In January 2026, Cerebras announced a multi-year compute agreement with OpenAI worth over $10 billion, covering 750 megawatts of AI inference capacity through 2028. That single deal is transformational: it introduces a blue-chip, U.S.-domiciled customer at a scale that simply cannot be dismissed as a side relationship.

Beyond OpenAI, Cerebras has publicly disclosed commercial relationships with AWS, Meta, IBM, Mistral, Cognition, AlphaSense, Notion, GlaxoSmithKline, the Mayo Clinic, the U.S. Department of Energy, and the U.S. Department of Defense — a roster that looks nothing like the 2024 filing.

The most recent publicly disclosed G42 concentration figure (87% of H1 2024 revenue) still stands, and prospective investors will need the updated S-1 to confirm the improvement. But the direction is clear.

| Metric | 2024 S-1 Filing | Current Picture (2026) |
| --- | --- | --- |
| Primary Customer | G42 (87–97% of revenue) | Diversified: OpenAI, Meta, IBM, DoD, Mayo Clinic, others |
| Largest Known Deal | G42 purchase orders | OpenAI: $10B+ compute agreement (750MW through 2028) |
| Valuation | $4B (2021 last round) | $23B (Series H, Feb 2026) |
| Funding Stage | Pre-IPO, S-1 filed | S-1 withdrawn, refile targeted Q2 2026 |
| CFIUS Status | Under review (blocked IPO) | Resolved (March 2025) |
| Lead Bank | Citigroup | Citigroup + Barclays (Series G placement); IPO bankers TBD |
| Gross Margin | ~41% (H1 2024, compressed by G42 discounts) | Not yet disclosed in updated filing |
| Revenue Run Rate | ~$70M in Q2 2024 | Projected $300–$350M full-year 2025 (per analyst estimates) |

2. The CFIUS Overhang Is Resolved

In March 2025, CFIUS cleared Cerebras’s request to allow G42 to hold its minority investment. This was the decisive gate that had stalled the original IPO indefinitely. With that review complete, the path to the public markets reopened.

This is not a trivial update. CFIUS reviews of semiconductor companies with Middle Eastern backers were a genuine wildcard through 2024. The clearance removes a regulatory binary risk that was essentially unquantifiable.

3. The Fundraising Trajectory Is Remarkable

The valuation progression tells its own story:

  • 2021 Series F: $4 billion valuation
  • September 2025 Series G: $8.1 billion valuation ($1.1B raised, oversubscribed)
  • February 2026 Series H: $23 billion valuation ($1B raised, led by Tiger Global)

The Series H round saw participation from Benchmark, Fidelity, Atreides, Alpha Wave, Altimeter, AMD, Coatue, and 1789 Capital. AMD’s participation is notable — a strategic signal from Nvidia’s most prominent public-market rival that alternative chip architectures warrant a seat at the table.

Nearly tripling in valuation in five months is extraordinary, and it also raises the first question any public-market investor should ask: at what multiple does this become a leap of faith?
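The round-over-round step-ups are easy to verify from the figures above (a back-of-envelope sketch using the post's round numbers, not any primary-source filing):

```python
# Step-up multiples between the funding rounds cited above.
# Valuations in billions of USD, as reported in the post.
rounds = [
    ("Series F (2021)", 4.0),
    ("Series G (Sep 2025)", 8.1),
    ("Series H (Feb 2026)", 23.0),
]

for (prev_name, prev_val), (next_name, next_val) in zip(rounds, rounds[1:]):
    step_up = next_val / prev_val
    print(f"{prev_name} -> {next_name}: {step_up:.2f}x")
```

The Series G to Series H step-up works out to roughly 2.8x in five months, which is what "nearly tripling" refers to.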

4. The Technology Has Earned Third-Party Validation

Claims about the Wafer Scale Engine’s performance advantages are no longer purely self-reported. Independent benchmarking firm Artificial Analysis has consistently ranked Cerebras as the fastest AI inference provider across hundreds of models. The company claims speeds more than 20 times faster than Nvidia GPUs on comparable workloads.

The OpenAI deal is the most powerful external validator available — OpenAI evaluated the architecture, ran technical due diligence, and committed over $10 billion. That is not a marketing relationship; it is a production infrastructure dependency.

The company also reports serving trillions of tokens per month and holding the top spot on Hugging Face for inference providers with over 5 million monthly requests.

What Still Warrants Exploration

The story has materially improved. But a balanced read requires holding onto the legitimate open questions:

Valuation vs. Revenue

The $23 billion valuation implies a very high multiple on disclosed financials. Revenue was approximately $136 million in H1 2024; analysts tracking the company’s filings estimate full-year 2025 revenue in the $300–$350 million range. At $23 billion on ~$350 million in revenue, that is roughly a 65x multiple, a number that requires flawless execution and continued hyper-growth to justify.
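The implied multiple across the estimated revenue range can be sketched in two lines (a rough check only: it treats the round valuation as the full company value and ignores cash, dilution, and deal structure):

```python
# Implied valuation/revenue multiple at the figures cited above.
valuation_m = 23_000          # $23B valuation, in millions of USD
for revenue_m in (300, 350):  # analyst-estimated FY2025 revenue range
    multiple = valuation_m / revenue_m
    print(f"${revenue_m}M revenue -> {multiple:.0f}x")
# $300M revenue -> 77x
# $350M revenue -> 66x
```

Even at the top of the estimated revenue range, the multiple sits in the mid-60s.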

For comparison, Nvidia trades at elevated multiples but with a different scale of revenue and demonstrated profitability. The public market will price Cerebras differently than private-round investors who are willing to underwrite an option on dominance.

Gross Margin Volatility

The 2024 S-1 disclosed gross margins that improved from 11.7% in 2022 to 33.5% in 2023, reaching approximately 41% in early 2024 before compression from G42 volume discounts. The OpenAI deal structure (cloud services rather than direct hardware sales) may carry different margin characteristics. The updated S-1 will be essential reading on this point.

Execution at Scale: 750MW Is a Large Commitment

Delivering 750 megawatts of wafer-scale compute capacity through 2028 is an ambitious infrastructure build. Wafer-scale chips are notoriously difficult to manufacture at high yield. Any supply chain disruption, manufacturing yield problem, or delay in data center buildout would affect ChatGPT’s user experience, and would be very public. The operational risk here is real.

Single-Product Architecture Risk

Cerebras competes against Nvidia, AMD, and a growing field of AI chip startups (Groq, SambaNova, etc.) with a highly differentiated but concentrated bet on wafer-scale design. If the next generation of AI workloads shifts away from the inference use cases where Cerebras excels — for example, if reasoning models become cheaper to run on commodity hardware — the competitive advantage could narrow.

Sam Altman’s Dual Role

OpenAI CEO Sam Altman is an early personal investor in Cerebras. The $10 billion deal was therefore struck with a supplier in which OpenAI’s CEO holds a personal financial stake. This conflict-of-interest dynamic will likely invite scrutiny.


The Bottom Line

When I wrote the original piece, the bear case was straightforward: single-customer dependency, regulatory uncertainty, non-standard banking choices, and a CEO with a checkered past. I stand by that skepticism.

Today, the bear case is more nuanced. The structural customer concentration risk has been addressed by the most credible counterparty imaginable. The regulatory overhang is cleared. The technology has earned legitimate third-party validation. The funding trajectory reflects genuine institutional conviction.

What remains is a valuation question. Cerebras is no longer a company where the main concern is whether the business is genuine. The current questions are: Can it sustain differentiation as Nvidia keeps improving? Can it actually deliver 750MW of compute at the margin profile suggested by the OpenAI deal? Can it turn its inference dominance into a lasting position in training, where Nvidia’s moat remains largely unchallenged?

The updated S-1, expected in Q2 2026, will be the most important document for answering those questions with actual numbers. Watch the gross margin disclosure on cloud/inference contracts, the revenue breakdown by customer post-OpenAI, and whatever the company discloses about backlog conversion.

In 2024, I would have told you to wait for more evidence. In 2026, the evidence has arrived — and it is more interesting than I expected. That doesn’t mean the valuation is right. It means the analysis just got harder.


This analysis is for informational purposes only and does not constitute investment advice. Financial data sourced from Cerebras public filings, press releases, and third-party reporting through March 2026.
