Cerebras Challenges Nvidia’s Chip Dominance with Highly Hyped IPO
Cerebras says its chips can perform inference work faster than Nvidia’s GPUs, which are less specialized for the task.
Cerebras is on track to pull off the biggest tech IPO of the year so far. Among all listings, it’s second only to last month’s combined debut of Bill Ackman’s Pershing Square Inc. and Pershing Square USA Ltd.
The AI chipmaker said Monday it plans to list its shares at $150 to $160 each, up from its previous range of $115 to $125 a share on the back of strong demand. Altogether, its 30 million shares could bring in up to $4.8 billion, implying a market cap of more than $34 billion at the high end. Cerebras raised the range after orders poured in for more than 20 times the number of available shares, Bloomberg reported.
Chips from Cerebras could make hyperscalers less reliant on market leader Nvidia as companies figure out how to run AI more cost-efficiently.
Divvying Up Nvidia’s Work
Cerebras is known for chips that handle inference workloads, meaning they help AI models answer users’ queries. Its chips run AI after it has already been trained, an edge at a time when companies have shifted their focus from building AI to keeping it humming along efficiently.
Cerebras caters to companies that are trying to reduce their reliance on pricey GPUs from Nvidia, which dominates the AI chip industry:
- The main chip Cerebras sells, called the Wafer-Scale Engine 3, could provide an alternative to Nvidia’s GPUs; Cerebras says its chips can perform inference work faster and use less power than Nvidia’s GPUs, which are less specialized for inference work. Cerebras secured a $20 billion commitment in January from OpenAI to help supply the ChatGPT parent with compute power, and Amazon Web Services said in March it would bring Cerebras chips into its data centers. It’s a defining shift for Cerebras, which previously relied on G42, an Abu Dhabi-based AI company, for the majority of its revenue.
- At the same time, major tech companies including Amazon and Google are creating their own chips meant to take some of the workload off Nvidia’s hardware. Nvidia’s GPUs are still the gold standard for many AI operations, but a new wave of chips could let companies supplement them with cheaper, more efficient sources of compute power.
Seeking Savings: AI is shifting to a new stage. Now that companies have trained their models, they’re investing in the infrastructure needed to keep them running. Chips specialized in running, rather than training, AI can save companies money. And companies need those cost savings as onlookers question their massive capital expenditures. Layoffs alone aren’t enough to offset AI’s costs, especially in cases where AI has been found to cost more than the human workers it’s meant to replace.