
Is this company an “Nvidia Killer?” What you need to know about the Cerebras IPO

The startup’s CEO says he’s after “all” of Nvidia’s market share.

The conventional wisdom is that Nvidia (NVDA) will continue to dominate the artificial intelligence (AI) chip market as it has since the introduction of ChatGPT. However, a wave of competition is coming, not only from merchant competitors and cloud giants building their own in-house accelerators, but also from AI chip start-ups.

One such start-up, Cerebras, has just filed a prospectus ahead of an imminent initial public offering (IPO). After reading, I think Cerebras is a name that every Nvidia investor should monitor closely. But is it really a threat to the graphics processing unit (GPU) giant?

Image source: Getty Images.

What is Cerebras?

Cerebras was founded in 2016 by current CEO Andrew Feldman and a group of technologists who had founded and/or worked at a company called SeaMicro more than a decade earlier. SeaMicro made efficient, high-bandwidth microservers and was acquired by Advanced Micro Devices (AMD) in 2012.

Cerebras sold its first AI chips in 2019 and recently experienced a big acceleration in demand, leading to this recent IPO filing.

The giant brain chip

Cerebras’ big differentiator is that its AI chips, which it calls wafer-scale engines (WSEs), are huge. And by huge, we mean a single chip that takes up an entire silicon wafer. A foundry typically produces many chips per wafer, discarding the defective ones. Cerebras instead turns each wafer into one giant chip.

The result is a massive processor 57 times larger than an Nvidia GPU, with 52 times more compute cores, 880 times more on-chip memory, and 7,000 times more memory bandwidth. A Cerebras WSE packs a remarkable 4 trillion transistors, about 50 times the 80 billion transistor count of Nvidia’s H200. Like Nvidia’s chips, Cerebras’ are made by Taiwan Semiconductor Manufacturing.

The theory behind building a giant chip is that by doing more processing on-chip, the WSE eliminates the need for the InfiniBand or Ethernet network connections that tie hundreds or thousands of GPUs together. According to Cerebras, this architecture allows WSEs to perform training and inference more than 10 times faster than an eight-GPU Nvidia system.

In a recent interview, Feldman said recent tests showed Cerebras’ chips to be 20 times faster for inference than Nvidia’s. Sound impressive? When Feldman was asked at a summer conference how much market share Cerebras plans to take from Nvidia, he replied, “Everything.”

Financial data shows a big acceleration

Not only does Cerebras talk a big game, but it has also shown impressive revenue acceleration and improved profitability this year, as you can see below:

Cerebras (Nasdaq: CBRS)      H1 2023       H1 2024
Hardware revenue             $1,559        $104,269
Services revenue             $7,105        $32,133
Total revenue                $8,664        $136,402
Gross profit                 $4,378        $56,019
Operating profit (loss)      ($81,015)     ($41,811)

Data source: Cerebras S-1. H1 = first half of the corresponding year. Figures in thousands of dollars.

As you can see, between the first half of 2023 and the first half of 2024, Cerebras’ revenue grew by 1,474%. Gross margin technically fell from 50.5% to 41.1%, but that was mainly because almost all of last year’s revenue came from higher-margin services; Cerebras’ hardware gross margin actually increased over the period. Even better, the operating loss narrowed by roughly $40 million, a good sign that the company can reach profitability as it scales.
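
For readers who want to check those numbers, here is a minimal Python sketch that reproduces them from the S-1 table above. The only assumption is that the table figures are in thousands of dollars, which is consistent with the roughly $40 million improvement in operating losses cited above.

# Reproduce the growth and margin figures cited above from the S-1 table.
# Assumption: the table amounts are stated in thousands of dollars.
h1_2023 = {"revenue": 8_664, "gross_profit": 4_378, "operating_loss": -81_015}
h1_2024 = {"revenue": 136_402, "gross_profit": 56_019, "operating_loss": -41_811}

revenue_growth = (h1_2024["revenue"] / h1_2023["revenue"] - 1) * 100
margin_2023 = h1_2023["gross_profit"] / h1_2023["revenue"] * 100
margin_2024 = h1_2024["gross_profit"] / h1_2024["revenue"] * 100
loss_reduction = h1_2024["operating_loss"] - h1_2023["operating_loss"]

print(f"Revenue growth: {revenue_growth:,.0f}%")                  # ~1,474%
print(f"Gross margin: {margin_2023:.1f}% -> {margin_2024:.1f}%")  # 50.5% -> 41.1%
print(f"Operating loss narrowed by ${loss_reduction / 1_000:,.1f} million")  # ~$39.2 million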

That rapid scaling should continue next year. According to the filing, Cerebras’ largest customer, Abu Dhabi-based G42, has agreed to purchase $1.43 billion worth of equipment by the end of 2025, roughly a six-fold increase from the current 2024 run rate.

Risks to the Cerebras story

There are some risks to the Cerebras story, though. One is that producing such a massive chip invites a lot of defects. While Nvidia or any other chipmaker can simply discard the bad chips from a wafer, Cerebras has to use the entire wafer, exposing its WSEs to imperfections.

To get around this, Cerebras says it builds “redundant” cores and interconnects into its chips, on the assumption that every wafer will contain defects. Those defects, the filing says, are designed to be detected, contained, and routed around.

However, building in redundancy also means Cerebras can’t extract the full potential of its chip area. Obviously, management believes the “big chip” architecture more than compensates for this inefficiency.

A second risk, and perhaps the biggest, is Cerebras’ customer concentration. UAE-based AI company G42 accounted for 87% of Cerebras’ sales in the first six months of 2024. G42 and related entities are also behind next year’s $1.43 billion order, meaning that concentration will only increase.

Concentration is somewhat expected in the early stages of a company’s growth. But if something goes wrong with the relationship, or with G42 itself, it could seriously derail Cerebras’ plans. G42’s close affiliation with a foreign government (the UAE’s national security adviser is the company’s founder and largest shareholder) certainly poses a risk should there be a geopolitical flare-up.

Cerebras is one to watch

When it goes public, Cerebras will be the new AI player on the block and will likely command a high valuation. So investors should be careful about how much they pay for the stock when it comes to market.

However, the company has an architecture that stands apart from the rest of the pack. So it’s definitely worth watching once it goes public, especially if you’re a big Nvidia or AMD shareholder.
