Is This Company an "Nvidia Killer?" What to Know About Cerebras' IPO


This start-up’s CEO says he’s eyeing “all of” Nvidia’s market share.

Conventional wisdom is that Nvidia (NVDA) will continue to dominate the artificial intelligence (AI) chip market, as it has since the introduction of ChatGPT. Yet, there’s a barrage of competition coming not only from merchant competitors and cloud giants producing their own in-house accelerators but also from AI chip start-ups.

One such start-up, Cerebras, just filed a prospectus ahead of an impending initial public offering (IPO). After reading, I think Cerebras is a name every Nvidia investor should monitor closely. But is it really a threat to the graphics processing unit (GPU) giant?

Image: Technician holds up a semiconductor wafer. Source: Getty Images.

What is Cerebras?

Cerebras was founded in 2016 by current CEO Andrew Feldman and a group of technologists who had previously founded or worked at SeaMicro, a maker of efficient, high-bandwidth microservers that Advanced Micro Devices acquired in 2012.

Cerebras sold its first AI chips in 2019 and has recently seen a big acceleration in demand, leading to this recent IPO filing.

Cerebras’ giant chip

Cerebras’ big differentiator is that its AI chips, which it calls wafer-scale engines (WSEs), are huge. And by huge, we’re talking a chip that takes up an entire semiconductor wafer. A foundry usually produces many chips per wafer, some of which have defects and are discarded. But Cerebras goes for one giant chip per wafer.

The result is a massive processor 57 times larger than an Nvidia GPU, with 52 times more compute cores, 880 times the on-chip memory, and 7,000 times more memory bandwidth. One Cerebras WSE has a remarkable 4 trillion transistors — that’s 50 times the 80 billion transistor count of Nvidia’s H200! Like Nvidia, Cerebras’ chips are produced by Taiwan Semiconductor Manufacturing.

The theory behind making a giant chip is that by doing more processing on the chip itself, the WSE does away with the need for the InfiniBand or Ethernet-based networking that strings hundreds or thousands of GPUs together. According to Cerebras, this architecture allows WSEs to achieve over 10 times faster training and inference than an 8-GPU Nvidia system.

In a recent interview, Feldman said tests showed Cerebras chips were 20 times faster at inference than Nvidia’s. Sound impressive? When Feldman was asked at a summer conference how much market share Cerebras planned to take from Nvidia, he answered, “All of it.”

Financials show a big acceleration

Not only does Cerebras talk a big game, but it’s also shown impressive revenue acceleration and improving profitability this year, as you can see:

Cerebras (Nasdaq: CBRS)

Metric                       H1 2023        H1 2024
Hardware revenue             $1,559         $104,269
Service revenue              $7,105         $32,133
Total revenue                $8,664         $136,402
Gross profit                 $4,378         $56,019
Operating profit (loss)      ($81,015)      ($41,811)

Data source: Cerebras S-1. Figures in thousands. H1 = first half of the corresponding year.

As you can see, between the first half of 2023 and the first half of 2024, Cerebras’ revenue jumped a whopping 1,474%. While gross margin technically declined from 50.5% to 41.1%, that was mainly because most of the prior-year period’s revenue came from higher-margin services; Cerebras’ hardware gross margin actually improved over that time. Even better, operating losses narrowed by nearly $40 million, a good indication that the company can become profitable as it scales.
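For readers who want to sanity-check those figures, here is a minimal Python sketch that recomputes the growth rate, gross margins, and loss improvement directly from the S-1 numbers in the table above (amounts in thousands); the variable names are illustrative, not from the filing.

```python
# Back-of-the-envelope check of the S-1 figures cited above (dollar amounts in thousands).
h1_2023 = {"revenue": 8_664, "gross_profit": 4_378, "operating_loss": -81_015}
h1_2024 = {"revenue": 136_402, "gross_profit": 56_019, "operating_loss": -41_811}

revenue_growth = (h1_2024["revenue"] / h1_2023["revenue"] - 1) * 100
gross_margin_2023 = h1_2023["gross_profit"] / h1_2023["revenue"] * 100
gross_margin_2024 = h1_2024["gross_profit"] / h1_2024["revenue"] * 100
loss_improvement = h1_2024["operating_loss"] - h1_2023["operating_loss"]

print(f"Revenue growth:       {revenue_growth:,.0f}%")     # ~1,474%
print(f"Gross margin, H1 '23: {gross_margin_2023:.1f}%")   # ~50.5%
print(f"Gross margin, H1 '24: {gross_margin_2024:.1f}%")   # ~41.1%
print(f"Operating loss narrowed by ${loss_improvement / 1_000:.1f} million")  # ~$39.2 million
```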

That exponential scaling should continue into next year. According to the filing, Cerebras’ largest customer, Abu Dhabi’s G42, agreed to purchase $1.43 billion of equipment through the end of 2025. That’s sixfold growth over the current 2024 run rate.

Risks to the Cerebras story

There are a couple of risks to the Cerebras story, however. One is that producing a single massive chip invites lots of defects. Whereas Nvidia or any other chipmaker can simply discard the bad chips on a wafer, Cerebras has to use the entire wafer, so its WSEs inevitably include imperfections.

To get around this, Cerebras says it builds “redundant” cores and interconnects into its chips on the assumption that defects are unavoidable. “Flaws are designed to be recognized, shut down, and routed around,” the filing says.

However, building in that redundancy means Cerebras can’t put the chip’s full surface area to productive use. Obviously, management believes the “big chip” architecture more than makes up for this inefficiency.

A second risk, and likely the biggest, is Cerebras’ customer concentration. G42, an AI company based in the United Arab Emirates, accounted for 87% of Cerebras’ sales in the first six months of 2024. G42 and affiliated entities are also behind next year’s $1.43 billion order, meaning that concentration will only grow.

Concentration is somewhat expected in the early stages of a company’s growth. But should anything go wrong with the relationship or G42 itself, it could seriously derail Cerebras’ plans. G42’s close affiliation with a foreign government — the UAE’s national security advisor is the company’s founder and largest shareholder — certainly poses a risk should there be a geopolitical flare-up.

Cerebras is one to watch

When it goes public, Cerebras will be a new AI player on the block and will probably sell for a high valuation. So, investors should be cautious about how much they pay for the stock when it comes to market.

Nevertheless, the company’s architecture is differentiated from the rest of the pack, so it’s certainly worth watching once it goes public, especially if you’re a big Nvidia or AMD shareholder.


