Don't Worry, AI Investors, the Artificial Intelligence Boom Is Still on — But There Are Rising Dangers for Nvidia


Although there are no signs of the artificial intelligence (AI) boom slowing down, AI stocks writ large have been hammered in 2025 due to the Trump administration's tariff threats and policies.

AI stocks had, of course, risen significantly over the prior two years, so they came into 2025 at relatively high valuations. And the threat of an economic recession has legitimately called into question the massive AI investments forecast by leading tech companies earlier this year.

Fortunately, on Wednesday, April 9, the administration pared back the most extreme reciprocal tariffs. Also last week, several big tech CEOs reiterated their huge AI investment plans this year while indicating that AI demand remains incredibly strong.

But with those bullish comments also came some remarks that pose risks for the biggest AI company of all: Nvidia (NVDA).

AI appears recession-proof

First, the good news. Even amid the tremendous market turmoil, the AI revolution still appears to be in full swing. In fact, two "Magnificent Seven" company CEOs confirmed as much just last week, even as markets plunged.

First, Alphabet (GOOG) (GOOGL) held its Google Cloud Next 2025 event last week, which probably flew under many investors' radars. The event featured a number of exciting announcements, especially around Google's new Gemini 2.5 model. CEO Sundar Pichai also confirmed Alphabet's previously announced plans to spend a massive $75 billion on AI data centers this year, adding that the spending is yielding good returns: "The opportunity with AI is as big as it gets."

And not only did Alphabet executives talk bullishly about their AI offerings and Google Cloud, but Google’s cloud customers did, too. At the event, customer Intuit confirmed that it was “doubling down” on AI efforts, while another big customer, Verizon, described huge benefits from using Google’s AI models.

Meanwhile, it's not just the tariff fallout but also the January introduction of China's low-cost model DeepSeek R1 that has roiled AI stocks. But on Thursday, Amazon (AMZN) CEO Andy Jassy put to rest concerns over the need for all that spending in an interview on CNBC. He noted:

People get confused. And we saw this with AWS [Amazon Web Services] too, which is, customers love when you take the cost per unit of something down, it allows them to save money on what they’re doing, but they don’t actually spend less. It actually unleashes them to do a lot more innovation, and in absolute they spend more.

In his letter to shareholders, also published on Thursday, Jassy noted, “Generative AI is going to reinvent virtually every customer experience we know, and enable altogether new ones about which we’ve only fantasized.” Jassy also reiterated that Amazon is seeing triple-digit growth rates in AI revenues.

So, while there certainly is cause for concern at the macro level, technology insiders still strongly believe generative AI will transform the world, and none of them want to be left behind when it does. That likely means AI spending will continue to be resilient, regardless of the economy.

Two cloud giants pledge to lower AI costs, taking it to Nvidia

While the AI revolution remains intact, there are certainly changing dynamics, especially around the costs of AI. Those concerns would only be amplified in a recession.

That could make things incrementally more difficult for Nvidia. Up until now, Nvidia has been synonymous with the AI buildout, and demand for its new Blackwell chip seems incredibly strong.

However, in addition to making bullish comments on AI, both Amazon and Google noted their tremendous efforts to bring down the cost of AI. Jassy, in particular, didn’t mince words when he noted that the costs of AI have to come down:

AI does not have to be as expensive as it is today, and it won’t be in the future. Chips are the biggest culprit. Most AI to date has been built on one chip provider. It’s pricey.

There's no secret as to whom Jassy is talking about: Nvidia. Nvidia's chips can cost anywhere between $30,000 and $40,000 apiece. So, when one hears about hundred-thousand-GPU or even million-GPU clusters, it's clear why the AI buildout costs so much. And with Nvidia making gross margins of 75% or even higher, one could say that Nvidia may be over-earning today — that is, of course, if anyone else could make a more cost-competitive chip.

While Nvidia and its CUDA software are currently ruling the day, all of the well-funded cloud giants are certainly trying to change that. Jassy went on to say that Amazon’s current Trainium2 chip generation offers 30%-40% better price performance relative to Nvidia’s current instances, which likely means the H100. While Nvidia is working on ramping its new Blackwell chip, Amazon is also at work on Trainium3.

In the interview, Jassy noted, “If you sat in meetings with the AWS team right now, they feel like it’s their responsibility and their mission to make the cost of A.I. meaningfully less than today.”

Meanwhile, at the Alphabet event, management unveiled what looks to be an incredibly powerful new in-house chip called Ironwood. Ironwood is the company’s seventh-generation tensor processing unit (TPU), which Alphabet uses for its own internal AI workloads.

The new chip is designed to run in either 256-chip servers or in massive 9,216-chip clusters, which Google plans to use not only for its Gemini models but also for cloud clients that wish to train their own models. Each Ironwood chip offers six times the memory of the prior generation of TPUs. And the performance is massive: each Ironwood chip delivers peak compute of 4,614 teraflops, which Google says is roughly 10 times the fifth-generation TPU and 5 times the sixth generation.

Nvidia will be getting some serious competition

No doubt, Nvidia has a multiyear head start in making AI chips, and its CUDA software acts as a bit of a moat, at least for now. However, Amazon and Google are massively powerful companies that can produce chips at close to manufacturing cost, whereas Nvidia currently makes a 75% gross margin. Since a 75% gross margin means production cost is only about a quarter of the selling price, Nvidia's chips cost the cloud companies roughly 4 times what comparable in-house chips cost them to produce.

Given both companies' missions to lower the costs of AI and cut out Nvidia as the middleman, so to speak, Nvidia's revenue growth could eventually slow, or its margins could come down. But that will happen only if Amazon, Google, and other cloud giants prove successful in designing and implementing their own silicon and making it easy for AI developers to use.
