
Nvidia CEO details AI infrastructure ROI with Goldman Sachs

“What Nvidia is really good at is creating new markets,” Nvidia CEO Jensen Huang told Goldman Sachs CEO David Solomon at a technology conference hosted by the investment bank on Wednesday. Nvidia investors and AI stakeholders continue to hang on Huang’s every word since Nvidia’s less-than-blockbuster (though only relative to previous blowouts) earnings report in August.

While Nvidia’s chips may seem ubiquitous today, Huang said the company had to spread the gospel of GPU-based computing “one industry at a time.” That part is clear to investors, as Nvidia’s staggering revenue and total AI infrastructure spending estimates exceeding $1 trillion provide ample evidence.

Solomon asked the most important question in technology today: Where is the ROI from all this investment in AI infrastructure?

Huang has asked similar questions before, but on Wednesday he offered more math in a two-pronged answer.

First, in the age of generative AI, Huang said, cloud providers that buy Nvidia (and other) GPUs and lease them to tech companies make $5 for every dollar they spend.

“Everything is sold. And so the demand for this is just incredible,” he continued.

Second, for customers of those cloud providers, who essentially rent computing time on GPUs, Huang said that if companies convert traditional data-processing work to accelerated computing, the incremental cost may double, it’s true, but the work gets done 20 times faster. “So you’re getting 10-fold savings,” Huang said, adding, “It’s not uncommon to see this ROI.”
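Huang’s back-of-the-envelope math can be sketched as follows. The figures (2x cost, 20x speedup) are the ones he quoted; the interpretation that total cost scales with run time is an assumption:

```python
# Sketch of the accelerated-computing ROI arithmetic Huang described.
# Assumption: the "incremental cost" doubles per unit of compute time,
# while the same work finishes ~20x faster.
cost_multiplier = 2.0   # accelerated run costs ~2x as much per unit time
speedup = 20.0          # the same workload completes ~20x faster

# Cost per unit of completed work, relative to the traditional baseline:
relative_cost = cost_multiplier / speedup   # 2 / 20 = 0.1

savings_factor = 1.0 / relative_cost
print(f"Effective savings: {savings_factor:.0f}x")  # prints "Effective savings: 10x"
```

Under these assumed figures, paying twice as much for a job that finishes twenty times sooner nets out to the 10-fold savings Huang cited.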

The data center of the future

Huang’s general answer to the question of when AI’s ROI will be evident urges companies to “accelerate everything.”

“Any large-scale processing of any large amounts of data — you have to speed that up,” he said.

Huang argued that upgrading existing data centers to “accelerated computing,” or the parallel computing that Nvidia’s GPUs and other AI chips enable, is inevitable.

With price tags for some models reaching into the millions, Nvidia’s server racks sound expensive, Huang said, but they replace “thousands of nodes” of traditional computing. Smaller, denser, liquid-cooled data centers are the future, he said.

“Densified” data centers will be more energy efficient and cost efficient, Huang said, adding, “That’s the next ten years.”

Nvidia’s stock price rose within an hour of the start of Huang’s conversation with Solomon.

Even if Nvidia’s claims that AI computing is more energy efficient are true, the AI boom is expected to put massive strain on power grids.

Huang acknowledged that the stakes are high for everyone involved in the AI boom.

“The demand is so high that the delivery of our components and our technology and our infrastructure and software is really emotional for people because it directly affects their revenues,” Huang said.
