Tech’s splurge on AI chips has companies in ‘arms race’ that’s forcing more spending

Meta founder and CEO Mark Zuckerberg speaks during the Meta Connect event at Meta headquarters in Menlo Park, California, on Sept. 27, 2023.
Josh Edelson | AFP | Getty Images

Meta CEO Mark Zuckerberg has been assembling a large stockpile of Nvidia chips, spending billions of dollars so his company can develop and train advanced artificial intelligence models.

But even he says the AI hype may be driving too much investment.

“I think that there’s a meaningful chance that a lot of the companies are overbuilding now and that you look back and you’re like, oh, we maybe all spent some number of billions of dollars more than we had to,” Zuckerberg said on a podcast this week with Bloomberg’s Emily Chang.

He’s not the only one expressing that sentiment.

On Alphabet’s earnings call on Wednesday, CEO Sundar Pichai said his company may well be spending too much money on AI infrastructure, which largely consists of Nvidia’s graphics processing units (GPUs). But he sees little choice.

“When we go through a curve like this, the risk of underinvesting is dramatically greater than the risk of overinvesting for us here,” Pichai said.

In addition to Meta and Alphabet, Nvidia is racking up business from Microsoft, Amazon, Oracle and Tesla, which have all publicly pronounced that AI investment is a central priority for this year and the foreseeable future. Nvidia’s revenue has more than tripled for three straight quarters and is expected to more than double in the current period.

Alphabet and Tesla highlighted their AI buildout costs on earnings calls this week, and investors can expect to hear more next week, when Microsoft, Amazon and Meta report results.

Meta debuted its latest Llama AI model on Tuesday. The model, dubbed Llama 3.1, comes in three versions, the largest of which is Meta’s biggest and most capable AI model to date. Meta is sticking with open source, which means the technology can be accessed for free by outside developers, even as the company pours money into the underlying infrastructure.

Zuckerberg said on the podcast with Chang that companies are “making a rational decision” in their AI investments despite the exorbitant costs.

“Because the downside of being behind is that you’re out of position for like the most important technology for the next 10 to 15 years,” Zuckerberg said.

The way Pichai sees it, even if Alphabet is investing too much, the infrastructure is “widely useful for us.”

‘A threat and an opportunity’

Nvidia shares are up 131% this year after soaring 239% in 2023. The company is now valued at close to $3 trillion, behind only Apple and Microsoft, though it briefly surpassed the two of them in market cap in June.

Nvidia gets more than 40% of its revenue from Microsoft, Amazon, Google, and Oracle, which all need a hefty dose of GPUs for their public cloud offerings. While those are some of the most well-capitalized companies on the planet, there’s some concern brewing among investors about the massive stockpiling.

David Cahn, a partner at venture firm Sequoia, wrote in a blog post last week that the spending is driven by competition and game theory, creating a “cycle of competitive escalation.”

“The cloud giants see AI as both a threat and an opportunity and do not have the luxury to wait and see how the technology evolves,” Cahn wrote. “They must act now.”

Cahn calculated that the technology industry needs to generate $600 billion in annual AI revenue to justify all the money that’s been spent on data centers and chips.

On Wednesday, Cahn followed up by saying that Zuckerberg’s and Pichai’s comments about limiting downside bolstered his theory.

“Google and Meta CEOs both out in last 24 hours now agreeing with my AI Arms Race narrative: That AI CapEx is driven by game theory and FOMO vs. actual revenue / usage,” Cahn posted on LinkedIn.

Jensen Huang, co-founder and chief executive officer of Nvidia Corp., displays the new Blackwell GPU chip during the Nvidia GPU Technology Conference on March 18, 2024. 
David Paul Morris/Bloomberg via Getty Images

Nvidia says demand will remain strong through its newest generation of AI chips, called Blackwell, which will start to ship later this year. But it’s starting to address investor questions about return on investment as growth inevitably slows due to historically difficult comparisons.

Colette Kress, Nvidia’s finance chief, told investors in May that the company had calculated that when a cloud provider spends $1 on an Nvidia-based server, it can rent it out for $5. Goldman Sachs analysts said in a recent note that Nvidia is looking to share these types of data points to instill confidence in investors.

Tesla CEO Elon Musk said on his company’s earnings call on Tuesday that “demand for Nvidia hardware is so high that it’s often difficult to get the GPUs.” Tesla said capital expenditures on AI in the quarter amounted to $600 million, as the company invests in autonomous driving and humanoid robots.

Musk said Tesla is focusing on developing its own Dojo supercomputer because Nvidia chips are so pricey and so hard to get.

“I think we kind of have no choice because the demand for Nvidia is so high,” Musk said. “And it’s obviously their obligation essentially to raise the price of GPUs to whatever the market will bear, which is very high.”
