“Meta Platforms is in talks to use chips made by Google in its artificial-intelligence efforts, a step toward diversifying away from its reliance on Nvidia, according to people familiar with the matter.
A deal could be worth billions of dollars, but the talks are continuing and may not result in one.
It is still up in the air whether Meta would use the chips, known as tensor processing units, or TPUs, to train its AI models or to do inference, one of the people said. Inference, the process a trained model uses to generate the response to a query, requires less computational power than training.
Google has been working for years to refine its chips and scale up that part of its business. A significant deal with Meta would represent a potential crack in Nvidia's market dominance for Google and other chip makers to exploit. Nvidia's shares ended down 2.6% on Tuesday.
Google said its Google Cloud is experiencing "accelerating demand" for both its custom TPUs and Nvidia GPUs and that the company is "committed to supporting both, as we have for years."
Nvidia and Meta declined to comment.
"The biggest story in AI right now is that Google and Nvidia are being extraordinarily competitive," said Adam Sullivan, chief executive of data-center operator Core Scientific. "They're in a race to secure as much data-center capacity as they can."
Both Nvidia and Google, a unit of Alphabet, are courting potential customers and offering them financing arrangements to help ease the purchase of their chips.
"They don't care about how much revenue they generate," Sullivan said. "This is about who gets to [artificial general intelligence] first."
The tech news site The Information first reported the talks between Google and Meta on Monday night.
Tuesday, after its shares fell in early trading, Nvidia posted a statement on X: "We're delighted by Google's success -- they've made great advances in AI and we continue to supply to Google. NVIDIA is a generation ahead of the industry -- it's the only platform that runs every AI model and does it everywhere computing is done."
Google first began using its TPU chips about a decade ago. In 2018, it began offering its cloud customers the opportunity to use TPUs for their training and inference needs.
More recently, Google has used the chips to train and operate its own Gemini large language models and sold them to customers including Anthropic, developer of the Claude AI model. Last month, Anthropic announced that starting next year it would spend tens of billions of dollars to buy up to one million Google TPUs -- enough to supply roughly 1 gigawatt of computing capacity -- to power more AI research and help Anthropic serve rising customer demand for its enterprise AI tools.” [1]
1. Bobrowsky, Meghan; Blunt, Katherine; Whelan, Robbie. "Meta in Talks to Use Google Chips, a Test for Nvidia." Wall Street Journal, Eastern edition, New York, N.Y., 26 Nov 2025: A2.