
Tuesday, October 7, 2025

OpenAI Reaches Huge Pact With AMD For Chips --- Multibillion-dollar partnership on AI data centers will challenge Nvidia

 

“OpenAI and chip-designer Advanced Micro Devices announced a multibillion-dollar partnership to collaborate on AI data centers that will run on AMD processors, one of the most direct challenges yet to industry leader Nvidia.

 

Under terms of the deal, OpenAI committed to purchasing 6 gigawatts worth of AMD's chips, starting with the MI450 chip next year. The ChatGPT maker will buy the chips either directly or through its cloud computing partners.

 

AMD chief Lisa Su said in an interview Sunday that the deal would result in tens of billions of dollars in new revenue for the chip company over the next half-decade.

 

The companies didn't disclose the plan's expected overall cost, but AMD said it costs tens of billions of dollars per gigawatt of computing capacity.
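
Taken together with the 6-gigawatt commitment above, that per-gigawatt figure implies an overall scale in the hundreds of billions of dollars. Below is a minimal back-of-envelope sketch in Python; the per-gigawatt cost range is purely illustrative, since the article says only "tens of billions" and gives no exact number.

# Rough scale check; the companies did not disclose an overall cost.
# The article gives a 6-gigawatt chip commitment and AMD's statement that
# a gigawatt of computing capacity costs "tens of billions of dollars".
committed_gigawatts = 6                  # from the article
illustrative_cost_per_gw = (20e9, 50e9)  # assumed range standing in for "tens of billions"

low = committed_gigawatts * illustrative_cost_per_gw[0]
high = committed_gigawatts * illustrative_cost_per_gw[1]
print(f"Implied build-out cost: roughly ${low/1e9:.0f}B to ${high/1e9:.0f}B")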

 

As part of the deal, OpenAI also will receive warrants for up to 160 million AMD shares, about 10% of the chip company, at 1 cent per share, awarded in phases, if OpenAI hits certain milestones for deployment. AMD's stock price also has to increase for the warrants to be exercised.

 

Shares of AMD soared Monday, rising 24% to close at $203.71 a share.
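
For a sense of scale, here is a rough sketch of what the warrant package could be worth, using only the figures quoted above (up to 160 million shares, a 1-cent exercise price, Monday's $203.71 close); the actual value depends on which tranches vest and on AMD's share price at the time.

# Back-of-envelope value of the warrant package, using figures quoted in the article.
warrant_shares = 160_000_000   # up to 160 million shares
exercise_price = 0.01          # 1 cent per share
monday_close = 203.71          # AMD's Monday closing price

exercise_cost = warrant_shares * exercise_price     # ~$1.6 million to exercise in full
value_at_close = warrant_shares * monday_close      # ~$32.6 billion at Monday's close
implied_shares_outstanding = warrant_shares / 0.10  # "about 10%" implies ~1.6 billion shares

print(f"Cost to exercise in full: ${exercise_cost/1e6:.1f} million")
print(f"Value at Monday's close: ${value_at_close/1e9:.1f} billion")
print(f"Implied AMD shares outstanding: ~{implied_shares_outstanding/1e9:.1f} billion")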

 

The deal is AMD's biggest win in its quest to disrupt Nvidia's dominance among artificial-intelligence semiconductor companies. AMD's processors are widely used in gaming, personal computers and traditional data-center servers, but the company hasn't made as much of a dent in the fast-growing market for the pricier supercomputing chips needed by advanced AI systems.

 

OpenAI plans to use the AMD chips for inference functions, or the computations that allow AI applications such as chatbots to respond to user queries. As the profusion of large language models and other tools has picked up, demand for inference computing has leapt, OpenAI Chief Executive Officer Sam Altman said in a joint interview with Su.

 

"It's hard to overstate how difficult it's become" to get enough computing power, Altman said. "We want it superfast, but it takes some time."

 

The two CEOs said the deal would tie their companies together and give them incentives to commit to the AI infrastructure boom. "It's a win for both of our companies, and I'm glad that OpenAI's incentives are tied to AMD's success and vice versa," Su said.

 

Nvidia remains the preferred chip supplier among AI companies, but it is also facing competition from almost every corner of the market. Cloud giants such as Google and Amazon design and sell their own AI chips, and OpenAI recently signed a $10 billion deal with Broadcom to build its own in-house chip. Nvidia is releasing its highly anticipated Vera Rubin chip next year, promising it will be more than twice as powerful as its current generation, known as Grace Blackwell.

 

OpenAI will begin using 1 gigawatt worth of the MI450 chip starting in the second half of next year to run its AI models.

 

Altman said the fate of many companies would increasingly be linked as demand for AI services, along with the computing and infrastructure needs that accompany them, is set to far outstrip supply.

 

"We are in a phase of the build-out where the entire industry's got to come together and everybody's going to do super well," Altman said. "You'll see this on chips. You'll see this on data centers. You'll see this lower down the supply chain."

 

Altman has been on a dealmaking spree in the past month, at times using creative financing structures to secure hundreds of billions of dollars worth of computing power. His aim is to lock up enough data-center capacity to win the race to develop superintelligence, or AI systems that rival humans for reasoning and intuition.

 

In late September, Nvidia announced that it would invest $100 billion in OpenAI during the next decade. Under the terms of the circular arrangement, OpenAI plans to use the cash from Nvidia to buy Nvidia's chips and deploy up to 10 gigawatts of computing power in AI data centers. The deal highlighted how the market's seemingly endless enthusiasm for Nvidia's stock is providing a financial backstop for the entire AI market.

 

The Nvidia deal isn't completed yet. The two companies have signed a letter of intent and have yet to disclose specific terms in a regulatory filing.

 

OpenAI and AMD described the announcement as "definitive," and AMD filed details of the partnership with securities regulators on Monday.

 

Su told investors in a call Monday that the deal was "a clear validation of our technology road map" that would give AMD tens of billions of dollars in revenue by 2027.

 

When asked about the deal's unique warrant structure, Su called it "pretty innovative" and added, "I wouldn't say it came lightly."

 

Altman also recently signed a $300 billion megadeal with Oracle, the software company co-founded by Larry Ellison, to purchase another 4.5 gigawatts of cloud-computing power over five years.

 

"The thing you have to believe if you are us, or our whole industry, is that given everything we're seeing in our research and in our product metrics, is that the demand for AI at a reasonable revenue rate is going to continue to steeply increase," Altman said.

 

The dealmaking frenzy, which has drawn much of the technology industry into the maelstrom, has contributed to growing fears that a bubble is building in AI infrastructure. Companies such as OpenAI, Meta, Alphabet and Microsoft are spending money on chips, data centers and electrical power at levels that dwarf the largest build-outs in history, including the 19th century railroad boom and the construction of the modern electrical and fiber-optic grids.

 

"I'm far more worried about us failing because of too little compute than too much," said Greg Brockman, OpenAI's president and co-founder.

 

In late September, OpenAI and Oracle executives gathered in Abilene, Texas, to lay out their vision to spend trillions of dollars on AI data centers that they said would help satisfy the explosive demand for ChatGPT, which has 700 million weekly users.

 

It is unclear how OpenAI will pay for the hundreds of billions of dollars of infrastructure investments to which it has committed. The startup recently told investors that it was likely to spend around $16 billion to rent computing servers alone this year, and that the number could rise to $400 billion in 2029, The Wall Street Journal reported. OpenAI is on pace to generate $13 billion in revenue this year, and Altman said the company is focusing more on profitable tasks that can be accomplished using its tools.” [1]

 

1. OpenAI Reaches Huge Pact With AMD For Chips --- Multibillion-dollar partnership on AI data centers will challenge Nvidia. Whelan, Robbie; Jin, Berber. Wall Street Journal, Eastern edition; New York, N.Y., 07 Oct 2025: A1.
