
Wednesday, October 11, 2023

A.I. Could Soon Need As Much Electricity As an Entire Country


"Behind the scenes, the technology relies on thousands of specialized computer chips.

OpenAI's ChatGPT exploded onto the scene nearly a year ago, reaching an estimated 100 million users in two months and setting off an A.I. boom. Behind the scenes, the technology relies on thousands of specialized computer chips. And in the coming years, they could consume immense amounts of electricity.

A peer-reviewed analysis published Tuesday lays out some early estimates. In a middle-ground scenario, by 2027 A.I. servers could use between 85 and 134 terawatt-hours (TWh) annually. That's similar to what Argentina, the Netherlands and Sweden each use in a year, and is about 0.5 percent of the world's current electricity use.

"We don't have to completely blow this out of proportion," said Alex de Vries, the data scientist who did the analysis. "But at the same time, the numbers that I write down -- they are not small."

The electricity needed to run A.I. could boost the world's carbon emissions, depending on whether the data centers get their power from fossil fuels or renewable resources.

In 2022, data centers that power all computers, including Amazon's cloud and Google's search engine, used about 1 to 1.3 percent of the world's electricity. That excludes cryptocurrency mining, which used another 0.4 percent, though some of those resources are now being redeployed to run A.I.

De Vries is a Ph.D. student at Vrije Universiteit Amsterdam and founded the research company Digiconomist, which publishes the Bitcoin Energy Consumption Index.

It's impossible to quantify A.I.'s energy use exactly, because companies like OpenAI disclose very few details, including how many specialized chips they need to run their software. So de Vries came up with a way to estimate electricity consumption using projected sales of Nvidia A100 servers -- the hardware estimated to be used by 95 percent of the A.I. market.

"Each of these Nvidia servers, they are power-hungry beasts," de Vries said.

He started with a recent projection that Nvidia could ship 1.5 million of these servers by 2027, and multiplied that number by its servers' electricity use: 6.5 kilowatts for Nvidia's DGX A100 servers, for example, and 10.2 kilowatts for its DGX H100 servers.

He noted several caveats. Customers might use the servers at less than 100 percent capacity, which would lower electricity consumption. But server cooling and other infrastructure would push the total higher.
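The arithmetic behind the estimate can be reproduced in a few lines. This is a minimal sketch using only the figures quoted above (1.5 million projected servers, 6.5 kW and 10.2 kW per server), with the full-load, round-the-clock assumption the analysis itself flags as a caveat:

```python
# Back-of-envelope reproduction of the estimate: projected Nvidia server
# shipments by 2027 multiplied by per-server power draw, assuming the
# servers run continuously at 100 percent capacity.
SERVERS = 1_500_000       # projected cumulative Nvidia shipments by 2027
HOURS_PER_YEAR = 8760     # 24 hours x 365 days

def annual_twh(power_kw: float, servers: int = SERVERS) -> float:
    """Annual electricity use in terawatt-hours for a fleet of servers."""
    kwh = power_kw * servers * HOURS_PER_YEAR
    return kwh / 1e9      # 1 TWh = 1e9 kWh

low = annual_twh(6.5)     # DGX A100 servers at 6.5 kilowatts each
high = annual_twh(10.2)   # DGX H100 servers at 10.2 kilowatts each
print(f"{low:.0f}-{high:.0f} TWh per year")  # prints "85-134 TWh per year"
```

The result matches the article's 85 to 134 TWh range; lower utilization would pull the figure down, while cooling and other data-center overhead would push it up.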

Nvidia's lock on A.I.

As The Times has reported, Nvidia has built a commanding lead in A.I. hardware that is likely to persist for several years, though rivals are scrambling to catch up. The limited supply of Nvidia chips is a bottleneck for A.I. growth, leaving companies large and small racing to secure their own supply of chips.

"There are a lot of dramatic statements about the rapid growth of A.I. and so on, but really it's about how quickly you can get those chips out there," said Benjamin Lee, a professor of electrical engineering and computer science at the University of Pennsylvania.

Nvidia said in an emailed statement that the company's specialized chips are better than other options, given that it would take many more conventional chips to accomplish the same tasks.

"Accelerated computing on NVIDIA technology is the most energy efficient computing model for A.I. and other data center workloads," the company said.

Some experts are urging companies to consider electricity consumption as they design the next generation of A.I. hardware and software. But that's a tough sell when companies are racing to improve their A.I. models as quickly as possible.

"Maybe we need to ideally slow a bit down to start applying solutions that we have," said Roberto Verdecchia, an assistant professor in the University of Florence's Software Technologies Lab. "Let's not make a new model to improve only its accuracy and speed. But also, let's take a big breath and look at how much are we burning in terms of environmental resources."" [1]

1. Erdenesanaa, Delger. "A.I. Could Soon Need As Much Electricity As an Entire Country." [Foreign Desk]. New York Times, Late Edition (East Coast), New York, N.Y., 11 Oct 2023: A8.

