“As the artificial-intelligence race intensifies, the speed with which the world's biggest chip companies roll out each successive generation of computing products is quickening.
On Monday at the Consumer Electronics Show in Las Vegas, Nvidia Chief Executive Jensen Huang unveiled the company's newest AI server systems, known as Vera Rubin, which go on sale in the second half of this year.
Usually, Nvidia details the specs and capabilities of its latest chips at its spring developer conference in Silicon Valley.
This year, Huang said, the complexity of computing required by AI and the immense demand for advanced processors to train and operate models have prompted the semiconductor industry to move faster.
"The amount of computing necessary for AI is skyrocketing," Huang told the Las Vegas audience, pacing the stage wearing a shiny black jacket in a crocodile-scale pattern. "The race is on for AI. Everyone is trying to get to the next frontier."
Multiple paradigm shifts in computing are behind the surge in demand, Huang said. Inference, or the process by which AI models respond to user prompts, "is now a thinking process," the CEO said, and new models need to be trained on increasingly immense amounts of data to teach AI tools how to think.
Nvidia has long argued that the next phase of AI is what the company refers to as the "omniverse" -- a type of model training that allows AI to use simulations of reality to learn how to navigate real-world situations. For example, autonomous vehicles guided by AI models can be refined more quickly with simulations than with thousands of hours of real-world training.
The company's new Vera Rubin servers -- named for a mid-20th-century American astronomer who made groundbreaking discoveries in observing how heavenly bodies move -- are designed to handle the enormous computing loads needed to create those simulations and use them in model training.
Nvidia tested the new system assuming that developers will soon be using up to 10 trillion parameters to train AI models. Using Rubin graphics processing units -- the chips that accelerate the millions of computations required in training -- developers can train such a model in a month using one-quarter the number of chips they would need with Nvidia's previous generation of GPUs, known as Blackwell.
For inference, Rubin delivers a 10-fold reduction in cost, compared with Blackwell, the company said.
Nvidia has also integrated a host of connectivity and memory-storage products into the new system to speed up computing capabilities, which Huang said had made Nvidia into the world's largest networking hardware company, in addition to the biggest maker of computing semiconductors.
"Our job is to create the entire stack so you can create the applications" that change the world, Huang told the audience.
Nvidia is also promoting a host of new programming libraries and other software products that make it easier to use its chips for advanced computing tasks, especially in robotics, autonomous vehicles and other examples of what Huang called "physical AI."
Daniel Newman, CEO of Futurum Group, an AI research firm, described Vera Rubin as an "incredible generational leap," based on the specifications Huang described Monday. He said that the unveiling of the chip this early in the year signals to the market that production of the Vera Rubin system is ramping up on schedule and that the servers will come to market quickly.
Lisa Su, chief executive of Advanced Micro Devices, a much smaller rival designer of AI chips, also emphasized physical AI in her keynote Monday in Las Vegas.
AMD announced a partnership with Italian robotics firm Generative Bionics that involves using AMD's chips to train a humanoid industrial robot, named GENE.01, through advanced AI simulations. The company also unveiled its latest AI chips, known as the Instinct MI440X, which launch later this year and are expected to be AMD's most legitimate competition to Nvidia yet.” [1]
1. Nvidia's New Chips Arrive Early --- Huang says: 'The race is on for AI. Everyone is trying to get to next frontier'. Whelan, Robbie. Wall Street Journal, Eastern edition; New York, N.Y.. 06 Jan 2026: B4.