Nvidia CEO Jensen Huang stated that the next generation of artificial intelligence will require 100 times more computational power than previous models. He attributes this to new reasoning approaches that focus on strategizing the best methods to answer questions step-by-step.
“The amount of computation necessary to do that reasoning process is 100 times more than what we used to do,” Huang told CNBC’s Jon Fortt in an interview following the chipmaker’s fiscal fourth-quarter earnings report.
Huang specifically cited models such as DeepSeek’s R1, OpenAI’s GPT-4, and xAI’s Grok 3 as examples of this reasoning approach.
Nvidia’s recent financial results exceeded analysts’ expectations, with revenue soaring 78% year-over-year to $39.33 billion. Revenue from the data center segment, which includes Nvidia’s market-leading graphics processing units (GPUs) used for AI workloads, experienced a 93% increase to $35.6 billion. This segment now contributes over 90% of the company’s total revenue.
Despite the strong financial results, Nvidia’s stock hasn’t fully recovered from a 17% drop on January 27th, its largest single-day decline since 2020. The sell-off was triggered by Chinese AI lab DeepSeek, whose models raised concerns that leading AI performance could be achieved at a fraction of the expected infrastructure cost.
In the Wednesday interview, Huang rejected those concerns, arguing that the popularity of reasoning models like DeepSeek’s would actually drive demand for more chips. “DeepSeek was fantastic,” Huang said. “It was fantastic because it open sourced a reasoning model that’s absolutely world class.”
Export controls tightened under the Biden administration restrict Nvidia’s business in China. Huang noted that China’s share of the company’s revenue has fallen by about half as a result, and he added that competitive pressure within the country is mounting, including from Huawei.
Huang believes that developers will likely find methods of circumventing the export controls through software modifications, regardless of whether they are developing for supercomputers, personal computers, phones, or game consoles. “Ultimately, software finds a way,” he said. “You ultimately make that software work on whatever system that you’re targeting, and you create great software.”
According to Huang, Nvidia’s GB200, which is sold in the United States, can generate AI content 60 times faster than the versions of the company’s chips sold to China under export controls.
Nvidia relies on billions of dollars in annual infrastructure spending from the world’s largest tech companies to generate a substantial portion of its revenue. The company has been the primary beneficiary of the AI boom: its revenue more than doubled year-over-year for five consecutive quarters through mid-2024 before growth slowed slightly.