"Hopper Price Is Actually Rising"
Agentic AI Development Accelerating
HBM and Memory Supply "Progressing Well"
Jensen Huang, CEO of Nvidia, dismissed industry concerns about an "AI bubble," stating, "The throughput of AI chips is increasing tenfold."
Huang made the remarks on the 6th (local time) during a press Q&A session at the Fontainebleau Las Vegas, the venue for CES 2026. He explained that Nvidia's revenue is growing even faster than the previously announced 500 billion dollars in AI chip sales. He added, "The price of Hopper is actually rising," and emphasized, "As a result, this year will be an exceptionally good year."
He highlighted that Nvidia's chip architectures have advanced continuously from Hopper to Blackwell to Rubin. "The size of models is increasing tenfold every year, and the number of tokens generated by inference models is growing fivefold," Huang explained. "Throughput has improved by about ten times from Hopper to Blackwell, and again from Blackwell to Rubin."
He further stated that the pace of advancement will accelerate with the development of agentic artificial intelligence, systems that carry out tasks autonomously. Underscoring the massive demand for computing power, Huang noted that 150 billion dollars were invested in AI companies last year alone.
Jensen Huang, CEO of Nvidia, speaks during a press Q&A session on the 6th (local time) at the Fontainebleau Las Vegas, the venue for CES 2026 in Las Vegas, USA. Photo by Park Junyi.
Addressing concerns that the recent surge in demand for AI data centers could lead to a shortage of memory chips and drive up GPU prices, Huang pointed to the sixth-generation high-bandwidth memory, HBM4, to dispel such worries. SK Hynix became the first in the world to mass-produce HBM4 in September last year, and Samsung Electronics is expected to begin full-scale mass production early this year.
Huang stated, "We are one of the largest direct purchasers of memory in the world. We work with all memory suppliers, and some of that involves HBM. We are the first consumers of HBM4." He added, "Everything has been developed for HBM, and fortunately, we are the only users. For a while, we will enjoy the advantage of being the primary and sole consumers of HBM4, and our demand is very high."
Huang, who had announced during his special keynote the previous day that the next-generation AI chip "Vera Rubin" would ship in the second half of the year, also expressed confidence in memory chip suppliers. "All factories and suppliers are ready and performing excellently. We are progressing very well," he emphasized.
Huang reiterated that Nvidia's partners contribute to the supply of other memory types as well as HBM; the company sources HBM, GDDR, and LPDDR from Samsung Electronics, SK Hynix, and Micron.
Meanwhile, at CES 2026, Huang took the stage as the recipient at the 2026 IEEE Medal of Honor award ceremony held at the Fontainebleau Las Vegas. The IEEE Medal of Honor, presented since 1917, is the institute's most prestigious award, given to individuals who have made outstanding contributions to humanity in engineering and science. After receiving it, Huang credited his colleagues, saying, "This award belongs not to me personally, but to the colleagues who have shared the lifelong work that is Nvidia."