Nvidia Asked SK Hynix To Advance Next-Gen AI Memory Production

SK Group chairman Chey Tae-won speaks at the SK AI Summit 2024 in Seoul in November 2024. Image credit: SK Group

SK Hynix says Nvidia chief executive Jensen Huang asked if production of next-gen HBM4 memory could be advanced, amidst explosive AI demand

Jensen Huang, the chief executive of artificial intelligence (AI) chipmaker Nvidia, asked SK Hynix to advance the delivery of next-generation high-bandwidth memory (HBM) chips by six months, the memory chip producer said, emphasising the explosive demand for the building blocks of AI infrastructure.

“The last time I met with Huang, he asked me if we could supply HBM4 six months earlier than the date we have agreed upon,” said SK Group chairman Chey Tae-won at an event in Seoul.

“I asked the SK Hynix CEO whether it’s possible, and he said he will try, so we are working to move the date up by six months.”

He joked that he is “a bit nervous” to meet Huang again because “we’re worried he might ask us to speed it up even further”.

Nvidia chief executive Jensen Huang. Image credit: Nvidia

Memory advances

Chey’s remarks at the SK AI Summit 2024 show how producers are racing to ramp up production of key AI infrastructure, including GPU accelerator chips and HBM memory.

Nvidia is the leading producer of GPUs for the AI industry, with more than 80 percent market share, while SK Hynix has become a key producer of HBM memory chips as Samsung has lagged in their production.

SK Hynix began manufacturing HBM3E, the current cutting edge of the technology, in September, and is aiming to produce 12-layer HBM4 chips next year, with 16-layer HBM4 chips to follow in 2026.

In a pre-recorded video clip, Huang said HBM had enabled “super Moore’s law” with AI accelerators.

Moore’s law is the observation that the number of transistors in an integrated circuit doubles about every two years, often used as a way of referring to the quickly increasing power of computer chips.

‘Super Moore’s law’

“When we moved from coding to machine learning, it changed the computer architecture fairly profoundly. And the work that we did with HBM memories has really made it possible for us to achieve what appears to be super Moore’s law,” Huang said.

“We wish that we got more bandwidth with lower energy. So the roadmap that SK Hynix is on is super aggressive and is super necessary.”

Chey also identified challenges faced by the AI industry, including the lack of “killer use cases” and revenue models to recoup heavy infrastructure investments, limited chip manufacturing capacity, and the necessity of constantly supplying AI systems with high-quality human-generated data.

He also spoke about the heavy energy requirements of AI data centres, saying SK Hynix has invested in gas turbines with carbon-capturing technology and small, modular nuclear reactors.