SK Hynix, a leading memory chip manufacturer, announced that its high-bandwidth memory (HBM) chips, crucial components in AI chipsets, are already sold out for 2024 and nearly exhausted for 2025. The surge in demand comes as businesses aggressively expand their artificial intelligence services.
The company, the world’s second-largest memory chip maker and a supplier to Nvidia, revealed plans to roll out samples of its latest HBM chip, the 12-layer HBM3E, starting in May, with mass production slated for the third quarter.
Speaking at a news conference, SK Hynix’s Chief Executive Officer, Kwak Noh-Jung, highlighted the rapid growth trajectory of the HBM market, projecting annual demand growth of approximately 60 percent over the mid to long term.
Until March, SK Hynix was Nvidia’s sole HBM chip supplier, but industry analysts note a trend among major AI chip buyers to diversify their suppliers for greater operational flexibility, as Nvidia dominates around 80 percent of the AI chip market.
Notably, Micron, a U.S. competitor to SK Hynix, also reported its HBM chips as sold out for 2024, with the majority of its 2025 supply already allocated. Micron plans to distribute samples of its 12-layer HBM3E chips to customers in March.
Jeff Kim, head of research at KB Securities, attributed the rising demand for ultra-high-performance chips such as the 12-layer variety to accelerating upgrades in AI functions and performance, according to a Reuters report.
Samsung Electronics, another key player in the semiconductor industry, disclosed plans to commence production of its HBM3E 12-layer chips in the second quarter. The company anticipates a significant surge in HBM chip shipments this year, having concluded supply discussions with its clientele.
Last month, SK Hynix unveiled a substantial investment plan, including a $3.87 billion project to establish an advanced chip packaging plant in Indiana, USA, equipped with an HBM chip line. Additionally, a 5.3 trillion won ($3.9 billion) investment was announced for a new DRAM chip factory in South Korea, with a particular focus on HBM technology.
Kwak emphasized that the investment strategy in HBM chips deviates from conventional memory chip industry patterns by first securing demand before increasing capacity. Justin Kim, SK Hynix’s head of AI infrastructure, predicts that by 2028, AI-oriented chips, including HBM and high-capacity DRAM modules, will account for 61 percent of the total memory market by value, a substantial increase from about 5 percent in 2023.
In a recent post-earnings conference call, SK Hynix cautioned that regular memory chips for smartphones, personal computers, and network servers could fall into shortage by the end of the year if demand for tech devices exceeds expectations.