Trends in the AI accelerator chip business and the High Bandwidth Memory (HBM) market

A recent TrendForce report suggests significant developments in the High Bandwidth Memory (HBM) market and the AI accelerator chip industry for 2023 and beyond.
Here are the key points to note:

HBM2e Dominance in 2023: HBM2e is the dominant product in the HBM market for 2023, employed in flagship accelerators such as NVIDIA's A100/A800, AMD's MI200, and various Cloud Service Providers' self-developed chips.

Introduction of HBM3e in 2024: As the demand for AI accelerator chips grows, manufacturers are planning to introduce new HBM3e products in 2024. HBM3 and HBM3e are expected to become mainstream in the market next year.

Distinctions Between HBM Generations: The main differentiator between HBM generations is speed. The industry faced naming confusion during the transition to HBM3, which led to HBM3 being subdivided into two categories by speed: HBM3 proper, running at 5.6 to 6.4 Gbps, and the faster 8 Gbps HBM3e, which goes by several names, including HBM3P, HBM3A, HBM3+, and HBM3 Gen2.
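To see why per-pin speed is the headline differentiator, here is a quick back-of-the-envelope sketch in Python that converts the quoted per-pin rates into peak per-stack bandwidth, assuming the standard 1024-bit HBM interface per stack (an assumption of this sketch, not a figure from the report):

```python
# Peak per-stack bandwidth from per-pin speed, assuming the standard
# 1024-bit interface per HBM stack.
BUS_WIDTH_BITS = 1024

def stack_bandwidth_gbs(pin_speed_gbps: float) -> float:
    """Peak bandwidth in GB/s: per-pin rate (Gbps) x bus width (bits) / 8."""
    return pin_speed_gbps * BUS_WIDTH_BITS / 8

for name, gbps in [("HBM3 @ 5.6 Gbps", 5.6),
                   ("HBM3 @ 6.4 Gbps", 6.4),
                   ("HBM3e @ 8 Gbps", 8.0)]:
    print(f"{name}: {stack_bandwidth_gbs(gbps):.1f} GB/s per stack")

# HBM3 @ 5.6 Gbps: 716.8 GB/s per stack
# HBM3 @ 6.4 Gbps: 819.2 GB/s per stack
# HBM3e @ 8 Gbps: 1024.0 GB/s per stack
```

Under that assumption, the jump from 6.4 to 8 Gbps is what pushes a single HBM3e stack to roughly 1 TB/s of peak bandwidth.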

Manufacturers and HBM3e Development: The state of HBM3e development varies among the major manufacturers. SK Hynix and Samsung started with HBM3 and are expected to sample HBM3e in Q1 2024, while Micron chose to skip HBM3 and move directly to HBM3e.

HBM3e Features: HBM3e will be built from 24Gb mono dies, giving a capacity of 24GB in an 8-high stack. NVIDIA’s GB100, set to launch in 2025, is anticipated to feature HBM3e.
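The 24GB figure follows directly from the stack arithmetic given above (24Gb per die, 8 dies per stack); a one-line check in Python:

```python
# Capacity of an 8-high stack of 24Gb mono dies.
DIE_DENSITY_GBIT = 24  # per-die density in gigabits
STACK_HEIGHT = 8       # dies per stack

capacity_gb = DIE_DENSITY_GBIT * STACK_HEIGHT / 8  # gigabits -> gigabytes
print(f"{STACK_HEIGHT}-high stack: {capacity_gb:.0f} GB")  # 24 GB
```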

Cost Considerations: While NVIDIA continues to dominate the AI server accelerator chip market, the high cost of its H100/H800 GPUs has driven up the total cost of ownership for AI servers. This has led Cloud Service Providers to plan the development of their own AI accelerator chips.

Tech Giants’ Initiatives: Tech giants Google and Amazon Web Services (AWS) have made significant progress in developing their own AI accelerator chips. Google has its Tensor Processing Unit (TPU), while AWS has its Trainium and Inferentia chips. These companies are also working on next-generation AI accelerators that may utilize HBM3 or HBM3e technology.

Competition in the Market: Other Cloud Service Providers in North America and China are also carrying out verification and testing of their own AI accelerator chips, pointing to a surge in competition in the AI accelerator chip market in the coming years.

Overall, the advancements in HBM technology and the development of AI accelerator chips indicate a highly competitive and dynamic market landscape in the near future. Manufacturers and CSPs are actively working on improving their offerings to meet the increasing demand for AI processing capabilities.

The artificial intelligence (AI) chip market is projected to grow from $16.86 billion in 2022 to around $227.48 billion by 2032, a CAGR of 29.72 percent from 2023 to 2032, according to a new report by Precedence Research.
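As a quick sanity check on that projection, compounding the 2022 base at the stated CAGR over the ten-year horizon reproduces the 2032 figure:

```python
# Verify the Precedence Research projection: $16.86B in 2022 grown at
# a 29.72% CAGR for ten years should land near $227.48B in 2032.
base_2022 = 16.86   # USD billions
cagr = 0.2972
years = 2032 - 2022

print(f"${base_2022 * (1 + cagr) ** years:.2f}B")  # ~$227.47B
```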
