The high-bandwidth memory (HBM) market is often considered the backbone of artificial intelligence computing, with revenue expected to double by 2025 as the industry moves toward next-generation chips. Demand for HBM has grown sharply in recent years, especially with the advent of the AI boom, drawing companies such as Samsung and SK hynix into the field.
Judging from the current landscape, Samsung, SK hynix and Micron are the three giants of the HBM market, and they will likely dominate it for some time to come. This position rests on their continued development work, particularly on next-generation standards such as HBM4, which has drawn considerable attention from the industry.
In view of this, market research firm Gartner has released a report projecting that the HBM market will reach US$4.976 billion by 2025, almost quadrupling from 2023. The estimate is based on current and expected industry demand and holds few surprises: the largest share of HBM sales goes to artificial intelligence GPUs. As reported many times in the past, the sudden rise in AI GPU demand has created an HBM shortage, since HBM is the main memory component of AI accelerators.
The future of HBM also looks bright: according to previous reports, the industry is moving toward newer standards, and HBM3e and HBM4 are expected to be widely adopted by manufacturers.
NVIDIA has a lot planned for customers in 2024. The company has announced the H200 Hopper GPU, which is expected to see mass adoption next year, and will later release the B100 "Blackwell" AI GPU; both will be based on HBM3e memory. A similar situation is emerging in the AMD camp, whose next-generation Instinct GPUs will use the newer HBM type for the first time.