A Crash Course on the Meteoric Rise of High Bandwidth Memory in AI

The demand for efficient and high-performance computing solutions continues to grow in the rapidly evolving landscape of artificial intelligence (AI). One technology that has emerged as a game-changer in this area is high bandwidth memory (HBM). Memory enables AI to learn from past experiences, make informed decisions, and adapt to new situations, making it essential to create better AI models such as generative AI.

Memory helps AI better understand the data it processes, allowing it to recognize patterns faster and interact more naturally with users through human speech. HBM offers unmatched bandwidth and energy efficiency compared to traditional memory architectures. As an emerging technology, HBM’s rise in AI applications is reshaping both AI-driven tasks and the memory market.

Understanding High Bandwidth Memory’s Benefits in Artificial Intelligence

As defined by Semiconductor Engineering, high bandwidth memory or HBM “is standardized stacked memory technology that provides very wide channels for data, both within the stack and between the memory and logic.”

“An HBM stack can contain up to eight DRAM modules connected by two channels per module. Current implementations include up to four chips, roughly the equivalent of 40 DDR cores in a fraction of the space.”

Standardized in 2013 by JEDEC (the Joint Electron Device Engineering Council Solid State Technology Association), HBM is a relatively new type of memory chip that offers low power consumption and ultra-wide communication lanes. Its vertically stacked memory dies are interconnected by microscopic vertical connections called “through-silicon vias” (TSVs). This architecture gives HBM lower power consumption than other memory technologies such as DDR4 and GDDR6.

HBM’s stacked architecture allows it to achieve high-speed data transfers because signals travel much shorter distances between dies, making it an ideal component for memory-intensive applications like AI. As of January 2016, major memory manufacturers Samsung Electronics and SK Hynix have been commercially producing HBM chips, with others expected to follow.

The latest generation, HBM3, announced in 2022, offers further reduced power consumption, a smaller form factor, and higher performance. HBM3 offers “2.5D packaging with a wider interface at a lower clock speed (compared to GDDR6) to deliver higher overall throughput at a higher bandwidth-per-watt efficiency for AI/ML and high-performance computing (HPC) applications.”
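The “wider interface at a lower clock speed” trade-off can be made concrete with a back-of-the-envelope calculation: peak bandwidth is roughly interface width times per-pin data rate. The sketch below uses representative published figures for HBM3 (1024-bit interface, 6.4 Gb/s per pin) and GDDR6 (32-bit interface, 16 Gb/s per pin); these numbers are illustrative assumptions, not figures from this article.

```python
# Back-of-the-envelope peak bandwidth: interface width x per-pin data rate.
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for one memory stack or chip."""
    return bus_width_bits * pin_rate_gbps / 8  # convert bits to bytes

# Representative figures (assumed for illustration):
# HBM3 stack: 1024-bit interface at 6.4 Gb/s per pin
# GDDR6 chip: 32-bit interface at 16 Gb/s per pin
hbm3 = peak_bandwidth_gbs(1024, 6.4)   # 819.2 GB/s per stack
gddr6 = peak_bandwidth_gbs(32, 16.0)   # 64.0 GB/s per chip

print(f"HBM3 stack: {hbm3:.1f} GB/s")
print(f"GDDR6 chip: {gddr6:.1f} GB/s")
```

Even though each HBM pin runs slower than a GDDR6 pin, the far wider interface yields an order of magnitude more bandwidth per device, which is why a handful of stacks beside a GPU can feed AI workloads that dozens of discrete chips could not.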

According to TrendForce, “2024 will mark the transition from HBM3 to HBM3e, and SK Hynix is leading the pack with HBM3e validation in the first quarter of this year.” SK Hynix remains ahead of its competitors, Samsung and Micron, on this front.  

Today, data transfer is a critical bottleneck within AI applications, especially those that utilize deep learning. AI models use vast amounts of data that must be processed in parallel to meet growing computational demands. These requirements necessitate high bandwidth that traditional memory architectures often struggle to deliver. Slow data movement frequently contributes to performance limitations and increased power consumption, problems that HBM helps solve.

HBM’s capabilities allow AI accelerators to access data more efficiently, enabling faster model training and inference. This leads to performance improvements and reduced deployment time for AI applications, making it a popular component choice for AI developers. Specifically, HBM can be immensely beneficial within deep learning training, inference acceleration, and high-performance computing (HPC). These AI applications require extensive memory bandwidth and rapid data retrieval, which HBM provides.

HBM’s popularity is quickly reshaping the memory market, and not always for the better.

The Future of High Bandwidth Memory and its Challenges

The demand for high-performance memory solutions like HBM will only grow as AI advances. Future iterations of HBM are expected to push the boundaries further with increased stack heights, improved energy efficiency, and enhanced reliability.

However, the widespread adoption of HBM in AI applications is not without challenges. The HBM market is forecast to grow at a compound annual growth rate (CAGR) of 31.3% between 2023 and 2031. This rapid growth is primarily driven by the rising demand for AI and machine learning, the expansion of data centers, enhanced graphics performance, and 5G technology rollouts.

With these factors, HBM’s share of the total DRAM capacity is estimated to rise from 2% in 2023 to 3% in 2024 and a considerable 10% by 2025. This will allow HBM to account for more than 20% of the total DRAM market value in 2024, exceeding 30% in 2025. HBM's price continues to rise as its popularity grows. Price negotiations for 2025 have already begun, with suppliers increasing costs by 5%-10% every quarter.
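It is worth noting how quickly quarterly price increases compound over a year. The short sketch below (an illustration of the article’s 5%-10% per-quarter figure, not data from any supplier) shows the implied annual increase:

```python
# Illustrative compounding of the article's 5%-10% quarterly price hikes.
def annual_increase(quarterly_pct: float, quarters: int = 4) -> float:
    """Compounded percentage increase after the given number of quarters."""
    return ((1 + quarterly_pct / 100) ** quarters - 1) * 100

low = annual_increase(5)    # about 21.6% over four quarters
high = annual_increase(10)  # about 46.4% over four quarters

print(f"5% per quarter  -> {low:.1f}% per year")
print(f"10% per quarter -> {high:.1f}% per year")
```

In other words, sustained quarterly increases at the reported rates would raise HBM prices by roughly 22%-46% in a single year, a substantial cost consideration for procurement teams budgeting ahead.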

Furthermore, “the annual growth rate of HBM demand will approach 200% in 2024 and is expected to double in 2025,” says TrendForce.

For organizations, growing cost considerations, compatibility issues, and the need for specialized hardware will act as barriers to entering the AI market. Additionally, optimizing software to leverage HBM's capabilities remains a complex task requiring collaboration between hardware manufacturers and software developers. For those who can afford HBM’s costs, availability will be the next big problem.

Shortage-like conditions are already affecting the memory market, with major suppliers such as Western Digital Corporation (WDC) issuing notices to customers about upcoming price fluctuations. Industry sources have revealed that DRAM products may face shortages through the second half of 2024 following the recent rise in memory contract prices. DRAM suppliers are increasing wafer input for advanced processes, which will account for 40% of total DRAM wafer input by the end of the year.

Companies must monitor the market’s shifting landscape to ensure HBM availability. Market shifts can turn on a dime, so assessing technology trends within the electronic components industry can help keep procurement teams aware of upcoming bottlenecks, especially for components produced by only a handful of manufacturers.

However, keeping an eye on the market is a time-consuming task that most procurement teams can’t afford—unless they partner with an electronic components distributor and supply chain expert.  

Sourceability’s Experts Can Help You Get Stock for All AI-Capable Chips

As the demand for AI continues to soar, HBM is poised to play a crucial role in driving innovation and pushing the boundaries of what's possible in artificial intelligence. To ensure your organization has access to these solutions, it's vital to collaborate with a global electronic components distributor that offers passionate expertise and streamlined processes.

Sourceability is one such company. On its e-commerce site, Sourcengine, Sourceability offers professional buyers over 1.6 billion part offers from various franchised, authorized, and qualified third parties. Buyers can quickly and efficiently purchase their needed parts, such as SK Hynix’s latest HBM products, through Sourcengine’s marketplace. Procurement teams that need more help sourcing hard-to-find components will be matched with Sourceability’s global experts to get the part offer they need.  

With Sourceability’s market intelligence tool, Datalynq, buyers can gain comprehensive visibility into the electronic components supply chain to stay aware of market shifts and risks. AI will dominate the tech sector in the coming years, and as the demand for memory grows, so will orders for HBM. Sourceability will help ensure your organization is always stocked.  
