Definition: high bandwidth memory


A high-speed interface for memory chips adopted by JEDEC in 2013. Used with GPUs designed for AI training and other high-performance applications, high bandwidth memory (HBM) uses a 3D stacked architecture of DRAM (dynamic RAM) modules. In time, high bandwidth memory is expected to be employed in laptops because of its space savings compared to low-power DDR (see LPDDR SDRAM). Micron (U.S.), Samsung and SK Hynix (both South Korea) are major HBM manufacturers. See JEDEC.

A Much Wider Interface
The 4096-bit interface connecting HBM memory to the CPU or GPU is eight times wider than the 512 bits used for DDR and GDDR memory. See Hybrid Memory Cube, GDDR and DDR.
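
To see where the eightfold figure comes from, the short sketch below works through the arithmetic: peak transfer rate is simply bus width (in bytes) times the per-pin data rate. The 6.4 Gb/s per-pin rate is an illustrative assumption, not a figure from this entry; the point is that at any given pin speed, a 4096-bit interface moves eight times as much data per transfer as a 512-bit bus.

# Illustrative sketch: how interface width drives peak memory bandwidth.
# The 6.4 Gb/s per-pin rate is an assumed example value, not a quoted spec.

HBM_INTERFACE_BITS = 4096   # e.g., four HBM stacks x 1024 bits each (assumed layout)
GDDR_BUS_BITS = 512         # wide graphics bus used in the comparison above
PIN_RATE_GBPS = 6.4         # assumed per-pin transfer rate in Gb/s

def peak_bandwidth_gbs(width_bits: int, pin_rate_gbps: float) -> float:
    """Peak transfer rate in GB/s: (bus width in bits / 8 bits per byte) * Gb/s per pin."""
    return width_bits / 8 * pin_rate_gbps

print(HBM_INTERFACE_BITS // GDDR_BUS_BITS, "x wider interface")       # 8 x wider interface
print(peak_bandwidth_gbs(HBM_INTERFACE_BITS, PIN_RATE_GBPS), "GB/s")  # 3276.8 GB/s
print(peak_bandwidth_gbs(GDDR_BUS_BITS, PIN_RATE_GBPS), "GB/s")       # 409.6 GB/s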




A Micron 24GB HBM3e Cube
In 2024, Micron was first to introduce a high bandwidth memory cube that stacks eight modules for 24GB of HBM3e memory. (Image courtesy of Micron Technology, Inc.)


          Date of   Data Rate   Max Dies x
          Release    (GB/s)    Capacity (GB)   Volts

  HBM4     2026       1638       16x4=64        0.4

  HBM3e    2023       1229       16x3=48        0.4
  HBM3     2022        819       12x2=24        0.4

  HBM2e    2019        461       12x2=24        1.2
  HBM2     2016        307        8x1=8         1.2

  HBM1     2013        128        4x1=4         1.2
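
As a rough sanity check on the table, each generation's per-stack data rate and capacity can be derived from a handful of parameters. In the sketch below, the dies-per-stack and gigabytes-per-die figures are taken from the table's capacity column, while the interface widths and per-pin transfer rates are assumed example values not stated in this entry; with those assumptions, the computed rates round to the GB/s figures shown above.

# Rough reconstruction of the table's data-rate and capacity columns.
# Dies-per-stack and GB-per-die come from the table's capacity column;
# interface widths and per-pin rates are assumed example figures.

GENERATIONS = {
    # name:   (interface bits, per-pin Gb/s, dies per stack, GB per die)
    "HBM1":   (1024, 1.0,  4, 1),
    "HBM2":   (1024, 2.4,  8, 1),
    "HBM2e":  (1024, 3.6, 12, 2),
    "HBM3":   (1024, 6.4, 12, 2),
    "HBM3e":  (1024, 9.6, 16, 3),
    "HBM4":   (2048, 6.4, 16, 4),
}

for name, (width_bits, pin_gbps, dies, gb_per_die) in GENERATIONS.items():
    bandwidth = width_bits / 8 * pin_gbps   # GB/s per stack
    capacity = dies * gb_per_die            # GB per stack
    print(f"{name:6} {bandwidth:7.1f} GB/s   {dies} x {gb_per_die} GB = {capacity} GB")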





HBM in a Superchip
NVIDIA's Superchip uses HBM memory for its built-in GPU. See NVIDIA Grace Hopper Superchip.