A high-speed interface for memory chips adopted by JEDEC in 2013. Used in AI and other high-performance applications, high bandwidth memory (HBM) uses a 3D stacked architecture of DRAM (dynamic RAM) modules. In time, HBM is expected to be employed in laptops because of its space savings compared to low-power DDR. See LPDDR SDRAM and JEDEC.
The stack of DRAM modules is typically connected to the CPU or GPU via a silicon interposer, a substrate that routes the thousands of connections between the memory stack and the processor.
A Much Wider Interface
The 4096-bit interface connecting HBM memory to the CPU or GPU is eight times wider than the 512 bits used for DDR or GDDR memory. See Hybrid Memory Cube, GDDR and DDR.
A Micron 24GB HBM3e Cube
In 2024, Micron was the first to introduce a high bandwidth memory cube that stacks eight modules for 24GB of HBM3e memory. (Image courtesy of Micron Technology, Inc.)
                     Bandwidth   Max Capacity
          Channels     (GB/s)        (GB)       Volts
 HBM          8          128          16         1.2
 HBM2        16          256          16         1.2
 HBM2e       16          460          36         1.2
 HBM3        32          819          64         0.4
 HBM3e       32        1,200          64         0.4
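The bandwidth figures in the table follow from a simple formula: interface width in bits times the per-pin data rate, divided by eight to convert to bytes. A minimal sketch, assuming a 1024-bit-wide stack and approximate published per-pin rates for each generation (the rates are an assumption, not taken from this entry):

```python
# Peak per-stack bandwidth = width (bits) x per-pin rate (Gb/s) / 8 bits per byte
def peak_bandwidth_gbs(width_bits: int, pin_rate_gbps: float) -> float:
    """Return peak bandwidth of one HBM stack in GB/s."""
    return width_bits * pin_rate_gbps / 8

# Per-pin data rates are approximate public figures (assumption).
generations = {
    "HBM":   1.0,   # ~1.0 Gb/s per pin -> 128 GB/s
    "HBM2":  2.0,   # -> 256 GB/s
    "HBM2e": 3.6,   # -> ~460 GB/s
    "HBM3":  6.4,   # -> ~819 GB/s
    "HBM3e": 9.6,   # -> ~1,200 GB/s
}

for name, rate in generations.items():
    print(f"{name:6s} {peak_bandwidth_gbs(1024, rate):7.1f} GB/s")
```

The computed values land on or near the table's figures (e.g., 1024 bits at 6.4 Gb/s gives 819.2 GB/s for HBM3), which is why widening the interface matters more than raw clock speed for HBM.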
HBM in a Superchip