High-Bandwidth Memory

High-Bandwidth Memory (HBM) is a memory interface technology that exploits the large number of signals made available by die-stacking technologies to achieve very high peak bandwidth.
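
To put the width of such an interface in perspective, the short sketch below computes peak bandwidth as bus width times per-pin data rate. The specific numbers (a 1024-bit interface per stack at 1 Gb/s per pin, and a 64-bit DDR4-2400 channel for comparison) are commonly cited first-generation figures used here purely as illustrative assumptions; they are not taken from this article.

# Illustrative sketch: peak bandwidth of a wide, stacked DRAM interface.
# Assumed figures: first-generation HBM (1024-bit interface per stack,
# 1 Gb/s per signal pin) versus a conventional 64-bit DDR4-2400 channel.

def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width x per-pin rate, divided by 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

# A single HBM stack: 1024 data signals, each running at 1 Gb/s.
hbm_stack = peak_bandwidth_gbs(bus_width_bits=1024, pin_rate_gbps=1.0)   # 128.0 GB/s

# A 64-bit DDR4-2400 channel: 64 data signals, each at 2.4 Gb/s.
ddr4_channel = peak_bandwidth_gbs(bus_width_bits=64, pin_rate_gbps=2.4)  # 19.2 GB/s

print(f"HBM stack: {hbm_stack:.1f} GB/s, DDR4-2400 channel: {ddr4_channel:.1f} GB/s")

The comparison illustrates the design trade-off: rather than pushing per-pin signaling rates ever higher, the very wide interface enabled by die stacking multiplies bandwidth through sheer signal count.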

Motivation

See also: memory wall

The development of HBM arose from the need for considerably higher bandwidth and higher memory density. Drivers for the technology include high-performance applications such as high-end graphics, networking (e.g., 100G+ Ethernet, TB+ silicon photonics), and high-performance computing. Over the last few decades, however, memory bandwidth has increased at a much slower rate than computing power, widening an already large bottleneck. HBM was designed to introduce a step-function improvement in memory bandwidth.