
HBM4 memory will not appear until 2026 and will have up to 16 layers

Until now, the South Korean company SK hynix was the only supplier of HBM3 chips for NVIDIA's needs, but with HBM3e its competitor Micron Technology began supplying NVIDIA with samples of its products at the end of July, sharply intensifying the fight for market share in this memory segment. HBM4 memory will be ready for the market by 2026, and the number of layers will grow from 12 to 16 a year later.


Image source: SK hynix

The near-term step in the development of HBM3e memory, as TrendForce explains, is the release of 8-layer chips, which must pass certification by NVIDIA and other customers by the first quarter of next year and then move to mass production. Micron Technology is slightly ahead of SK hynix here, having provided its samples for testing a few weeks earlier, while Samsung only managed to do so in early October. HBM3e memory is capable of data transfer speeds of 8 to 9.2 Gbit/s per pin; the eight-layer chips with a capacity of 24 GB are manufactured on 1-alpha (Samsung) or 1-beta (SK hynix and Micron) process technology. All three companies plan to have mass production running by the middle of next year, and the latter two expect to reach it by the beginning of the second quarter.
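As a rough illustration of what those per-pin speeds mean at the stack level, the sketch below multiplies the 8 to 9.2 Gbit/s figures quoted above by a 1024-bit interface per stack, the width used by earlier HBM generations; that width is an assumption for this estimate, not a figure from the article.

```python
# Rough per-stack bandwidth estimate for HBM3e, combining the 8-9.2 Gbit/s
# per-pin speeds quoted above with a 1024-bit interface per stack -- the
# width used by earlier HBM generations, assumed here rather than stated
# in the article.

BUS_WIDTH_BITS = 1024  # assumed interface width of one HBM stack

def stack_bandwidth_gb_s(pin_speed_gbit_s: float) -> float:
    """Peak bandwidth of a single stack in GB/s for a given per-pin rate."""
    return pin_speed_gbit_s * BUS_WIDTH_BITS / 8  # convert bits to bytes

for speed in (8.0, 9.2):
    print(f"{speed} Gbit/s per pin -> ~{stack_bandwidth_gb_s(speed):.0f} GB/s per stack")
```

Under that assumption, a single stack lands at roughly 1.0 to 1.2 TB/s of peak bandwidth.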

In many ways, this schedule will set the pace for the release of new NVIDIA compute accelerators. Next year the company will begin shipping H200 accelerators with six HBM3e stacks, and by the end of the same year B100 accelerators with eight HBM3e stacks will be released. In parallel, hybrid solutions pairing the accelerators with Arm-compatible central processors, the GH200 and GB200, will be released.
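Multiplying those stack counts by the 24 GB eight-layer stacks mentioned earlier gives a back-of-the-envelope memory ceiling per accelerator; this is purely illustrative arithmetic based on the article's figures, and shipping products may expose less than the theoretical total.

```python
# Back-of-the-envelope memory totals per accelerator, multiplying the stack
# counts from the article by the 24 GB eight-layer HBM3e stack capacity
# quoted earlier. Shipping products may enable less than this theoretical total.

STACK_CAPACITY_GB = 24                  # eight-layer HBM3e stack, per the article
accelerators = {"H200": 6, "B100": 8}   # HBM3e stacks per accelerator

for name, stacks in accelerators.items():
    total = stacks * STACK_CAPACITY_GB
    print(f"{name}: {stacks} x {STACK_CAPACITY_GB} GB = {total} GB (theoretical ceiling)")
```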


Image source: TrendForce

According to TrendForce, rival AMD will focus on HBM3 memory for the Instinct MI300 accelerator family in 2024, saving the transition to HBM3e for the later Instinct MI350. Memory compatibility testing in this case will begin in the second half of 2024, and actual deliveries of HBM3e chips to AMD will begin no earlier than the first quarter of 2025.

Intel's Habana Gaudi 2 accelerators, launched in the second half of last year, are limited to six HBM2e stacks; their Gaudi 3 successors will increase the number of stacks to eight by the middle of next year but will stay with HBM2e chips.

HBM4 memory will not be introduced until 2026; it will use a 12 nm base die produced by contract chipmakers. The number of layers in a memory stack will vary between 12 and 16, and the 16-layer chips will not reach the market until 2027 at the earliest. However, Samsung Electronics has expressed its intention to introduce HBM4 as early as 2025, thereby making up for the time lost with previous generations of memory chips in this class.

In the coming years there will also be a trend toward more customized HBM-based designs. In particular, some developers are considering integrating such memory chips directly onto dies with compute cores. Such intentions have already been attributed to NVIDIA, particularly for HBM4-class chips.


About the author

Dylan Harris

Dylan Harris is fascinated by tests and reviews of computer hardware.
