
Samsung Electronics' fifth-generation high bandwidth memory (HBM) chips have passed Nvidia's tests for use in its artificial intelligence (AI) processors. Samsung has been struggling for some time to catch up with local rival SK Hynix in the race to supply the advanced memory chips capable of handling generative AI workloads.
According to sources, Samsung and Nvidia have yet to sign a supply deal for the approved eight-layer HBM3E chips, but supplies are expected to start by the fourth quarter of 2024.
The South Korean technology giant's 12-layer version of the HBM3E chip, however, has yet to pass Nvidia's tests.
HBM is a type of dynamic random access memory (DRAM) standard, first produced in 2013, in which chips are vertically stacked to save space and reduce power consumption. A key component of graphics processing units (GPUs) used for AI, it helps process the massive amounts of data produced by complex applications.
Samsung has been seeking to pass Nvidia's tests for HBM3E and the preceding fourth-generation HBM3 models since last year but has struggled due to heat and power consumption issues.