Date: 2024-09-06
The world's two biggest memory chip makers, Samsung Electronics and SK Hynix, are racing to supply advanced DRAM chips to customers including NVIDIA to capitalize on the artificial intelligence boom. Executives from the two South Korean companies said Wednesday that they are working to mass-produce the latest high-bandwidth memory (HBM) chips as quickly as possible.
Lee Jung-bae, President and head of Samsung's memory business, said in a keynote address at Semicon Taiwan 2024, an international gathering of semiconductor professionals: "To maximize the performance of AI chips, custom HBM is the best choice. We are working with other foundries to provide more than 20 customized solutions."
He said Samsung is preparing to launch 256-terabyte (TB) solid-state drives (SSDs) to meet growing demand for high-capacity storage in AI servers.
In his own keynote address, Justin Kim, President and head of AI Infra at SK Hynix, said: "We are working closely with TSMC to develop HBM4. Development is progressing well, so we hope to deliver the chip to our customers at the right time."
HBM is crucial to the AI boom because it delivers far higher bandwidth than conventional memory chips, speeding up the data-hungry processing that AI workloads demand.
Among memory chip makers, SK Hynix is the biggest beneficiary of the explosion in AI applications: it dominates production of HBM, which is critical to generative AI computing, and is the largest HBM supplier to Nvidia, which controls 80 percent of the AI chip market.
SK Hynix said last month that it plans to supply Nvidia with large quantities of its latest 12-layer HBM3E chips in the fourth quarter, and that it will begin mass production of the chips at the end of this month.
To stay ahead of the curve, SK Hynix teamed up in April with TSMC, the world's largest foundry (contract chipmaker), and plans to develop its next-generation memory, HBM4, next year. Samsung's HBM3 chip has already passed Nvidia's qualification tests, and the company is now working to get its HBM3E chip approved by the U.S. AI chip designer.
Samsung starts shipping HBM3E to NVIDIA
Samsung Electronics has reportedly completed quality testing of its fifth-generation, 8-layer high-bandwidth memory (HBM3E) products in collaboration with NVIDIA and has begun shipping them, as it prepares to compete with rivals SK Hynix and Micron in the advanced memory market.
According to TrendForce, a Taiwan-based market research firm, although Samsung entered the HBM3E market later than rivals SK Hynix and Micron, it has recently completed the certification process for its HBM3E products and has now started shipping. Samsung has reportedly begun shipping 8-layer HBM3E memory for NVIDIA's H200 GPUs, while certification for the upcoming Blackwell series is said to be well underway.
Samsung Electronics responded that it could not confirm details relating to specific customers and said the report contained inaccuracies. Last month, Reuters reported that Samsung had passed NVIDIA's delivery quality test for its 8-layer HBM3E product; at the time, Samsung also denied that claim, explaining that it was still in testing with key customers.
TrendForce also noted that Micron and SK Hynix completed HBM3E certification in the first quarter of this year and began volume shipments in the second quarter. According to TrendForce, Micron is currently supplying HBM3E primarily for the H200, while SK Hynix is supplying both the H200 and B100 series.
TrendForce further noted that NVIDIA's H200 is the first GPU to use 8-layer HBM3E and is expected to have a significant impact on the market this year. The firm also expects the upcoming Blackwell chips to adopt HBM3E across the board.
According to a report by MoneyDJ, Lee Jung-bae, President and General Manager of the memory business in Samsung's Device Solutions Division, said at the industry event in Taipei that Samsung will give customers full flexibility in customizing HBM memory.
In his keynote, titled "Memory Technology Innovation: Leaping into the Future," Lee said that in the AI era memory faces three major challenges: energy consumption, bandwidth, and capacity. Samsung Electronics is preparing a range of new, innovative technologies to address them, targeting AI devices and edge applications.
For HBM, one possible way to break through the existing speed and energy-consumption bottlenecks is to integrate logic processing into the memory itself. Cooperation between memory manufacturers and foundries is crucial here, and Samsung's memory business unit has prepared a "turnkey" solution for this purpose.
In addition, Samsung Electronics can provide customers with the relevant IP to design their own base die for a customized HBM product, and customers can even hand the base die off to a third-party foundry other than Samsung.
Lee emphasized that Samsung Electronics offers a one-stop service spanning memory, foundry process, and packaging, working with a wide range of ecosystem partners to meet customers' varying needs.
He also said in his speech that HBM shipments this year are expected to reach 1,600GB, more than double the cumulative total of the previous eight years, and that global semiconductor revenue is expected to reach $800 billion (roughly 5.7 trillion yuan) by 2028.