Samsung to triple HBM memory production in 2024

The projection exceeds the HBM chip output Samsung forecast in January of this year.

Samsung Electronics Co. may triple its production of High Bandwidth Memory (HBM) chips this year compared with last year as it aims for a leading position in Artificial Intelligence (AI) chips.

Speaking on Tuesday at Memcon 2024, a gathering of global chipmakers held in San Jose, California, Hwang Sang-joong, Executive Vice President and head of DRAM product and technology at Samsung, said he expects the company's HBM chip production to grow 2.9-fold this year compared with last year.

The forecast is higher than the one Samsung gave at CES 2024 earlier this year, where the chipmaker indicated it might produce 2.5 times more HBM chips in 2024.

"Following the mass production of the third-generation HBM2E and the fourth-generation HBM3, we plan to mass-produce the 12-layer fifth-generation HBM and 32-gigabit 128 GB DDR5 products in the first half of this year," Hwang said at Memcon 2024. "Through these products, we hope to enhance our influence in the field of high-performance, high-capacity memory in the era of artificial intelligence."

At the conference, Samsung unveiled its HBM roadmap, projecting that its HBM shipments in 2026 will be 13.8 times those of 2023. By 2028, the company said, annual HBM production will rise further, to 23.1 times the 2023 level.

For its sixth-generation HBM chip, HBM4, codenamed "Snowbolt," Samsung plans to place a buffer chip, a type of control device, on the bottom layer of the memory stack to improve efficiency.

At the conference, Samsung also showcased its latest HBM3E 12H chip, the industry's first 12-stack HBM3E DRAM and the highest-capacity HBM product to date. Samsung is currently providing samples of the HBM3E 12H to customers and plans to begin mass production in the first half of the year.

Attendees at the conference included SK Hynix, Microsoft, Meta Platforms, NVIDIA, and AMD.

**CXL Technology**

Samsung also announced the expansion of its Compute Express Link (CXL) memory module product portfolio at Memcon 2024, showcasing its technology in high-performance and high-capacity solutions for AI applications.

In the keynote speech, Choi Jee-hoon, Executive Vice President of Samsung's Device Solutions Research in the United States, stated that Samsung is committed to collaborating with partners to unleash the full potential of the AI era.

"AI innovation cannot continue without memory technology innovation. As a leader in the memory market, Samsung is proud to continue advancing innovation, from the industry's most advanced CMM-B technology to powerful memory solutions like HBM3E, for high-performance computing and demanding AI applications."

To highlight the growing momentum of the CXL ecosystem, Samsung introduced the CXL memory module CMM-B, a cutting-edge CXL DRAM memory product.

The chip manufacturer also showcased the CXL memory module CMM-H for hierarchical memory and the CXL memory module DRAM (CMM-D).

CXL is a next-generation interface that improves the efficiency of accelerators, DRAM, and other memory devices used alongside CPUs in high-performance server systems.

A latecomer in the HBM chip domain, Samsung has invested heavily in HBM to compete with SK Hynix and other memory manufacturers. HBM has become key to the AI boom because it moves data much faster than conventional memory chips.

Last week, Kyung Kye-hyun, the head of Samsung's semiconductor business, said the company is developing its next-generation AI chip, Mach-1, in a bid to disrupt competitor SK Hynix, the leader in advanced HBM.

The company has announced an agreement to supply Mach-1 chips to Naver Corp. by the end of this year in a deal worth up to 1 trillion won (about $752 million).

Through this contract, Naver hopes to significantly reduce its dependence on AI chips from the world's top AI chip designer, Nvidia.

**SK Hynix: AI DRAM to reach a double-digit share of sales in 2024**

Recently, SK Hynix CEO Kwak Noh-Jung said that HBM chips for AI chipsets are expected to account for a double-digit percentage of the company's DRAM chip sales in 2024.

In March of this year, SK Hynix, an Nvidia supplier, began commercializing its next-generation advanced HBM chips, and Nvidia is reportedly among the first beneficiaries of the initial shipments. HBM chips pair with GPUs from NVIDIA and others that process vast amounts of data for generative AI.

SK Hynix is the sole supplier of the HBM3 chips currently used by NVIDIA, which holds roughly 80% of the AI chip market, a position that has made SK Hynix the leader in the HBM chip market.

Analysts predict that HBM chips will account for 15% of total industry DRAM sales in 2024, up from 8% in 2023.

Recent reports indicate that SK Hynix plans to invest $4 billion to build an advanced chip-packaging plant in West Lafayette, Indiana, with support from the U.S. CHIPS Act, potentially creating 800 to 1,000 jobs.
