Industry

Samsung and AMD Sign Strategic Deal on HBM4 Memory for Next-Gen AI Chips

Michael Ouroumis · 2 min read

Samsung Electronics and AMD announced a sweeping strategic partnership on March 18, cementing Samsung as the primary supplier of next-generation HBM4 memory for AMD's upcoming AI accelerators.

The Memorandum of Understanding was signed at Samsung's advanced chip manufacturing complex in Pyeongtaek, South Korea, with AMD Chair and CEO Dr. Lisa Su and Samsung Vice Chairman and CEO Young Hyun Jun both in attendance.

What the Deal Covers

At its core, the agreement positions Samsung as the primary HBM4 supplier for the AMD Instinct MI455X GPU, the successor to the MI400 that first challenged NVIDIA's dominance, designed to compete head-on in the data center AI market.

Samsung's HBM4 is built on the company's most advanced 6th-generation 10-nanometer-class DRAM process and a 4nm logic base die. The memory delivers per-pin speeds of up to 13 gigabits per second and a maximum bandwidth of 3.3 terabytes per second per stack — figures that exceed current industry standards.
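The two figures are consistent: peak bandwidth is the per-pin rate multiplied by the interface width. A quick sanity check, assuming HBM4's JEDEC-standard 2048-bit interface per stack (an assumption, not stated in the article):

```python
def hbm_bandwidth_tbps(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
    """Peak per-stack bandwidth in terabytes per second.

    bus_width_bits defaults to 2048, the HBM4 interface width defined
    by JEDEC (assumed here; the article does not state it).
    """
    # Gb/s per pin * pins -> Gb/s total; /8 -> GB/s; /1000 -> TB/s
    return pin_speed_gbps * bus_width_bits / 8 / 1000

print(hbm_bandwidth_tbps(13.0))  # ~3.3 TB/s, matching the quoted figure
```

13 Gb/s across 2048 pins works out to 3,328 GB/s, which rounds to the 3.3 TB/s Samsung cites.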

Beyond HBM4, the partnership extends to high-performance DDR5 memory optimized for AMD's 6th Gen EPYC server CPUs, codenamed "Venice," which will power AMD's Helios rack-scale architecture for enterprise data centers.

Foundry Ambitions

Perhaps most intriguingly, the two companies will also explore a foundry partnership, with Samsung potentially providing chip fabrication services for future AMD products. This would mark a significant shift in AMD's manufacturing strategy, which has relied heavily on TSMC in recent years.

Why It Matters

The partnership signals AMD's aggressive push to close the gap with NVIDIA in the AI accelerator market. By locking in Samsung as a primary HBM4 supplier, AMD secures a critical component of its AI hardware roadmap at a time when high-bandwidth memory demand far outstrips supply.

For Samsung, the deal represents a major win in its efforts to regain ground in the AI memory market. The Korean chipmaker has faced challenges keeping pace with SK Hynix, which secured early HBM contracts with NVIDIA. This AMD partnership gives Samsung a flagship customer for its most advanced memory technology.

Industry Implications

The AI chip supply chain continues to consolidate around a handful of strategic alliances. With NVIDIA relying heavily on SK Hynix and Micron for its memory needs, the Samsung-AMD axis creates a distinct second pillar in the AI hardware ecosystem — one that could intensify competition and potentially ease the memory bottlenecks that have constrained AI infrastructure buildouts throughout 2025 and into 2026.

