SK hynix Announces Near Complete Sale of 2025 HBM Memory Inventory

The Race for High-Performance AI Processors: A Memory Crunch?

Have you heard? The tech world is abuzz, and at the heart of it all is the soaring demand for high-performance processors, especially those designed for AI training. This craze isn’t just about the processors themselves; it reaches deep into the critical components that power them. In a rather eye-opening announcement this week, SK hynix let the cat out of the bag: its high-bandwidth memory (HBM) production capacity is sold out not just for the rest of 2024, but for most of 2025 as well. Can you believe that?

Who’s Buying Up All the Memory?

Let’s name-drop a bit, shall we? SK hynix isn’t playing in the minor leagues. We’re talking major players like Amazon, AMD, Facebook, Google (and Broadcom), Intel, Microsoft, and let’s not forget NVIDIA. NVIDIA, in particular, is gobbling up HBM3 and HBM3E memory for its powerhouse H100/H200/GH200 accelerators. With an appetite that big, it’s clear why demand for these high-speed memory stacks is through the roof.

The Backlog Saga Continues

It’s not just about placing an order; it’s about playing the long game. Memory orders, already placed months in advance on a leap of faith, are now backlogged well into 2025. Everyone is jockeying to secure their piece of the pie: those precious stacks of memory that could make or break their success.

A Tale of Two Companies

SK hynix isn’t alone in its “no vacancy” announcement. Micron tossed its hat in the ring a while back with news of its HBM3E production selling out. But let’s put things into perspective: SK hynix’s bombshell carries more weight, given its far larger production capacity compared to Micron’s. With SK hynix holding a commanding 46% to 49% of the HBM market, a share not expected to budge much in 2025, the plot thickens, leaving us to wonder: what’s next?

And Then There Was One

With SK hynix’s and Micron’s plates full, eyes are now turning to Samsung, the remaining titan yet to weigh in on the HBM frenzy. Considering these chipmakers all serve the same hungry customers, it’s hard to imagine Samsung isn’t facing the same music. And if that’s the case, an HBM3 memory shortage might just be on the horizon.

Looking Ahead: A Glimmer of Hope?

In an interesting twist, SK hynix is already teasing the tech community, revealing it is sampling 12-Hi 36GB HBM3E stacks with customers and hinting at volume shipments kicking off in the third quarter. Could this be the light at the end of the tunnel, or merely a preview of an even more competitive scramble for high-bandwidth memory?
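For the curious, the “12-Hi 36GB” figure checks out with some back-of-the-envelope arithmetic. The sketch below assumes each die in the stack is a 24Gb (3GB) DRAM die and that the stack uses HBM’s standard 1024-bit interface; the ~9.2 GT/s pin rate is an assumed figure for HBM3E, not something stated in the article.

```python
# Back-of-the-envelope check on a 12-Hi HBM3E stack.
# Assumptions (not from the article): 24 Gb (3 GB) per die,
# 1024-bit interface per stack, ~9.2 GT/s per pin.

GB_PER_DIE = 3        # 24 Gb DRAM die = 3 GB
DIES = 12             # "12-Hi" = 12 stacked dies
BUS_BITS = 1024       # HBM interface width per stack
PIN_RATE_GTPS = 9.2   # transfers/sec per pin (assumed)

capacity_gb = GB_PER_DIE * DIES
bandwidth_gbs = BUS_BITS * PIN_RATE_GTPS / 8  # bits -> bytes

print(f"Capacity:  {capacity_gb} GB per stack")        # 36 GB, matching the article
print(f"Bandwidth: {bandwidth_gbs:.0f} GB/s per stack")
```

Twelve 3GB dies give the 36GB the article quotes; the bandwidth line is just illustrative of why each generation’s pin-speed bump matters so much to accelerator vendors.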

One thing’s for sure, the saga of HBM memory is far from over. As industries push the envelope on AI and machine learning capabilities, the thirst for high-performance processing power and the memory that goes with it is only going to intensify. Strap in, folks; it’s going to be a bumpy ride.
