    AI Boom Fuels DRAM Shortage and Price Surge

By The Daily Fuse | February 10, 2026


If it feels nowadays as if everything in technology is about AI, that’s because it is. And nowhere is that more true than in the market for computer memory. Demand, and profitability, for the kind of DRAM used to feed GPUs and other accelerators in AI data centers is so large that it is diverting memory supply away from other uses and causing prices to skyrocket. According to Counterpoint Research, DRAM prices have risen 80 to 90 percent so far this quarter.

The biggest AI hardware companies say they have secured their chips out as far as 2028, but that leaves everybody else—makers of PCs, consumer gadgets, and everything else that must rapidly store a billion bits—scrambling to cope with scarce supply and inflated prices.

How did the electronics industry get into this mess, and more importantly, how will it get out? IEEE Spectrum asked economists and memory experts to explain. They say today’s situation is the result of a collision between the DRAM industry’s historic boom-and-bust cycle and an AI hardware infrastructure build-out that is without precedent in its scale. And, barring some major collapse in the AI sector, it will take years for new capacity and new technology to bring supply in line with demand. Prices may stay high even then.

To understand both ends of the story, you need to know the main culprit in the supply-and-demand swing: high-bandwidth memory, or HBM.

    What’s HBM?

HBM is the DRAM industry’s attempt to short-circuit the slowing pace of Moore’s Law by using 3D chip-packaging technology. Each HBM chip is made up of as many as 12 thinned-down DRAM chips called dies. Each die contains a number of vertical connections called through-silicon vias (TSVs). The dies are piled atop one another and linked by arrays of microscopic solder balls aligned to the TSVs. This DRAM tower—well, at about 750 micrometers thick, it’s more of a brutalist office block than a tower—is then stacked atop what’s called the base die, which shuttles bits between the memory dies and the processor.

This complex piece of technology is then set within a millimeter of a GPU or other AI accelerator, to which it is connected by as many as 2,048 micrometer-scale connections. HBM chips are attached on two sides of the processor, and the GPU and memory are packaged together as a single unit.
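
As a rough sketch of what those figures imply for a single stack, consider the back-of-the-envelope arithmetic below. Only the 12-die stack height and the 2,048 connections come from the text above; the 24-gigabit die density and the 8 Gb/s per-pin data rate are illustrative assumptions, chosen to be in the ballpark of current parts.

```python
# Back-of-the-envelope HBM stack math (die density and pin rate are assumptions).

DIES_PER_STACK = 12        # from the article: up to 12 DRAM dies per stack
GBITS_PER_DIE = 24         # assumption: 24-gigabit DRAM dies
IO_PINS = 2048             # from the article: up to 2,048 connections to the GPU
GBITS_PER_PIN_PER_S = 8.0  # assumption: ~8 Gb/s per pin

capacity_gb = DIES_PER_STACK * GBITS_PER_DIE / 8            # gigabytes per stack
bandwidth_tb_s = IO_PINS * GBITS_PER_PIN_PER_S / 8 / 1000   # terabytes per second

print(f"Capacity per stack: {capacity_gb:.0f} GB")              # -> 36 GB
print(f"Peak bandwidth per stack: {bandwidth_tb_s:.1f} TB/s")   # -> 2.0 TB/s
```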

The idea behind such a tight, highly connected squeeze with the GPU is to knock down what’s called the memory wall. That’s the barrier, in energy and time, to bringing the terabytes per second of data needed to run large language models into the GPU. Memory bandwidth is a key limiter of how fast LLMs can run.
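
To see why bandwidth is the limiter, a common rule of thumb is that generating one token requires streaming roughly every model weight through the GPU once, so the ceiling on token rate is bandwidth divided by model size. The model size and bandwidth below are hypothetical round numbers, not figures from the article.

```python
# Bandwidth-bound ceiling on LLM token rate (toy roofline estimate).
# Assumes each generated token reads all weights once; ignores KV cache,
# batching, and compute limits.

model_params = 70e9     # hypothetical 70-billion-parameter model
bytes_per_param = 2     # 16-bit weights
mem_bandwidth = 8e12    # hypothetical 8 TB/s of aggregate HBM bandwidth

bytes_per_token = model_params * bytes_per_param
tokens_per_second = mem_bandwidth / bytes_per_token
print(f"~{tokens_per_second:.0f} tokens/s ceiling")  # -> ~57 tokens/s
```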

As a technology, HBM has been around for more than 10 years, and DRAM makers have been busy boosting its capability.

As the size of AI models has grown, so has HBM’s importance to the GPU. But that has come at a price. SemiAnalysis estimates that HBM typically costs three times as much as other kinds of memory and constitutes 50 percent or more of the cost of the packaged GPU.

Origins of the memory chip shortage

Memory and storage industry watchers agree that DRAM is a highly cyclical business with big booms and devastating busts. With new fabs costing US $15 billion or more, companies are extremely reluctant to expand and may only have the cash to do so during boom times, explains Thomas Coughlin, a storage and memory expert and president of Coughlin Associates. But building such a fab and getting it up and running can take 18 months or more, almost ensuring that new capacity arrives well past the initial surge in demand, flooding the market and depressing prices.
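
That lag-driven dynamic is the classic “cobweb” pattern from economics, and a toy simulation makes it concrete: producers size capacity against last period’s price, the capacity arrives one period late, and supply repeatedly overshoots and undershoots. Every number below is invented purely for illustration.

```python
# Toy "cobweb" model of a lagged-supply commodity cycle (all numbers invented).
# Producers decide capacity at last period's price; the fab arrives a period late.

price = 1.5                          # start in a boom, above the equilibrium of 1.0
for period in range(1, 9):
    supply = 0.1 + 0.9 * price       # capacity chosen at LAST period's price
    price = max(2.0 - supply, 0.05)  # inverse demand: glut -> low price, shortage -> high
    print(f"period {period}: supply={supply:.2f}, price={price:.2f}")
# Prices overshoot and undershoot for many periods because supply always
# responds to stale information; the closer the supply response is to the
# demand slope, the longer the boom-and-bust oscillation persists.
```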

The origins of today’s cycle, says Coughlin, go all the way back to the chip-supply panic surrounding the COVID-19 pandemic. To avoid supply-chain stumbles and support the rapid shift to remote work, hyperscalers—data center giants like Amazon, Google, and Microsoft—bought up huge inventories of memory and storage, boosting prices, he notes.

But then supply became more regular and data center expansion fell off in 2022, causing memory and storage prices to plummet. The downturn continued into 2023 and even led big memory and storage companies such as Samsung to cut production by 50 percent to try to keep prices from falling below the cost of manufacturing, says Coughlin. It was a rare and fairly desperate move, because companies normally have to run plants at full capacity just to earn back their value.

After a recovery began in late 2023, “all the memory and storage companies were very wary of increasing their manufacturing capacity again,” says Coughlin. “Thus there was very little investment in new manufacturing capacity in 2024 and through most of 2025.”


The AI data center boom

That lack of new investment is colliding headlong with an enormous boom in demand from new data centers. Globally, there are nearly 2,000 new data centers either planned or under construction right now, according to Data Center Map. If they are all built, that would represent a 20 percent jump in the global supply, which stands at around 9,000 facilities now.

If the current build-out continues at pace, McKinsey predicts companies will spend $7 trillion by 2030, with the majority of that—$5.2 trillion—going to AI-focused data centers. Of that chunk, $3.3 trillion will go toward servers, data storage, and network equipment, the firm predicts.

The biggest beneficiary of the AI data center boom so far is undoubtedly GPU maker Nvidia. Revenue for its data center business went from barely a billion dollars in the last quarter of 2019 to $51 billion in the quarter that ended in October 2025. Over this period, its server GPUs have demanded not just more and more gigabytes of DRAM but an increasing number of DRAM chips. The recently launched B300 uses eight HBM chips, each of which is a stack of 12 DRAM dies. Rivals’ use of HBM has largely mirrored Nvidia’s. AMD’s MI350 GPU, for example, also uses eight 12-die chips.
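
Those per-GPU numbers add up quickly. The stack counts below come from the article; the 3-gigabyte (24-gigabit) die density is an assumption in line with current HBM3e-class parts.

```python
# DRAM consumed by a single accelerator (die density is an assumption).

stacks_per_gpu = 8    # from the article: B300 and MI350 each use eight HBM stacks
dies_per_stack = 12   # from the article: each stack holds 12 DRAM dies
gb_per_die = 3        # assumption: 24-gigabit (3 GB) dies

dies_per_gpu = stacks_per_gpu * dies_per_stack
print(f"{dies_per_gpu} DRAM dies, {dies_per_gpu * gb_per_die} GB per GPU")
# -> 96 DRAM dies, 288 GB per GPU
```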


With so much demand, an increasing fraction of DRAM makers’ revenue comes from HBM. Micron—the number three producer, behind SK Hynix and Samsung—reported that HBM and other cloud-related memory went from 17 percent of its DRAM revenue in 2023 to nearly 50 percent in 2025.

Micron predicts the total market for HBM will grow from $35 billion in 2025 to $100 billion by 2028—a figure larger than the entire DRAM market in 2024, CEO Sanjay Mehrotra told analysts in December. That is two years sooner than Micron had previously expected. Across the industry, demand will outstrip supply “significantly… for the foreseeable future,” he said.
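
For context, growing from $35 billion to $100 billion over three years implies a compound annual growth rate of roughly 42 percent, a quick check:

```python
# Implied compound annual growth rate of the HBM market, 2025 -> 2028.

start, end, years = 35e9, 100e9, 3
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # -> ~41.9%
```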


Future DRAM supply and technology

“There are two ways to deal with supply issues with DRAM: with innovation or with building more fabs,” explains Mina Kim, an economist with Mkecon Insights. “As DRAM scaling has become more difficult, the industry has turned to advanced packaging… which is just using more DRAM.”

Micron, Samsung, and SK Hynix together account for the overwhelming majority of the memory and storage markets, and all three have new fabs and facilities in the works. However, these are unlikely to contribute meaningfully to bringing down prices anytime soon.

Micron is in the process of building an HBM fab in Singapore that should be in production in 2027, and it is retooling a fab it purchased from PSMC in Taiwan that will begin production in the second half of 2027. Last month, Micron broke ground on what will be a DRAM fab complex in Onondaga County, N.Y. It won’t be in full production until 2030.

Samsung plans to begin production at a new plant in Pyeongtaek, South Korea, in 2028.

SK Hynix is building HBM and packaging facilities in West Lafayette, Indiana, set to begin production by the end of 2028, and an HBM fab it is building in Cheongju should be complete in 2027.

Speaking of his sense of the DRAM market, Intel CEO Lip-Bu Tan told attendees at the Cisco AI Summit last week: “There’s no relief until 2028.”

With these expansions unable to contribute for several years, other factors will be needed to increase supply. “Relief will come from a combination of incremental capacity expansions by existing DRAM leaders, yield improvements in advanced packaging, and a broader diversification of supply chains,” says Shawn DuBravac, chief economist for the Global Electronics Association (formerly the IPC). “New fabs will help on the margin, but the faster gains will come from process learning, better [DRAM] stacking efficiency, and tighter coordination between memory suppliers and AI chip designers.”

So, will prices come down once some of these new plants come online? Don’t bet on it. “In general, economists find that prices come down much more slowly and reluctantly than they go up. DRAM today is unlikely to be an exception to this general observation, especially given the insatiable demand for compute,” says Kim.

In the meantime, technologies are in the works that could make HBM an even bigger consumer of silicon. The HBM4 standard can accommodate 16 stacked DRAM dies, even though today’s chips use only 12. Getting to 16 has a lot to do with chip-stacking technology: conducting heat through the HBM “layer cake” of silicon, solder, and support material is a key limiter both to stacking higher and to repositioning HBM within the package to get even more bandwidth.

SK Hynix claims a heat-conduction advantage from a manufacturing process called advanced MR-MUF (mass reflow molded underfill). Further out, an alternative chip-stacking technology called hybrid bonding could improve heat conduction by reducing the die-to-die vertical distance essentially to zero. In 2024, researchers at Samsung showed they could produce a 16-high stack with hybrid bonding, and they suggested that 20 dies was not out of reach.
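
A simple one-dimensional thermal-resistance model shows why those bond layers matter so much: heat crossing the stack passes through each silicon die and each solder-and-underfill bond layer in series, and the bond layers, though thin, conduct far worse than silicon. All thicknesses and conductivities below are rough illustrative assumptions, not measured values for any real HBM product.

```python
# 1-D series thermal resistance of a DRAM stack (illustrative assumptions only).
# Each layer contributes R = thickness / (conductivity * area), summed in series.

AREA = 1e-4        # 1 cm^2 die area, in m^2
K_SILICON = 150.0  # W/(m*K), bulk silicon (approximate)
K_BOND = 2.0       # W/(m*K), assumed solder/underfill bond layer
T_DIE = 40e-6      # 40 um thinned die (assumption)
T_BOND = 20e-6     # 20 um bond layer (assumption); ~0 for hybrid bonding

def stack_resistance(n_dies: int, t_bond: float) -> float:
    """Total vertical thermal resistance of the die stack, in K per watt."""
    r_die = T_DIE / (K_SILICON * AREA)
    r_bond = t_bond / (K_BOND * AREA)
    return n_dies * r_die + (n_dies - 1) * r_bond

for label, t in [("microbump/underfill", T_BOND), ("hybrid bonding", 0.0)]:
    print(f"16-high stack, {label}: {stack_resistance(16, t):.2f} K/W")
# -> ~1.54 K/W vs ~0.04 K/W: the bond layers dominate, which is why shrinking
#    them (as hybrid bonding does) eases the path to taller stacks.
```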
