
Bruno Ferreira
Contributor

Bruno Ferreira is a contributing writer for Tom's Hardware. He has decades of experience with PC hardware and assorted sundries, alongside a career as a developer. He's obsessed with detail and has a tendency to ramble on the topics he loves. When not doing that, he's usually playing games, or at live music shows and festivals.
bit_user

The article said: "The shortages are driven by explosive AI demand, and the latest report says that up to 70 percent of the memory produced worldwide in 2026 will be consumed by data centers."

First, how much of it did they historically consume? At least over the last few years? Second, is this before or after accounting for the latest price projections?

Finally, when people talk about "memory chips" in general, I assume they mean NAND and DRAM. You mentioned "RAM", but some confirmation that they're not also talking about NAND chips would be nice. If you do mean just DRAM, I'd suggest using that in the headline, to minimize room for confusion.

The article said: "The tech press has been lit up like Chernobyl reactor #4 … the fallout from the RAM shortage is set to irradiate …"

I see what you did there. ; )
Zaranthos

Supply and demand. This will eventually fix itself. They cut production, and then the AI boom started gobbling up memory supply. Now they're building more manufacturing. Eventually the AI boom will soften as compute exceeds demand, and then the memory makers will beg the average consumer to build new computers and buy their stuff at discounted prices for more advanced products. We all win in the end as technology rockets forward and supply again exceeds demand. Maybe my next computer build will use cheap HBM instead of RAM, or something better.

Until then, my latest computer build is not all that much faster for daily use than my last computer, despite being more expensive and over 10 years more advanced. Any good computer build has a lot of life in it, even for decent gaming with minimal upgrades. I don't have to keep buying your overpriced stuff; I can simply wait until you price it so it's worthwhile for me to upgrade.
valthuer

Data centers eating 70% of global memory just shows how transformative this tech really is. But starving automotive and consumer electronics of even legacy RAM feels like an industry coordination failure, not progress. This is a once-in-a-generation shift, and it needs smarter capacity planning; otherwise the AI revolution ends up making everyday tech worse and more expensive for everyone. Incredible future ahead, if we don't bottleneck ourselves on memory.
Neilbob

I think I still have 4GB of Crucial Ballistix DDR-400 memory kicking around somewhere. I could get the timings on it really tight as well. I wonder if it could gain a new lease of life.
JayGau

Zaranthos said: "Supply and demand. This will eventually fix itself. They cut production and then the AI boom started gobbling up memory supply. Now they're building more manufacturing. Eventually the AI boom will soften as compute exceeds demand and then the memory makers will beg the average consumer to build new computers and buy their stuff at discounted prices for more advanced products. We all win in the end as technology rockets forward and eventually supply again exceeds demand. Maybe my next computer build will use cheap HBM instead of RAM or something better. Until then my latest computer build is not all that much faster for daily use than my last computer despite being more expensive and over 10 years more advanced. Any good computer build has a lot of life in it even for decent gaming with minimal upgrades. I don't have to keep buying your overpriced stuff, I can simply wait until you price it so it's worthwhile for me to upgrade."

Your next computer will be in the cloud and cost you a $65/month subscription fee. Those corporations don't even hide it anymore; Bezos said it clearly last week. They want to control everything, and killing personal computers is the next step. You say you don't need to buy their expensive stuff. You're gonna have fun when a crappy virtual computer with awful input lag costs you almost $1,000 a year, and you won't even own it.