
Bruno Ferreira, Contributor

Bruno Ferreira is a contributing writer for Tom's Hardware. He has decades of experience with PC hardware and assorted sundries, alongside a career as a developer. He's obsessed with detail and has a tendency to ramble on the topics he loves. When not doing that, he's usually playing games, or at live music shows and festivals.
bit_user

The article said:
"The shortages are driven by explosive AI demand, and the latest report says that up to 70 percent of the memory produced worldwide in 2026 will be consumed by data centers."

First, how much of it did they historically consume? At least, over the last few years? Second, is this before or after accounting for the latest price projections? Finally, when people talk about "memory chips" in general, I assume they mean NAND and DRAM. You mentioned "RAM", but some confirmation they're not also talking about NAND chips would be nice. If you do mean just DRAM, I'd suggest using that in the headline, to minimize room for confusion.

The article said:
"The tech press has been lit up like Chernobyl reactor #4 … the fallout from the RAM shortage is set to irradiate …"

I see what you did there. ; )
Zaranthos

Supply and demand. This will eventually fix itself. They cut production and then the AI boom started gobbling up memory supply. Now they're building more manufacturing. Eventually the AI boom will soften as compute exceeds demand, and then the memory makers will beg the average consumer to build new computers and buy their more advanced products at discounted prices. We all win in the end as technology rockets forward and supply again exceeds demand. Maybe my next computer build will use cheap HBM instead of RAM, or something better. Until then, my latest computer build is not all that much faster for daily use than my last one, despite being more expensive and over 10 years more advanced. Any good computer build has a lot of life in it, even for decent gaming with minimal upgrades. I don't have to keep buying your overpriced stuff; I can simply wait until you price it so it's worthwhile for me to upgrade.
valthuer

Data centers eating 70% of global memory just shows how transformative this tech really is. But starving automotive and consumer electronics of even legacy RAM feels like an industry coordination failure, not progress. This is a once-in-a-generation shift, and it needs smarter capacity planning — otherwise the AI revolution ends up making everyday tech worse and more expensive for everyone. Incredible future ahead… if we don't bottleneck ourselves on memory.
Neilbob

I think I still have 4GB of Crucial Ballistix DDR-400 memory kicking around somewhere. I could get the timings on it really tight as well. Wonder if it's possible that it could gain a new lease of life.
JayGau

Zaranthos said:
"Supply and demand. This will eventually fix itself. They cut production and then the AI boom started gobbling up memory supply. Now they're building more manufacturing. Eventually the AI boom will soften as compute exceeds demand and then the memory makers will beg the average consumer to build new computers and buy their stuff at discounted prices for more advanced products. [...] I don't have to keep buying your overpriced stuff, I can simply wait until you price it so it's worthwhile for me to upgrade."

Your next computer will be in the cloud and cost you a $65/month subscription fee. Those corporations don't even hide it anymore; Bezos said it clearly last week. They want to control everything, and killing personal computers is the next step. You say you don't need to buy their expensive stuff. You're gonna have fun when a crappy virtual computer with awful input lag costs you almost $1,000 a year, and you won't even own it.
Nomadish

JayGau said:
"I hope you are right, but like I said, big-tech CEOs are now openly saying it's their goal. The only way to prevent it is to acknowledge it's happening and fight it by refusing to use subscription services. It's ironic, but this could definitely happen if people stop buying hardware, thinking that it will force those corporations to listen to us and lower their prices, while it's exactly what those CEOs want so they can sell us their subscriptions at attractive prices at first and ramp them up when it becomes the only available option."

Naw. Boycotts don't work. They don't care. If you want them to care, you have to make them, and if you can't hit them in their profits and they have bought Washington, those options are few. This is all part of a design to rob us of our ability to own our own PCs, and there are a huge number of reasons this can never be allowed to happen, some of which are not being talked about. The government has made many deals with big tech, including just outright buying a large stake in Intel. If we are moved to cloud computing, what do you think that means? To me it means the government has leverage to punish companies that don't do what they want, like turning over user data and limiting or controlling the flow of information.
LordVile

valthuer said:
"Data centers eating 70% of global memory just shows how transformative this tech really is. [...] Incredible future ahead… if we don't bottleneck ourselves on memory."

How is it transformative, exactly? Currently all it's doing is regurgitating slop and being frequently wrong. It's also not scaling with hardware, and all the good training data that was available has already been expended. This is with no end product in sight and no one aside from Nvidia turning a profit. At some stage this will collapse, because what possible product could come out that would be profitable enough to justify it all?