
That's quite the catch, but in a world where AI buildouts and data center growth are putting unprecedented strain on global energy supply, a future process that could reduce AI image generation energy usage by a factor of ten billion would certainly be a breakthrough.
Stephen Warwick, News Editor
Stephen is Tom's Hardware's News Editor with almost a decade of industry experience covering technology, having worked at TechRadar, iMore, and even Apple over the years. He has covered the world of consumer tech from nearly every angle, including supply chain rumors, patents, litigation, and more. When he's not at work, he loves reading about history and playing video games.
DS426: Major datacenters should also be their own power plants, essentially using heat pumps to turn steam turbines to turn generators. It boggles my mind that the wealthiest and most hi-tech companies in the world throw away heat (and often water) with little to no recapture.
JRStern: DS426 said: "Major datacenters should also be their own power plants, essentially using heat pumps to turn steam turbines to turn generators. It boggles my mind that the wealthiest and most hi-tech companies in the world throw away heat (and often water) with little to no recapture." They throw away a lot of heat, but it's at low temperatures, and that's very hard to recapture efficiently. You can't really boil water with it; you might boil some ammonia if you can cool the ammonia enough in the first place, or design some big honkin' Stirling engine with a boosted cooling cycle … that would be awesome. Would it ever pay for itself? Dubious.
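For a rough sense of why that low-temperature heat is so hard to turn back into electricity, the Carnot limit already tells most of the story. The sketch below is a minimal illustration of that point; the exhaust and ambient temperatures in it are assumed typical values for a data center, not figures from the article or the comments above.

```python
# Rough illustration of the efficiency ceiling for recovering low-grade
# data-center waste heat. Temperatures below are assumptions for a typical
# facility, not numbers from the article or the comments.

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Theoretical upper bound on heat-to-work conversion between two
    temperatures (Celsius in, fraction out)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

if __name__ == "__main__":
    ambient_c = 25.0  # assumed outdoor / condenser-side temperature
    for exhaust_c in (40.0, 60.0, 90.0):  # assumed waste-heat temperatures
        eta = carnot_efficiency(exhaust_c, ambient_c)
        print(f"Waste heat at {exhaust_c:.0f} C -> Carnot limit ~{eta:.1%}")
```

Even the theoretical ceiling comes out to only a few percent at typical server-exhaust temperatures, and real-world machines (organic Rankine cycles, Stirling engines) recover considerably less than that limit, which is the economic doubt the comment raises.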