
That's quite the catch, but in a world where AI buildouts and data center growth are putting unprecedented strain on global energy supply, a future process that could reduce AI image generation energy usage by a factor of ten billion would certainly be a breakthrough.
Stephen Warwick, News Editor: Stephen is Tom's Hardware's News Editor with almost a decade of industry experience covering technology, having worked at TechRadar, iMore, and even Apple over the years. He has covered the world of consumer tech from nearly every angle, including supply chain rumors, patents, litigation, and more. When he's not at work, he loves reading about history and playing video games.
DS426: Major datacenters should also be their own power plants, essentially using heat pumps to turn steam turbines to turn generators. It boggles my mind that the wealthiest and most hi-tech companies in the world throw away heat (and often water) with little to no recapture.
JRStern, replying to DS426: They throw away a lot of heat, but it's at low temperatures, and that's very hard to recapture efficiently. You can't really boil water with it; you might boil some ammonia if you can cool the ammonia enough in the first place, or design some big honkin' Stirling engine with a boosted cooling cycle... that would be awesome. Would it ever pay for itself? Dubious.
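To put the commenter's point in numbers: the Carnot limit caps the fraction of heat that any engine can convert to work, and for low-grade datacenter exhaust that cap is small. The sketch below uses illustrative temperatures (not figures from the article): exhaust air around 45 to 90 °C rejecting to a 25 °C ambient.

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum (Carnot) fraction of heat convertible to work,
    given source and sink temperatures in degrees Celsius."""
    t_hot_k = t_hot_c + 273.15   # convert to kelvin
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Illustrative datacenter exhaust temperatures vs. a 25 C ambient sink
for t_hot in (45.0, 60.0, 90.0):
    eta = carnot_efficiency(t_hot, 25.0)
    print(f"{t_hot:.0f} C source -> Carnot limit = {eta:.1%}")
```

Even at the theoretical limit, a 60 °C source against a 25 °C sink yields only about 10%, and real heat engines recover well under half of that, which is why low-grade heat recapture rarely pays for itself.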