
That's quite the catch, but in a world where AI buildouts and data center growth are putting unprecedented strain on global energy supply, a future process that could reduce AI image generation energy usage by a factor of ten billion would certainly be a breakthrough.
Stephen Warwick, News Editor — Stephen is Tom's Hardware's News Editor with almost a decade of industry experience covering technology, having worked at TechRadar, iMore, and even Apple over the years. He has covered the world of consumer tech from nearly every angle, including supply chain rumors, patents, litigation, and more. When he's not at work, he loves reading about history and playing video games.
DS426: Major datacenters should also be their own power plants, essentially using heat pumps to turn steam turbines to turn generators. It boggles my mind that the wealthiest and most hi-tech companies in the world throw away heat (and often) water with little to no recapture.
JRStern (replying to DS426): They throw away a lot of heat, but it's at low temperatures, and that's very hard to recapture efficiently. You can't really boil water with it; you might boil some ammonia if you can cool the ammonia enough in the first place, or design some big honkin' Stirling engine with a boosted cooling cycle. That would be awesome. Would it ever pay for itself? Dubious.
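The commenter's point about low-temperature heat being hard to recapture comes down to the Carnot limit, which caps the fraction of heat that can ever be converted to work between two temperatures. A minimal sketch, using illustrative temperatures (not figures from the thread):

```python
# Back-of-envelope Carnot limit for recovering low-grade datacenter waste heat.
# All temperatures here are illustrative assumptions, not measured figures.

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum fraction of heat convertible to work between two reservoirs."""
    t_hot_k = t_hot_c + 273.15   # convert Celsius to Kelvin
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Hypothetical server exhaust (~45 C) rejected to ambient air (~25 C):
low_grade = carnot_efficiency(45.0, 25.0)

# Compare with a conventional steam turbine fed at ~500 C:
steam_plant = carnot_efficiency(500.0, 25.0)

print(f"Low-grade waste heat ceiling: {low_grade:.1%}")      # roughly 6%
print(f"High-temperature steam ceiling: {steam_plant:.1%}")  # roughly 61%
```

With only a 20 C gap between exhaust and ambient, even an ideal engine recovers a few percent of the heat as work, which is why datacenter waste heat is more often reused directly (e.g., for district heating) than converted back into electricity.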