'Thermodynamic computing' could slash energy use of AI image generation by a factor of ten billion, study claims — prototypes show promise but a huge task remains


That's quite the catch, but in a world where AI buildouts and data-center growth are putting unprecedented strain on global energy supplies, a future process that could cut the energy used for AI image generation by a factor of ten billion would certainly be a breakthrough.


Stephen Warwick, News Editor: Stephen is Tom's Hardware's News Editor with almost a decade of industry experience covering technology, having worked at TechRadar, iMore, and even Apple over the years. He has covered the world of consumer tech from nearly every angle, including supply chain rumors, patents, and litigation. When he's not at work, he loves reading about history and playing video games.

DS426: Major datacenters should also be their own power plants, essentially using heat pumps to turn steam turbines to turn generators. It boggles my mind that the wealthiest and most hi-tech companies in the world throw away heat (and often water) with little to no recapture.

JRStern (replying to DS426): They throw away a lot of heat, but it's at low temperatures, and that's very hard to recapture efficiently. You can't really boil water with it; you might boil some ammonia, if you can cool the ammonia enough in the first place, or design some big honkin' Stirling engine with a boosted cooling cycle … that would be awesome. Would it ever pay for itself? Dubious.

DS426 (replying to JRStern): That's true, and it would take additional input (a heat pump, etc.) to concentrate that thermal energy into higher usable temperatures, hurting the efficiency and overall value of the system. Still, it's hard for me to imagine that we can throw away 300 MW or even more from a single DC (besides heating the DC itself, which we know is rarely done, given that most DCs are located in warmer climates). Our ancestors from long ago did really clever things to survive and thrive. We're not so clever today; if the economics work out, that's generally good enough.
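The exchange above comes down to thermodynamics: the Carnot limit caps what fraction of heat can ever be converted back into work between a hot source and a cold sink, and for low-grade data-center exhaust that cap is small. A minimal sketch of the arithmetic — the temperatures here are assumed, illustrative values, not figures from the article:

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum theoretical heat-engine efficiency between a hot source
    and a cold sink, both given in Celsius (Carnot limit: 1 - Tc/Th,
    with temperatures in kelvin)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Assumed illustrative temperatures: server exhaust ~45 C, ambient ~25 C.
eta = carnot_efficiency(45.0, 25.0)

# Using the 300 MW waste-heat figure mentioned in the comment above.
ideal_recoverable_mw = 300 * eta

print(f"Carnot limit: {eta:.1%}")
print(f"Ideal recoverable power from 300 MW: {ideal_recoverable_mw:.0f} MW")
```

Even the theoretical ceiling works out to only a few percent at these temperature differences, and real heat engines fall well short of Carnot, which is why commenters above call the economics of recapture dubious.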
