'Thermodynamic computing' could slash the energy use of AI image generation by a factor of ten billion, study claims — prototypes show promise, but building hardware that can rival current models remains a huge task

Google Nano Banana Pro
(Image credit: Google)

A mind-bending new report claims that 'thermodynamic computing' could, in theory, drastically reduce the energy consumed by AI to generate images, using just one ten-billionth of the energy of current popular tools. As reported by IEEE Spectrum, two recent studies hint at the potential of this burgeoning technology, but its proponents admit the solution is rudimentary.

According to the report, Lawrence Berkeley National Laboratory staff scientist Stephen Whitelam claims thermodynamic computing could be used for AI image generation "with a much lower energy cost than current digital hardware can." In a January 10 article, Whitelam and Corneel Casert, also of Berkeley, outlined how "it was possible to create a thermodynamic version of a neural network," laying the foundations for generating images using thermodynamic computing.


Stephen Warwick
News Editor

Stephen is Tom's Hardware's News Editor with almost a decade of industry experience covering technology, having worked at TechRadar, iMore, and even Apple over the years. He has covered the world of consumer tech from nearly every angle, including supply chain rumors, patents, litigation, and more. When he's not at work, he loves reading about history and playing video games.