MEMBER EXCLUSIVE

Massive AI data center buildouts are squeezing energy supplies — New energy methods are being explored as power demands are set to skyrocket

Data Center
(Image credit: Getty Images / The Washington Post)

The world is building the electricity system for artificial intelligence on the fly. Over the next five years, the power appetite of data-center campuses built for training and serving large AI models will collide with the reality of permitting, transmission backlogs, and siting constraints. All of that could materially change how much energy we obtain, where it comes from, and what type it is.

The International Energy Agency (IEA) projects global data center electricity demand will more than double to 945 terawatt-hours by 2030, with AI the largest single driver of the rise. European data center demand alone could jump from about 96 TWh in 2024 to 168 TWh by 2030.

The enormous rise in energy demand didn’t start with the November 2022 release of ChatGPT. Hyperscalers’ electricity use has been rising by more than 25% a year for seven years running, according to analysis by Barclays. But the energy needed for AI training and inference has pushed those already prodigious increases even higher.
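Barclays’ figure implies substantial compounding. A back-of-envelope sketch in Python – the 25%-a-year rate is from the analysis cited above, but extrapolating it across seven years is our own arithmetic, not a reported figure:

```python
# Compounding a 25% annual rise in hyperscaler electricity use over
# seven years. The growth rate comes from the Barclays analysis cited
# above; the cumulative multiplier is a back-of-envelope extrapolation.
ANNUAL_GROWTH = 0.25
YEARS = 7

multiplier = (1 + ANNUAL_GROWTH) ** YEARS
print(f"Usage multiplier after {YEARS} years: ~{multiplier:.1f}x")
```

Even before AI’s extra pull, seven straight years at that rate multiplies a hyperscaler’s electricity use nearly fivefold.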

“AI has been rolled out everywhere,” Chris Preist, professor of sustainability and computer systems at the University of Bristol, said in an interview with Tom’s Hardware Premium. “‘Everything everywhere all at once’ is the phrase I like to use for AI,” Preist said. “It’s doing what technologies have always done, but it’s doing it at a far, far higher speed.”

AMD Helios rack system.

(Image credit: AMD)

People like Sam Altman are enormously bullish on the future of AI, but acutely conscious of the energy crunch it creates. In an essay published in late September, Altman wrote of his vision “to create a factory that can produce a gigawatt of new AI infrastructure every week.” He added: “The execution of this will be extremely difficult; it will take us years to get to this milestone and it will require innovation at every level of the stack, from chips to power to building to robotics.”

Massive demand, little time

As Altman suggests, AI touches every level of our energy systems. The IEA’s forecasts have proven a wake-up call for both the energy sector and the AI industry. Anthropic has said it expects to need 2GW and 5GW data centers as standard to develop its advanced AI models in 2027 and 2028, respectively. It also forecasts that total frontier AI training demand in the United States will reach up to 25GW by 2028. That’s just for training: inference will add the same again, Anthropic predicts, meaning the US alone will need 50GW of AI capacity by 2028 to retain its world-leading position.
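To put 50GW of capacity in context, here is a rough Python conversion into annual energy. The continuous-utilization assumption is ours, not Anthropic’s, so treat this as an upper-bound sketch:

```python
# Converting the forecast 50GW of US AI capacity into annual energy,
# assuming (our assumption, not Anthropic's) near-continuous operation.
HOURS_PER_YEAR = 8760  # 365 * 24

training_gw = 25   # forecast frontier training demand by 2028
inference_gw = 25  # forecast inference demand, the same again
total_gw = training_gw + inference_gw

annual_twh = total_gw * HOURS_PER_YEAR / 1000  # GWh -> TWh
print(f"{total_gw} GW run continuously = {annual_twh:.0f} TWh/year")
```

That 438 TWh would, on its own, be close to half the 945 TWh of global data center demand the IEA projects for 2030 – an illustration of how aggressive the forecast is.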

Already, the data center sector is responding by building out huge numbers of new projects: spending on U.S. data center construction grew 59% in 2023 to $20 billion, and to $31 billion – another 56% leap – in 2024.

In 2021, pre-ChatGPT, annual private data center construction spending was only around $10 billion.

All those data centers need a reliable power supply, so companies are trying to head off an energy crunch that could strain grids to breaking point by building large amounts of new generation capacity. But questions remain about whether all of it will be needed.

Nvidia

(Image credit: Nvidia)

Preist is at pains to point out that globally, we have a surfeit of energy supply. “In the case of digital tech, it’s actually a local shortfall, not a global shortfall,” he said. But in areas where AI demand is high, there is often a gap between the power those systems demand and the local supply available to meet it.

Access to reliable power is a key consideration for data center builders: 84% of data center leaders told Bloom Energy in an April 2025 survey that it was their top consideration when choosing where to build – more than twice as important as proximity to end users, or the level and type of local regulation they would face.

What type of energy supply is being built is also changing. A radical shift is under way, with AI companies moving towards firm clean power underpinned by nuclear and geothermal. Constellation is fast-tracking a restart of the shuttered Three Mile Island Unit 1 (renamed the Crane Clean Energy Center) after striking a 20-year power purchase agreement (PPA) with Microsoft, which, if it meets its ambitious 2027 restart target, could be the first full restart of a retired U.S. nuclear plant.

Analysts estimate Microsoft is paying a premium for certainty of supply. Amazon has also funded a $500 million raise for X-energy and set a target to deploy more than 5GW of small modular reactors across the US by 2039, pairing equity stakes with future PPAs. Both deals are designed to bankroll reliable low-carbon supplies for their AI campuses.

Supply is changing

The shift isn’t just in what’s built, but in how it’s supplied. Rather than annual renewable matching, buyers are signing hour-by-hour, location-specific carbon-free supply deals and paying for storage to firm it. Data centers are being placed in areas with low-carbon energy supplies and faster planning processes. But the hard limit remains the wires: even where generation exists, securing a new high-voltage tie-in can take years, so AI campuses are planning staged rollouts, with temporary solutions to build out capacity until grid connections are complete.

AI firms such as xAI are spending big to install energy supplies as quickly as they can. However, Izzy Woolgar, director of external affairs at the Centre for Net Zero, said in an interview with Tom’s Hardware Premium that the purported surfeit in supply might not be as great as first thought. “We know data centers are driving up electricity demand today,” she said. “This creates two challenges: first, to accurately forecast the energy required to power those centers, and second, to meet that demand as quickly and cleanly as possible in the context of an already strained grid.”

Quick and dirty

Wind Turbines offshore

(Image credit: Getty Images / Daniel Leal)

Choosing clean energy options is tricky, partly because alternative power sources aren’t always available, and partly because demand must be met immediately to support AI’s massive needs. Developers can sign record renewable deals, but connecting new supply remains the hard limit. In the United States, the median time from requesting a grid interconnection to switching on stretched to nearly five years for projects completed in 2022–23 – up from three years in 2015 and less than two years in 2008.

That delay is system-wide and rising across regions. “The quickest way of getting around that is to install energy sources at the data center itself,” said Preist. “Ideally, those would be renewable, but often the quickest way of getting it in places is mobile gas generation.” It means we’re seeing increased demand for all types of energy, including fossil fuels.

Those constraints on the grid, and on supply more generally, are why tech companies are investing billions in small modular reactors and other sources. “We are rapidly building out our infrastructure as part of the energy transition, but confronted by a congested grid, data center operators are exploring bypassing delays and seeking out direct connections to generators,” Woolgar said. She pointed out that companies have explored hook-ups to gas-fired power plants, and many are investing heavily in unproven technologies such as small modular reactors (SMRs), already backed and showcased by the likes of Amazon. But SMRs still need to clear regulatory hurdles, and many won’t be feasible until the end of the decade.

And that’s the issue, said Woolgar. “The long development times of these major projects will not meet the immediate demands of AI, and we are overlooking proven and reliable clean technologies like solar, wind, and battery storage that can be deployed in as little as two to three years,” she said. Beyond small modular reactors, other alternatives could also be pursued. One is renewable microgrids, which the Centre for Net Zero’s modelling indicates could cost 43% less to run than nuclear alternatives. That could help get power where it is needed in the short term, Woolgar explained.

Environmental considerations are also contributing to the energy crunch. Water for cooling is becoming a critical issue in many regions. In areas of high power density, operators are looking to direct liquid and immersion cooling to slash water demand and reduce reliance on fan-based cooling.

Build it and they will come

Open AI and Nvidia logos

(Image credit: OpenAI / Nvidia)

Given all the debate around whether we’re currently in an AI bubble, the level of energy supply expansion has some worried. “Predicted future surges in demand from AI could well be overplayed, as current projections often fail to account for emerging or future breakthroughs,” said Woolgar.

Preist is also worried about whether we’re about to spend vast sums to build energy supplies that won’t be needed if the visions of the future powered by AI turn out to be more science fiction than fact. “There is a risk that energy companies overprovision for a bubble which then bursts, and then you're stuck,” he said. “You’re stuck with a load of infrastructure, gas power stations, et cetera, which the residential people need to effectively pay for through overpriced energy.”

Woolgar believes the current plans for building out energy infrastructure to meet future AI demand don’t account for improvements in technology efficiency. “The upfront energy costs of training models tend to appear quickly, while the downstream benefits and advancements are less immediate and certain,” she said. For that reason, the huge buildout may not need to be quite as large as expected. “The requirements to train models today won’t remain constant,” said Woolgar, pointing out that DeepSeek’s R1 model, launched earlier this year, was trained using just 10% of the computational resources of Meta’s Llama, “and chips will inevitably get more efficient when it comes to energy consumption.”

Working smarter, not harder?

Submer

(Image credit: Submer)

There’s also the idea that, as well as becoming more efficient, infrastructure providers can get smarter about recycling waste outputs, such as heat, from the massive data centers that will have to be built to meet society’s insatiable AI demand. “There is significant heat, and ideally it would be put into district heating systems,” said Preist, pointing to the design of Isambard AI – a small data center by future scaling standards, but quite large by traditional ones. It has been “designed to allow the heat to be reused in a district heating system”, even though the infrastructure to connect it to one hasn’t yet been built.

Some work is already going on in this area, largely as a collateral benefit of advanced cooling techniques such as immersion cooling, adopted to reduce the need for energy-intensive traditional cooling in data centers.

The power density of next-generation GPUs like Nvidia’s Rubin and Rubin Ultra – projected to require between 1.8 kW and 3.6 kW per module, with entire racks approaching 600 kW – means air cooling isn’t a practical solution for these deployments. Instead, immersion cooling, where servers are submerged in dielectric fluids, has been proposed as an alternative. Heat recovered from immersion cooling could be redirected to nearby residential heating grids or industrial facilities, experts reckon.
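Nearly all of a rack’s electrical input ultimately emerges as heat, which is what makes reuse attractive. A hypothetical sketch of the scale involved – the 600 kW rack figure is from above, but the recovery fraction and per-home heat demand are illustrative round numbers, not sourced:

```python
# Illustrative district-heating sketch for one dense AI rack.
# RACK_POWER_KW comes from the article; the other two values are
# hypothetical round numbers chosen for the example.
RACK_POWER_KW = 600  # next-generation rack, per the article
HEAT_RECOVERY = 0.9  # assumed fraction recoverable via liquid cooling
HOME_DEMAND_KW = 10  # assumed average per-home heating demand

recoverable_kw = RACK_POWER_KW * HEAT_RECOVERY
homes_heated = recoverable_kw / HOME_DEMAND_KW
print(f"~{recoverable_kw:.0f} kW recoverable, "
      f"enough for roughly {homes_heated:.0f} homes")
```

Under those assumptions, a single rack could supply heat on the order of dozens of homes – which is why district-heating hook-ups like the one designed into Isambard AI keep coming up.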

It all shows how quickly the energy landscape is changing as AI first strains, then reshapes, supply. And with multiple companies racing to gain an edge, build up their customer bases, and keep a foothold in the market, demand for more varied sources of supply looks set to keep rising in the years to come.

Chris Stokel-Walker
Freelance Contributor

Chris Stokel-Walker is a Tom's Hardware contributor who focuses on the tech sector and its impact on our daily lives—online and offline. He is the author of How AI Ate the World, published in 2024, as well as TikTok Boom, YouTubers, and The History of the Internet in Byte-Sized Chunks.