AI datacenters in the US aren't 'running on coal' — but the dirty fuel has found favor for covering demand spikes as natural gas prices rise

(Image credit: Google)

U.S. datacenters are increasingly relying on coal-fired electricity as expanding AI infrastructure drives up power needs and makes natural gas less affordable, reports The Register. However, in states with nuclear power plants, coal, shale, and oil are used to smooth out peaks and troughs in demand, not to directly meet the needs of prominent tech giants. Let's try to figure out how that works.

Training

A nuclear power plant is almost perfectly suited to supplying bulk power to an AI training datacenter, because its steady, high-capacity output matches the data center's continuous and predictable demand profile. Well, almost: during a training run, checkpoint intervals (when model state is saved to storage) cause a temporary drop in GPU utilization while data is written and synchronized, but these fluctuations are minor compared with the total load.
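The checkpoint cadence described above can be sketched with a toy model. This is a minimal illustration, not code from any real training stack: the function names, step counts, and timings are all hypothetical, chosen only to show why periodic checkpoint writes barely dent overall utilization.

```python
# Toy model of checkpointing during a training run: every `interval`
# steps the run pauses compute to persist model state, so GPU
# utilization briefly dips while the write completes.
# All names and numbers here are illustrative assumptions.

def should_checkpoint(step: int, interval: int) -> bool:
    """Return True on steps where model state is saved to storage."""
    return step > 0 and step % interval == 0

def utilization_over_run(total_steps: int, interval: int,
                         step_time_s: float, write_time_s: float) -> float:
    """Fraction of wall-clock time spent computing rather than writing."""
    writes = sum(should_checkpoint(s, interval) for s in range(total_steps))
    compute = total_steps * step_time_s
    return compute / (compute + writes * write_time_s)

# A 3,600-step run that checkpoints every 600 steps, with each write
# stalling compute for ~10 s, still spends over 98% of its time
# computing -- the "minor fluctuation" relative to total load.
util = utilization_over_run(total_steps=3600, interval=600,
                            step_time_s=1.0, write_time_s=10.0)
```

The point of the sketch: checkpoint dips are frequent but short, so from the grid's perspective a training cluster still looks like a near-constant load.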

Meanwhile, because a nuclear plant's generation does not vary quickly, grid operators must compensate for sudden drops or spikes in usage, which is where coal and renewables come into play. Without them, power would still flow steadily through high-voltage lines to transformers, but those transformers are designed to maintain stable voltage for homes and industry, not to absorb massive load swings.

In essence, nuclear energy provides the baseload that keeps multi-week training runs uninterrupted, while coal and renewables pick up the slack for other consumers when demand swings.

Inference

Things are different for inference, the stage where a trained model is put to use, generating outputs from new inputs rather than learning from data. AI inference consumes power in short, intense bursts as the system processes batches of forward passes at unpredictable intervals.

Each request — whether it is generating text, recognizing an image, or serving a recommendation — triggers thousands of matrix multiplications across AI accelerators, drawing a surge of electricity for a fraction of a second. When millions of requests arrive simultaneously, the combined load can climb sharply, stressing both the local power distribution and the cooling systems. Inference creates spiky, rapidly fluctuating power patterns that must be met with responsive grid supply or on-site energy storage. The power grid has to be resilient enough to support all types of customers, which is again where renewables and coal or shale come in.
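The spiky load pattern described above can be illustrated with a small simulation. This is a hedged sketch under made-up assumptions: the request rates and per-request wattages are invented for illustration, not measured figures for any real datacenter.

```python
# Toy model of bursty inference load: requests arrive in uneven
# bursts, and each one briefly adds to the cluster's power draw.
# All parameters are illustrative assumptions.
import random

def simulate_power(seconds: int, mean_rps: float,
                   watts_per_request: float, idle_watts: float,
                   seed: int = 0) -> list[float]:
    """Per-second power draw for a cluster serving bursty traffic."""
    rng = random.Random(seed)
    trace = []
    for _ in range(seconds):
        # Exponentially distributed per-second request counts make the
        # trace bursty: quiet seconds mixed in with sharp spikes.
        requests = rng.expovariate(1.0 / mean_rps)
        trace.append(idle_watts + requests * watts_per_request)
    return trace

trace = simulate_power(seconds=600, mean_rps=1000.0,
                       watts_per_request=5.0, idle_watts=200_000.0)
peak, avg = max(trace), sum(trace) / len(trace)
# The gap between peak and average draw is what flexible grid supply
# or on-site storage has to cover.
```

Even in this crude model, peak draw sits well above the average, which is the gap that fast-responding generation or batteries must fill, since a nuclear plant's output cannot ramp on that timescale.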

As AI datacenters drive up U.S. electricity demand, coal-fired generation has surged nearly 20%, according to Jefferies data cited by The Register, though coal's role is mostly to stabilize the grid rather than to power AI clusters directly.

Summary

Nuclear plants provide the steady output ideal for multi-week AI training runs, while coal and renewables help balance fluctuations caused by checkpoints and other loads. During inference, when millions of user requests arrive unpredictably, the grid relies on flexible sources — including fossil fuels — to handle the short, sharp bursts of power demand that even advanced datacenters can’t smooth alone.


Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.