GPT-5 power consumption could be as much as eight times higher than GPT-4's — research institute estimates a medium-sized GPT-5 response can consume up to 40 watt-hours of electricity
But the figures rely on hardware and usage assumptions that may be inaccurate.

OpenAI's GPT-5 AI model is significantly more capable than its predecessors, but it's also dramatically — about 8.6 times — more power-hungry than the previous GPT-4, according to estimates based on tests conducted by the University of Rhode Island's AI lab, reports The Guardian. OpenAI does not officially disclose the energy use of its latest model, which is part of what raises concerns about its overall energy footprint. That said, the published findings are just an estimate, and one that rests on a stack of further assumptions.
The University of Rhode Island's AI lab estimates that GPT-5 averages just over 18 Wh per query, so running all of ChatGPT's reported 2.5 billion daily requests through the model could push energy usage as high as 45 GWh per day.
A daily energy use of 45 GWh is enormous. A typical modern nuclear reactor produces between 1 and 1.6 GW of electrical output, so data centers running OpenAI's GPT-5 at 18 Wh per query could require the output of two to three nuclear reactors, an amount that could be enough to power a small country.
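The scaling from per-query watt-hours to grid-level numbers is straightforward arithmetic. A minimal sketch using only the figures quoted above (the lab's 18.35 Wh average, ChatGPT's reported 2.5 billion daily requests, and the low-end 1 GW reactor figure):

```python
# Back-of-the-envelope check of the article's figures. All inputs are the
# estimates quoted in the piece, not measured values.
WH_PER_QUERY = 18.35      # URI lab's average per-query estimate for GPT-5
QUERIES_PER_DAY = 2.5e9   # ChatGPT's reported daily request volume
REACTOR_GW = 1.0          # low end of a modern reactor's output

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9  # Wh -> GWh per day
avg_gw = daily_gwh / 24                           # continuous average draw
reactors = avg_gw / REACTOR_GW                    # reactors needed at low end

print(f"{daily_gwh:.1f} GWh/day, {avg_gw:.2f} GW average, "
      f"~{reactors:.1f} low-end reactors")
# -> 45.9 GWh/day, 1.91 GW average, ~1.9 low-end reactors
```

Note the rounded 18 Wh figure in the text yields exactly 45 GWh/day; the lab's 18.35 Wh average lands slightly higher. Either way the continuous draw is just under 2 GW, which is where the "two to three reactors" comparison comes from once the higher-output reactors are considered.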
The university's report estimates that producing a medium-length, 1,000-token GPT-5 response can consume up to 40 watt-hours (Wh) of electricity, with an average of 18.35 Wh, up from 2.12 Wh for GPT-4. That was higher than every other model tested, except OpenAI's o3 (25.35 Wh) and DeepSeek's R1 (20.90 Wh).
But it's important to point out that the lab's test methodology is far from ideal.
The team measured GPT-5’s power consumption by combining two key factors: how long the model took to respond to a given request, and the estimated average power draw of the hardware running it.
Since OpenAI has not revealed exact deployment details, the researchers had to guess at the hardware setup. They believe that OpenAI's latest AI model is likely deployed on Nvidia DGX H100 or DGX H200 systems hosted on Microsoft Azure. By multiplying the response time for a query by the hardware's estimated power draw, they arrived at watt-hour figures for different outputs, such as the 1,000-token benchmark they used for comparison.
The hardware estimation also accounted for non-GPU components (e.g., CPUs, memory, storage, cooling) and applied Azure-specific environmental multipliers such as Power Usage Effectiveness (PUE), Water Usage Effectiveness (WUE), and Carbon Intensity Factor (CIF). Of course, if Azure and OpenAI are actually running Nvidia's Blackwell hardware (which is up to four times faster), the estimate would be off accordingly.
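The methodology described above boils down to energy = response time × estimated power draw, scaled by a facility overhead factor. A minimal sketch of that calculation — the specific numbers below (an 18-second response, ~3.3 kW of allocated hardware, a 1.12 PUE) are illustrative assumptions for demonstration, not URI's actual inputs:

```python
# Sketch of the lab's described methodology: response time multiplied by
# estimated hardware power draw, with a datacenter overhead (PUE) applied.
# The example inputs are assumptions, not the researchers' real figures.
def query_energy_wh(response_s: float, power_w: float, pue: float = 1.12) -> float:
    """Estimated energy per query in watt-hours, including facility overhead."""
    return response_s / 3600 * power_w * pue  # seconds -> hours, then Wh

# e.g. an 18-second response on hardware drawing ~3.3 kW
print(round(query_energy_wh(18, 3300), 2))  # -> 18.48
```

The PUE multiplier is what folds cooling and other facility overhead into the per-query figure; WUE and CIF work the same way for water use and carbon, multiplying the same base energy number rather than the response time.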
GPT-5 uses a mixture-of-experts design, so not all parameters are active for every request, which reduces power consumption for some short (or stupid) queries. But it also has a reasoning mode with extended processing time, which can raise power draw by five to ten times for the same answer (i.e., well beyond 40 Wh per query), according to researchers like Shaolei Ren, cited by The Guardian.
While the Rhode Island AI lab's estimates give us an idea of how GPT-5's total power consumption compares to previous-generation models, the absolute numbers may not be accurate at all. What we do know is that AI data centers are already driving up power bills in the U.S., and with the continued proliferation of the technology, all signs point to the power crunch getting worse.

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.
-
razor512 Have they released any info on how large their AI model is? For example, can they get much of it to run locally on a user's PC, and have it automatically send work to their servers when a more sophisticated set of models and extensions is needed? It may help offset some of the power demands and server resources needed per user. -
hannibal
razor512 said: Have they released any info on how large their AI model is? For example, can they get much of it to run locally on a user's PC, and have it automatically send work to their servers when a more sophisticated set of models and extensions are needed? It may help offset some power demands and server resources needed per user.
That is the reason they try to move as much AI workload to local machines as possible. But it seems the local one is much, much slower than the online version. At this moment the big companies pay the electricity bills, and collect user data to train their AIs and also just for the user data itself, but one day there will be a payment waiting, so developing local operation is a high priority even though it is rather slow... -
JarredWaltonGPU This is the problem with university researchers and estimates. Is OpenAI actually using 45 GWh of electricity, purely for GPT-5 queries? It should be possible to determine approximately how many data centers and such OpenAI uses, but without insider knowledge, it's impossible to say whether those GPUs and servers are being used for:
Running GPT-5
Running GPT-4 and older models
Running non-GPT models
Training new models
Running other infrastructure
Basically anything else

45 GWh would mean that OpenAI is consistently using the equivalent of 1.875 GW of power, all day long, every day. That's easily the equivalent of a couple dozen large 75~100 megawatt data centers doing nothing other than running OpenAI. That figure might be viable, but even if OpenAI is using that much power, it's a safe bet that a large chunk of it isn't currently being used to serve up GPT-5 responses.
Realistically? I'd guess no more than 10~20 percent is for GPT-5 inference. You could also argue that GPT-5 used probably thousands (tens of thousands) of GPUs for a couple of months to train the model. Again, that's just estimating, but obviously a ton of electricity gets consumed in the process. However, if that's factored into estimates at all, it would also mean the cost per query goes down over time, as the training power gets diffused across billions (trillions?) of queries.
Anyway, a quick estimate here. GPT-5 can respond to a typical request in about 17 seconds. That's based on me running a query just now asking it to write a short, funny story about training GPT-5. So, about 1,000 tokens took 17 seconds. To get up to 18 Wh per query with that sort of napkin math, the hardware serving my query would have to be consuming 3,800 watts of power. Getting up to 3,800 watts isn't hard. But Nvidia's GPUs are designed to run multiple concurrent workloads at the same time, so if a single H100-based server (eight H100 GPUs) is running my query, my guess is it's probably also running a dozen other queries alongside it.
But I'll admit, I could be wrong. I'm not researching OpenAI power use or anything else. I just suspect that a lot of these estimates are worst-case rather than real-world numbers. Both GPT-4 and GPT-5 probably average a lot less power than these estimates suggest, but proportionately GPT-5 likely does use 8X more. -
jonaswox
JarredWaltonGPU said: This is the problem with university researchers and estimates. Is OpenAI actually using 45 GWh of electricity, purely for GPT-5 queries?
It's like, five years ago every company was spending half its marketing budget on being perceived as "ESG friendly" because it attracted stock buyers. Now we are drinking through paper straws, everyone has forgotten ESG, and the private planes are literally commuting the rich around like it's nobody's business, dumping millions of litres of jet fuel into the air. AI is soon consuming more power than all the households,
and somehow your analysis ends up in the realm of... "nothing to see here"
The modern West is a circus waiting to break down. We have speedrun into oligarchy, and half the people like you go "yo man, stop being so conspiracy boizzzz, everything is just fine!!!!!!". Nobody can afford a home :D People are literally buying each other out of the market, with money they borrow to be able to afford a home in the first place. It's the most systematic abuse of every single human who can't simply provide security for the loans. We are told "it's so that we can afford", but really we are just enabling and accelerating the very dynamic pushing normal folks out of owning a home in the first place.
I find it rather depressing, the number of sloths making any change completely unfeasible, because conformity has gotten so over the top that half the population will sloth around until their comfort and conformity risk being compromised. Machiavelli wrote a book about this 400 years ago, arguing that black/white statements are so much more comfortable to voters, promising some sort of "order" in the chaos. As such, someone with grandiose claims will always have a vast advantage over anyone being honest about the complexity.
Today the biggest enemy of society is not the companies or the politicians. They are just doing what everyone has always done: looking out for themselves. The problem is the vast amount of the Western population who are so comfortable they will not realize the problem until it has gotten really bad. It is similar to how a privileged addict falls harder, because they get so much farther away from reality before feeling the consequences and thus, hopefully, being able to change. If you don't feel any consequence until your house is actually on fire, then it is too late to do much. -
bigdragon The takeaway from the article for me is that we're using the output of nuclear power plants to deliver incorrect, hallucinated results. I'm sure AI will be nice when it's refined and useful, but right now it appears to be sucking up such ridiculous resources that the return on investment promises won't come true. This power study amps up the risk AI poses not just to investors but also to the entire world. The AI bubble needs to pop.
I think someone wrote a book about this that got turned into a movie. A civilization poured all their resources into building the best computer and it returned 42 as a result. -
kealii123 "8 times the power consumption"
That's not so bad, because the performance is 8x higher, right boys? 😭 -
SkyBill40 No wonder they're looking to restart Three Mile Island and build new nuclear plants. -
dimar Might as well build a space station and let it orbit the sun for unlimited power. Let a Musk company service it. -
JRStern 40 watt-hours for a 30-second, 1,000-word response? No way. Unless they amortize the power used to train the model, and build the building, and resettle all the Native Americans who used to live on the sacred grounds where they built the power plant to South Carolina, plus the energy used to build and operate the casino there for the last ten years.
Oh, btw, the free ChatGPT apparently rolled back GPT-5, brought back GPT-4o, and when you exceed the quota it now locks you out for 24 hours instead of dropping you to a weaker model for 4 hours. SMH