Graphics Card Power Consumption Tested: Which GPUs Slurp the Most Juice?

(Image credit: Shutterstock)

How much power do the best graphics cards use? It's an important question, and while the performance we show in our GPU hierarchy is useful, one of the true measures of a GPU is how efficient it is. To determine GPU power efficiency, we need to know both performance and power use. Measuring performance is relatively easy, but measuring power can be complex. We're here to press the reset button on GPU power measurements and do things the right way — plus it's good preparation for AMD Big Navi, Nvidia Ampere, and Intel Xe Graphics.

There are various ways to determine power use, with varying levels of difficulty and accuracy. The easiest approach is via software like GPU-Z, which will tell you what the hardware reports. Alternatively, you can measure power at the outlet using something like a Kill-A-Watt power meter, but that only captures total system power, including PSU inefficiencies. The best and most accurate means of measuring the power use of a graphics card is to measure power draw in between the power supply (PSU) and the card, but it requires a lot more work.
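
To make that distinction concrete, here's a minimal sketch of the software approach in Python, using Nvidia's NVML bindings (the pynvml package). It reads the same driver-reported figure GPU-Z displays, so it inherits the same caveats we discuss below:

```python
# Minimal sketch: query driver-reported GPU power via NVML (pip install pynvml).
# This is the "easy" software approach -- you get whatever the driver reports.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetPowerUsage)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)        # first GPU in the system
milliwatts = nvmlDeviceGetPowerUsage(handle)  # driver-reported draw, in mW
print(f"Driver-reported power: {milliwatts / 1000:.1f} W")
nvmlShutdown()
```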

We've been using GPU-Z for the past six months, but it has some clear inaccuracies, so it's time to go back to doing things the right way. And by "right way," we mean measuring in-line power consumption using hardware devices. Specifically, we're using Powenetics software in combination with various monitors from TinkerForge. You can read our Powenetics project overview for additional details.
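
Powenetics handles all of the logging for us, but to give a sense of what the TinkerForge hardware exposes, here's a rough sketch using TinkerForge's official Python bindings and one of its Voltage/Current bricklets. The UID is a placeholder for whichever bricklet you've wired into a rail, and the real Powenetics software polls many of these channels at once:

```python
# Sketch: read one TinkerForge Voltage/Current bricklet (brickd must be running).
from tinkerforge.ip_connection import IPConnection
from tinkerforge.bricklet_voltage_current_v2 import BrickletVoltageCurrentV2

HOST, PORT = "localhost", 4223
UID = "XYZ"  # placeholder UID for a bricklet spliced into a 12V line

ipcon = IPConnection()
vc = BrickletVoltageCurrentV2(UID, ipcon)
ipcon.connect(HOST, PORT)

print(f"Voltage: {vc.get_voltage() / 1000:.2f} V")  # API reports millivolts
print(f"Current: {vc.get_current() / 1000:.2f} A")  # ... milliamps
print(f"Power:   {vc.get_power() / 1000:.2f} W")    # ... milliwatts

ipcon.disconnect()
```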

I've spent the past couple of weeks soldering together the necessary bits and pieces, followed by testing. Let me just say, soldering is not for the faint of heart. I managed not to burn myself (barely), and everything works, but it was way more difficult than building a PC. Go ahead, flame me if you think soldering is fun — I'd rather go to the dentist. It would certainly take a lot less time. Also, my test bed now has an alarming number of wires coming out of it. But I digress.

The main problem with GPU-Z is that it's prone to 'cheating' of sorts. Nvidia's GPU-Z power metrics are reasonably accurate, particularly on more recent Turing GPUs. However, AMD GPUs only report GPU power use — the rest of the graphics card, including VRAM, VRMs, etc. aren't part of the equation. How big of a difference does that make? According to our renewed Powenetics testing, many of AMD's Navi GPUs have graphics card power consumption that's 25-35W higher than just the GPU alone … and the less said about Polaris and Vega, the better. (But don't worry, we have the charts! Oh boy, do we have some charts.)

Since we have a backlog of recent graphics card reviews that used a different method of reading power, we're taking this opportunity to set the record straight. How much power does an AMD Radeon RX 5600 XT or RX 5500 XT really use — and is it more or less than the competing Nvidia parts? Now we can definitively answer that question. We're also testing previous generation hardware, more as a point of reference, so we'll have GTX 10-series and AMD Polaris and Vega as well (and I may add a few earlier cards when I get some time). 

[Gallery: Powenetics test hardware, 2 images (Image credit: Tom's Hardware)]

Test Setup

We're using our standard graphics card testbed for these power measurements, and it's what we'll use on future graphics card reviews. It consists of an MSI MEG Z390 Ace motherboard, Intel Core i9-9900K CPU, NZXT Z73 cooler, 32GB Corsair DDR4-3200 RAM, a fast M.2 SSD, and the other various bits and pieces you see to the right. This is an open test bed, because the Powenetics equipment essentially requires one.

There's a PCIe x16 riser card (which is where the soldering came into play) that slots into the motherboard, and then the graphics cards slot into that. This is how we accurately capture actual PCIe slot power draw, from both the 12V and 3.3V rails. There are also 12V kits measuring power draw for each of the PCIe Graphics (PEG) power connectors — we cut the PEG power harnesses in half and run the cables through the power blocks. RIP, PSU cable.
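
From there, total graphics card power is simple arithmetic: slot 12V plus slot 3.3V plus each PEG connector. Here's an illustrative sketch — the sample values and channel names are made up for the example, not official Powenetics output — that also flags anything beyond the PCIe spec's 66W slot 12V limit (75W total slot power) and 150W per 8-pin connector:

```python
# Hypothetical per-rail readings in watts for a single sample.
sample = {
    "slot_12v": 55.2,   # PCIe slot, 12V rail (spec limit: 66W)
    "slot_3v3": 9.8,    # PCIe slot, 3.3V rail
    "peg1_12v": 140.5,  # first 8-pin PEG connector (spec limit: 150W)
    "peg2_12v": 0.0,    # second connector, unused on this card
}

board_power = sum(sample.values())
print(f"Total graphics card power: {board_power:.1f} W")

if sample["slot_12v"] > 66:
    print("Warning: slot 12V draw exceeds PCIe spec")
if sample["peg1_12v"] > 150:
    print("Warning: 8-pin PEG draw exceeds spec")
```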

Powenetics equipment in hand, I set about retesting all of the current and previous generation GPUs I could get my hands on. Mostly I tested reference cards, at least for higher-end AMD and Nvidia GPUs. However, reference models don't always exist for budget and mid-range GPUs. I've included a few additional GPUs as well as points of reference, and of course all future GPUs will be tested using the same approach. Here's the list of what we've tested:

From AMD, I have reference Radeon RX 5700 XT and 5700 cards, along with the Radeon VII, Vega 64 and Vega 56, but AMD doesn't do 'reference' models on most other GPUs. I've also included a couple of non-reference cards for comparison, and as we'll see, there's some variation between different models of the same GPU. We'll include third party cards in our results in future reviews as well, so this is more the baseline measurement for current GPUs.

For Nvidia, everything from the RTX 2060 and above is a reference Founders Edition card — which includes the 90 MHz overclock and slightly higher TDP on the non-Super models — while the other Turing cards are all AIB partner cards. A few run at reference clocks, and others come with modest factory overclocks, which is basically the same as the non-reference AMD models. Previous generation GTX 10-series cards are also Founders Edition models, except for the 1060 3GB and lower that use partner cards.

Update: I've added several older GPUs, which is basically everything I have available for testing. The legacy cards are Nvidia's GTX 980 Ti, 980, 970, and 780, along with the AMD R9 Fury X and R9 390.

Note that all of the cards are running 'factory stock,' meaning no manual overclocking or undervolting is involved. Yes, the various cards might run better with some tuning and tweaking, but this is the way the cards will behave if you just pull them out of their box and install them in your PC.

Our actual testing remains the same as recent reviews. We loop the Metro Exodus benchmark five times at 1440p ultra, and we also run FurMark for ten minutes. These are both demanding tests, and FurMark can push some GPUs beyond their normal limits, though the latest models from AMD and Nvidia tend to cope with it just fine. We're only focusing on power draw for this article; we still use GPU-Z to gather the temperature, fan speed, and GPU clock data.

(Image credit: Tom's Hardware)

GPU Power Use While Gaming: Metro Exodus 

Due to the number of cards being tested, we have multiple charts. The overall power chart will show average power consumption during the approximately 10 minute long test — it's actually 15 seconds shy of 10 minutes, if we're being precise. This chart does not include the time in between test runs, where power use dips for about 9 seconds, so it's a realistic view of the sort of power use you'll see when playing a game for hours on end.
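
For the curious, the averaging logic amounts to this — a sketch assuming one sample per second and an arbitrary 75W cutoff (not our actual threshold) to drop the between-run dips:

```python
# Sketch: average power over a benchmark trace, ignoring idle dips between runs.
def average_load_power(samples_watts, idle_cutoff=75.0):
    loaded = [w for w in samples_watts if w >= idle_cutoff]
    return sum(loaded) / len(loaded)

# Toy trace: two runs at 210-215W with a 9-second dip between them.
trace = [210.0] * 110 + [45.0] * 9 + [215.0] * 110
print(f"Average gaming power: {average_load_power(trace):.1f} W")  # 212.5 W
```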

Besides the bar chart, we have separate line charts roughly segregated into budget, midrange and high-end categories, each with up to 12 GPUs. These show real-time power draw over the course of the benchmark using data from Powenetics. The 12 GPUs per chart limit is to try and keep the charts mostly legible, and the division of what GPU goes on which chart is somewhat arbitrary. We've tried to group GPUs in a sensible fashion, though we couldn't fit every GPU on the ideal chart. (There's no clean break between 'budget' and 'mainstream,' plus previous generation GPUs mean we have more than 12 GPUs in some categories.)

[Gallery: Metro Exodus power consumption charts, 5 images (Image credit: Tom's Hardware / Future)]

In the overall standings, where less power is better, it's pretty easy to see how far AMD fell behind Nvidia prior to the Navi generation GPUs. The various Vega and Polaris AMD cards use significantly more power than their Nvidia counterparts. Even now, with 7nm Navi GPUs going up against 12nm GPUs, AMD is only roughly equal to Nvidia. How things will change with upcoming Nvidia Ampere and AMD Big Navi launches is something we're definitely looking forward to seeing, what with AMD's claims of 50% improvements in performance per watt with RDNA 2.

Digging into the line charts, in the first grouping of slower / lower power GPUs, Nvidia's GTX 1650 series cards come in below the competing AMD RX 5500 XT models, though performance is certainly still a factor when choosing between the cards. It's interesting that the 4GB and 8GB 5500 XT are basically identical in power use — more VRAM doesn't inherently mean substantially higher power use. Meanwhile, the RX 590 and RX 570 4GB are a big step up in power consumption, while the RX 560 4GB is the only card in our test suite that doesn't include a 6-pin or 8-pin PEG power connector and thus remains below 75W.

The second line chart highlights the big jump AMD saw with its Navi GPUs. The three GTX 1660 models (vanilla, Super and Ti) aren't on the same chart as the RX 5500 XT models, but they're pretty much tied for power use. Stepping up in performance, the RX 5700 and RX 5600 XT are right in the thick of things compared to Turing and the Vega 56. And speaking of Vega 56, the PowerColor model comes with a modest overclock and a much larger cooler that keeps temperatures in check, which means higher sustained clocks and higher power use, along with better framerates.

Last, the highest performance cards can draw a lot of power, with Vega 64 actually surpassing the reference model Radeon VII. The RX 5700 XT meanwhile delivers nearly the same performance as the VII while using substantially less power. It's also interesting to see that the previous Nvidia Pascal cards (GTX 1080 and GTX 1070 Ti) still use less power than their 'replacement' Turing models, and slightly more power than the 2060 Super. That's expected, since both architectures use TSMC's similar 12nm / 16nm process technology. Moving to 7nm ought to provide a substantial boost in power efficiency and performance for Nvidia's next generation GPUs.

(Image credit: Tom's Hardware)

GPU Power with FurMark 

Let's put gaming behind us and move on to FurMark, which, as we've frequently pointed out, is basically a worst-case scenario for power use. Some of the GPUs tend to be more aggressive about throttling with FurMark, while others go hog wild and dramatically exceed official TDPs. Few if any games will tax a GPU quite like FurMark, though things like cryptocurrency mining can come close. The chart setup is the same as above, with a high-level overview followed by three detailed line charts.

[Gallery: FurMark power consumption charts, 5 images (Image credit: Tom's Hardware / Future)]

In the overall chart, most of AMD's GPUs move toward the bottom — and lower is better here, so that's not good. Radeon VII power use jumps 30W compared to the Metro Exodus testing, and the Vega and Polaris GPUs see a big spike as well. The RX 570 4GB (an MSI Gaming X model) actually exceeds the official power spec for an 8-pin PEG connector with FurMark, pulling nearly 180W. That's thankfully the only GPU to go above spec, for the PEG connector(s) or the PCIe slot, but it does illustrate just how bad things can get in a worst-case workload.

The remaining GPUs, meaning AMD's latest Navi parts and Nvidia's Turing and Pascal chips, mostly don't change power use too much. The various Nvidia RTX cards are all within about 5W of the Metro Exodus numbers, and the same applies to Pascal. There are only a few exceptions: The GTX 1660 power use under FurMark jumps by 15W, actually surpassing the power use of the 1660 Ti, and the GTX 1060 3GB, 1050 Ti, and 1050 all see larger jumps as well.

AMD's Navi GPUs split the difference between Turing and Vega, but the RX 5500 XT cards are the worst of the bunch, jumping 45W. The 5600 XT shows a smaller 20W delta, the RX 5700 only changes by 10W, and the RX 5700 XT is only a 3W difference.

It's interesting that the budget chips from both companies seem to get hit a lot harder by FurMark than by games, and perhaps it's just a case of the budget models not being designed to detect and throttle FurMark. We've checked other settings on the budget GPUs in Metro, though, and can't hit the same power levels as FurMark. Part of the problem may simply be that demanding games push beyond the GPUs' capabilities whereas synthetic loads like FurMark are able to max out power draw.

One thing we're not showing is the GPU-Z power figures, though we have them. While Nvidia's Pascal GPUs have real power use typically within 5W of the GPU-Z number, and Turing GPUs are practically bang on, AMD's Navi and Polaris GPUs have total board power use that's 25-35W higher than the GPU-only power use shown in GPU-Z. And Vega? There's up to an 80W delta between GPU-Z and Powenetics with the Vega 64. Hopefully AMD reconsiders how it reports power in future GPUs, as it would be far more helpful to report board power rather than only GPU power.
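
To put that gap in concrete terms, here's the arithmetic for the Vega 64 using our Metro Exodus numbers (296.7W measured by Powenetics versus 219.2W reported by GPU-Z):

```python
# Worked example: the power GPU-Z never sees on Vega 64.
powenetics_board_w = 296.7  # measured at the slot and PEG connectors
gpuz_gpu_only_w = 219.2     # software-reported, GPU core only

delta = powenetics_board_w - gpuz_gpu_only_w
print(f"Unreported power (HBM2, VRMs, fans, etc.): {delta:.1f} W")  # 77.5 W
```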

There's not much else to say about the line charts, other than noting that power use is higher and this time most of the GPUs are hitting close to their rated power and then staying there. There are no major fluctuations, except on the two Vega 64 cards. It's also interesting how the vanilla GTX 1660 with GDDR5 uses more power than the Super and Ti, showing one of the other benefits of GDDR6 besides bandwidth.

Analyzing GPU Power Use and Efficiency 

It's worth noting that we're not showing or discussing GPU clocks, fan speeds or GPU temperatures in this article. Power, performance, temperature and fan speed are all interrelated, so a higher fan speed can drop temperatures and allow for higher performance and power consumption. Alternatively, a card can drop GPU clocks in order to reduce power consumption and temperature. We dig into this in our individual GPU and graphics card reviews, but we just wanted to focus on resetting the power charts for now. If you see discrepancies between previous and future GPU reviews, this is why.

Looking forward, the switch back to in-line power measurements also prepares us for the upcoming launches of AMD RDNA 2, Nvidia Ampere and Intel Xe Graphics cards. Hopefully AMD and Nvidia improve even further on efficiency, and we're ready to test when the cards arrive. Intel meanwhile is something of a wild card. Current Intel integrated graphics can be very power efficient, but they're also pathetically slow. What will happen when Intel attempts to make a dedicated GPU, and will Intel report accurate power consumption to utilities like GPU-Z? Of course, with Powenetics we won't have to worry about that.

We can now properly measure the real graphics card power use and not be left to the whims of the various companies when it comes to power information. It's not that power is the most important metric when looking at graphics cards, but if other aspects like performance, features and price are the same, getting the card that uses less power is a good idea. Now bring on the new GPUs!

Here's the final high-level overview of our GPU power testing, showing relative efficiency in terms of performance per watt:

GPU Efficiency - Relative Performance Per Watt

Graphics Card       Efficiency Score
GTX 1660 Ti         100.0%
RX 5700             96.8%
GTX 1660 Super      96.7%
GTX 1650 GDDR6      96.3%
RX 5600 XT          93.8%
RTX 2060 Super FE   93.8%
RTX 2080 FE         92.1%
GTX 1650 Super      91.7%
RTX 2060 FE         90.1%
GTX 1050 Ti         89.5%
RTX 2070 Super FE   88.7%
GTX 1660            88.2%
RTX 2070 FE         88.1%
RTX 2080 Ti FE      88.0%
GTX 1070 FE         85.9%
GTX 1650            85.7%
RTX 2080 Super FE   84.9%
RX 5700 XT          83.7%
GTX 1080 FE         82.9%
GTX 1060 6GB FE     76.9%
GTX 1070 Ti FE      76.8%
RX 5500 XT 8GB      76.6%
GTX 1050            75.5%
GTX 1080 Ti FE      74.5%
GTX 1060 3GB        74.3%
RX 5500 XT 4GB      69.2%
Radeon VII          68.5%
RX Vega 56          65.2%
RX 560 4GB          64.1%
GTX 980             52.2%
GTX 980 Ti          51.3%
GTX 970             51.1%
RX Vega 64          49.2%
RX 590              47.4%
RX 570 4GB          45.3%
R9 Fury X           39.1%
R9 390              31.8%

This table combines the performance data for all of the tested GPUs with the power use data discussed above, sorts by performance per watt, and then scales all of the scores relative to the most efficient GPU. It's a telling look at how far behind AMD was, how far it's come with the Navi architecture, and the work that yet remains.
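
The scoring itself is straightforward: divide each card's performance by its measured power, then normalize everything to the best result. A sketch with placeholder numbers (not our actual benchmark data):

```python
# Sketch: relative performance per watt, scaled to the most efficient card.
cards = {
    "GTX 1660 Ti": {"fps": 60.0, "watts": 120.0},  # placeholder values
    "RX 5700":     {"fps": 74.0, "watts": 153.0},
    "RX Vega 64":  {"fps": 73.0, "watts": 296.7},
}

perf_per_watt = {name: c["fps"] / c["watts"] for name, c in cards.items()}
best = max(perf_per_watt.values())

for name, ppw in sorted(perf_per_watt.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} {100 * ppw / best:5.1f}%")
```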

Efficiency isn't the only important metric for a GPU, and performance definitely matters. However, cards often pointed to as being extremely good bargains can have a dark side. The Radeon RX 570 4GB, for example, has been one of the top picks for budget GPUs for the past year. Often priced at only $120, it delivers decent gaming performance. Power use on the other hand can be roughly double that of newer cards that deliver similar performance, like the GTX 1650 GDDR6, and it sits near the bottom of our relative performance per watt table.

The most efficient GPUs end up as a mix of AMD's Navi 10 cards and Nvidia's Turing chips, though Nvidia has both an efficiency and a numerical advantage. Where AMD only has five different Navi parts right now, Nvidia has seven RTX Turing GPUs and six more GTX Turing chips. The RX 5700 places second, just behind the GTX 1660 Ti, and balanced midrange and even budget chips claim most of the top spots. Nvidia's most efficient ray tracing card is the RTX 2060 Super, which ranks sixth, while AMD's higher-clocked RX 5700 XT is clear down in 18th place.

  • King_V
    Thanks for this.

    I have to admit, and maybe it's just the cooling limitation, but I did not at all expect the Vega 56 and Vega 64 to measure pretty much exactly what their official TDP numbers of 210W and 295W are.

    and I may add a few earlier cards when I get some time
    Don't taunt me like this, LOL
  • salgado18
    and I may add a few earlier cards when I get some time
    Geforce 8800GT? ;)
  • JarredWaltonGPU
    King_V said:
    I did not at all expect the Vega 56 and Vega 64 to measure pretty much exactly what their official TDP numbers of 210W and 295W are. ... Don't taunt me like this, LOL
    LOL. I don't have all of the older generation GPUs, and in fact I'm missing a whole lot of potentially interesting models. But I do have 980 Ti, 980, 970, Fury X, and R9 390 that might be fun to check.

    Yeah, what the heck -- I'm going to stuff in the 900 series while I work on some other writing. Thankfully, it's not too hard to just capture the data in the background while I work. Swap card, run Metro for 10 minutes, run FurMark for 10 minutes. Repeat. Check back to see new charts in... maybe by the end of the day. (I have I think a 770 and 780 as well sitting around. And R9 380 if I'm feeling ambitious.)

    On Vega, the one thing that's shocking to me is the gap between GPU-Z and Powenetics. The Vega 64 appears to use about 80W on "other stuff" besides the GPU, if GPU-Z's power numbers are accurately reporting GPU-only power. The Vega 56 isn't nearly as high -- about a 45W difference between GPU-Z and Powenetics. That suggests the 18% increase in HBM2 clocks causes about a 75% increase in HBM2 power use (though some of it is probably VRMs and such as well).

    Full figures:
    Vega 64 Powenetics: 296.7W avg power during Metro, GPU-Z says 219.2W.
    Vega 56 Powenetics: 209.4W avg power during Metro, GPU-Z says 164.6W.
  • JarredWaltonGPU
    salgado18 said:
    Geforce 8800GT? ;)
    Why stop there? I really do sort of wish I had an FX 5900 Ultra floating around. :D
  • bit_user
    Thanks for doing this!

    JarredWaltonGPU said:
    I do have 980 Ti, ... Fury X, and R9 390 that might be fun to check.
    These three are most interesting to me. I have a GTX 980 Ti and the Fury X was its AMD counterpart.

    What's intriguing about the R9 390 is that it was the last 512-bit card. Hawaii was hot, in general (275 W?).
  • King_V
    JarredWaltonGPU said:
    But I do have 980 Ti, 980, 970, Fury X, and R9 390 that might be fun to check. ... (I have I think a 770 and 780 as well sitting around. And R9 380 if I'm feeling ambitious.) ... The Vega 56 isn't nearly as high -- about a 45W difference between GPU-Z and Powenetics.

    I'd most certainly be curious as to the R9 380 results, but that's because, if I understand it correctly, it's basically a slightly overclocked R9 285. Since I had an R9 285 (Gigabyte's Windforce OC version), well, the curiosity is there, LOL

    And yep, with the Vegas, I was talking about the Powenetics numbers. I had assumed they'd blow past their official TDP numbers, but no, they're right in line, give or take 1-2 watts.
  • JarredWaltonGPU
    King_V said:
    I'd most certainly be curious as to the R9 380 results, but that's because, if I understand it correctly, it's basically a slightly overclocked R9 285. Since I had an R9 285 (Gigabyte's Windforce OC version), well, the curiosity is there, LOL

    And yep, with the Vegas, I was talking about the Powenetics numbers. I had assumed they'd blow past their official TDP numbers, but no, they're right in line, give or take 1-2 watts.
    So, my old R9 380 4GB card is dead. Actually, it works, but one of the fans is busted and it crashed multiple times in both Metro Exodus and FurMark, so I'm not going to put any more effort into that one. RIP, 380...

    Anyway, I didn't show GPU clockspeeds, which is another dimension of the Vega numbers. The throttling in some of the tests is ... severe. This is why people undervolt, because otherwise the power use is just brutal on Vega. But undervolting can create instability and isn't a panacea.

    In Metro, the Vega 56 averages GPU clocks of 1230MHz -- not too bad, considering the official spec is 1156MHz base, 1471MHz boost. Vega 64 averages 1494MHz, again with base of 1247MHz and boost of 1546MHz. But FurMark... Vega 56 drops to 835MHz average clocks, and Vega 64 is at 1270MHz. That's on 'reference' models. The PowerColor V56 is higher power and clocks, obviously. MSI V64 is also better, possibly just because of binning and being made a couple of months after the initial Vega launch.
  • salgado18
    JarredWaltonGPU said:
    Why stop there? I really do sort of wish I had an FX 5900 Ultra floating around. :D
    That would be awesome, right? If only you could get an AGP motherboard :ROFLMAO:

    Hey, throw some emails around, you might get some old cards borrowed. It would be very interesting to see the evolution in efficiency.
  • gdmaclew
    There must be hundreds of thousands of RX580s out there but you chose NOT to test them?
    Perhaps you have other articles that do that?
    You have something against Polaris?
  • King_V
    gdmaclew said:
    There must be hundreds of thousands of RX580s out there but you chose NOT to test them?
    Perhaps you have other articles that do that?
    You have something against Polaris?

    Oh, OBVIOUSLY he has something against Polaris, which is why he tested the RX 570 and RX 590. Clearly. :rolleyes: