Nvidia's newest architecture presents a whole new set of challenges for measuring power consumption. To capture all four possible supply rails exactly (and uncover the secrets behind Maxwell's power consumption reduction), a total of eight analog oscilloscope channels is needed, since voltage and current have to be recorded concurrently at each rail in real time. Logging the voltages separately and multiplying them in later would yield inaccurate results, because the voltages themselves fluctuate under load. So, how did we solve this problem?
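To see why voltage and current must be sampled concurrently, consider a toy calculation (the numbers are made up for illustration, not our measurements): because the rail voltage sags exactly when current spikes, averaging the instantaneous products differs from multiplying separately averaged voltage and current.

```python
# Toy example: instantaneous power must be computed sample-by-sample.
# Hypothetical, correlated samples -- voltage sags when current spikes.
volts = [12.10, 11.95, 11.80, 12.05, 11.85]
amps = [8.0, 14.0, 24.0, 10.0, 20.0]

# Correct: average of the instantaneous products P[k] = V[k] * I[k]
p_true = sum(v * i for v, i in zip(volts, amps)) / len(volts)

# Wrong: multiply the separately averaged voltage and current
p_wrong = (sum(volts) / len(volts)) * (sum(amps) / len(amps))

print(round(p_true, 2), round(p_wrong, 2))
```

With negatively correlated voltage and current, the naive product overestimates the true average power; with real microsecond-scale load swings, the error only grows.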
We enlisted the help of HAMEG (Rohde & Schwarz) to find a solution. In the end, we ran two oscilloscopes in parallel in a master-slave triggered setup, allowing us to accurately measure and record a total of eight voltages and currents at the same time with a temporal resolution down to one microsecond.
The measurement intervals need to be adjusted to the application at hand, of course, to avoid drowning in massive amounts of data. For instance, when we generate the one-minute graphics card power consumption graphs with a temporal resolution of 1 ms, we have the oscilloscope average the microsecond samples for us first.
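That decimation step can be sketched as follows (the data here is fake and the real averaging happens inside the oscilloscope): each block of 1,000 one-microsecond samples collapses into a single 1 ms value by arithmetic mean.

```python
# Decimate microsecond-resolution power samples to 1 ms resolution
# by averaging each block of 1000 samples (1000 x 1 us = 1 ms).
def decimate_to_ms(samples_us, block=1000):
    """Average consecutive blocks of `block` samples; a trailing partial block is dropped."""
    n = len(samples_us) // block
    return [sum(samples_us[k * block:(k + 1) * block]) / block
            for k in range(n)]

# 5 ms of fake, spiky data: alternating 100 W and 290 W microsecond samples
raw = [100.0 if i % 2 == 0 else 290.0 for i in range(5000)]
trace_ms = decimate_to_ms(raw)
print(trace_ms)  # five averaged 1 ms values
```

Note how the averaged trace completely hides the 290 W microsecond peaks, which is exactly why the raw-resolution data matters for the analysis below.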
We use a purpose-built riser card in the PCIe slot (PEG) to measure the 3.3 and 12 V rails directly at the motherboard.
In addition, we separately measure the voltage and current at each of the two individual PCIe power connectors.
| Test Methodology | No-contact current measurement at all rails<br>Direct voltage measurement<br>IR real-time monitoring |
|---|---|
| Test Equipment | 2 x HAMEG HMO3054, 500 MHz Four-Channel Oscilloscope with Data Logger<br>4 x HAMEG HZO50 Current Probe<br>4 x HAMEG HZ355 (10:1 Probe, 500 MHz)<br>1 x HAMEG HMC8012 Digital Multimeter with Data Logger<br>1 x Optris PI450 80 Hz Infrared Camera + PI Connect |
| Test System | Intel Core i7-5960X @ 4.2 GHz<br>16 GB G.Skill Ripjaws DDR4-2666 (4 x 4 GB)<br>MSI X99 Gaming 7<br>2 x Transcend SSD370 (System, Applications + Data, Storage)<br>be quiet! Dark Power Pro 1200 W<br>Microcool Banchetto 101 |
Nvidia’s GPU Boost Accelerates Maxwell
Everything makes sense in theory, but we still want to know how Maxwell achieves efficiency gains of this magnitude. Kepler already adjusted the GPU's voltage quickly and precisely based on load and temperature, and AMD's PowerTune does the same. It turns out that Maxwell refines the formula further. With its shaders fully utilized, the new architecture's advantage over Kepler practically vanishes. Instead, Maxwell depends on its superior ability to adapt to changing loads; consequently, it can tailor power consumption even more closely to the needs of the application at hand. The more load variance there is, the better Maxwell fares.
To illustrate, let's take a look at how Maxwell behaves in the space of just 1 ms. Its power consumption jumps up and down repeatedly within this time frame, hitting a minimum of 100 W and a maximum of 290 W. Even though the average power consumption is only 176 W, the GPU draws almost 300 W when necessary. Beyond that point, GPU Boost throttles the card.
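The gap between peak and average draw is easy to quantify from any sampled trace. Here is a sketch with invented sample values (the 100/290/176 W figures in the text come from our actual scope data, not from this toy trace):

```python
# Summarize a synthetic 1 ms power trace: the average hides the extremes.
trace_w = [100.0, 130.0, 290.0, 150.0, 210.0, 110.0, 270.0, 148.0]

p_min = min(trace_w)
p_max = max(trace_w)
p_avg = sum(trace_w) / len(trace_w)

print(f"min={p_min:.0f} W  max={p_max:.0f} W  avg={p_avg:.1f} W")
```

An average-only log would report a well-behaved card while the VRM and PSU actually have to cope with nearly triple that draw for microseconds at a time.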
Now, how do the current and voltage on the PSU's all-important 12 V rail behave under these conditions? To find out, we sum the 12 V contributions from the motherboard slot and both PCIe power connectors.
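Summing those contributions is straightforward once per-rail power has been computed from concurrently sampled voltage and current; a minimal sketch (the rail names and values below are illustrative, not our measurement data):

```python
# Total 12 V power draw: sum the per-rail products of concurrently
# sampled voltage and current. Values are illustrative only.
rails = {
    "PEG slot 12V": (11.96, 4.8),   # (volts, amps)
    "PCIe plug #1": (12.02, 7.1),
    "PCIe plug #2": (11.99, 6.5),
}

total_w = sum(v * i for v, i in rails.values())
print(round(total_w, 1), "W")
```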
The PSU doesn't deliver a perfectly constant 12 V. Load spikes drain the secondary-side capacitors faster than the PSU's regulation can recharge them, causing voltage fluctuations that the graphics card's five voltage regulator phases have to smooth out. All of these interactions make our measurements more complicated than a simple log of averages would suggest.
- Introducing GM204: There's A New Maxwell In Town
- New Features
- Nvidia GeForce GTX 980 Reference Card
- Gigabyte GTX 980 WindForce OC
- Gigabyte GTX 970 WindForce OC
- EVGA GTX 970 Superclock ACX 2.0
- Test System And Benchmarks
- Results: Battlefield 4 And Thief
- Results: Arma 3 And Grid Autosport
- Results: Assassin's Creed IV, Watch Dogs, Far Cry 3
- A New Power Consumption Test Setup
- Power Consumption In Detail
- Power Consumption Overview
- Efficiency
- Temperatures And Noise
- Verdict







Good stuff here - but you guys were a bit slow on this one. Tom's Hardware is the first site I visit every morning. But with the delay of this article, I've been all over the net this morning on other sites that got their stuff out sooner.
I was hoping for more performance but the efficiency is quite nice. They just put pressure on the top end and gave us a price reduction, instead of overall performance gains.
Likely, we're going to see a Maxwell Titan equivalent in the next year or so, as these are x04 parts, much like the Kepler-based 670/680 were, and we're still waiting to see what the x10 chip will be with the Maxwell architecture.
That's some flat out insane price / performance ratio right there!
Same answer to both... no time.
We literally got the 970 for testing yesterday. The 980, we got the day before. We barely got the article out by this morning.
For those of you who want more info, we'll be spending more time with the GeForce GTX 980 and 970 in the weeks to come, don't you worry.
I'm waiting for the real Maximum Maxwell myself. Unimpressed with these xxx04 launches from Nvidia.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/
Depending on resolution, dual 970s are roughly equal to the 295X2 at only two-thirds the price.
The features Nvidia has been rolling out have been quite impressive. While many of them don't appeal to me at all, the sheer number, and in most cases the quality, of them is insane.
Well done, Nvidia. Let's see what AMD responds with; as of now, my next main gaming machine purchase is 2 x Nvidia GTX 970.
EDIT: I'm sure it's a good upgrade to my 2 x HD7950s.
Thank you! That answered a question I had in a previous GTX980/970 tease on the live news feed.
Do you guys have this problem often with Nvidia? You always seem to have less Nvidia board-partner variety and slower review releases for Nvidia GPUs.
More so than other websites.
My bet is that having no Nvidia GPUs in the Best GPUs for the Money lists for the past several months earned Tom's a slight delay in the delivery of these cards.
Nah, doubt Nvidia is that petty.
More publicity is exactly that, more publicity.
Tom's has been around a long time and is trusted by A LOT of people.
Doubt Nvidia would compromise the user base.