
Gaming Power Consumption Details

Nvidia GeForce GTX 780 Ti Review: GK110, Fully Unlocked
By Igor Wallossek

Measuring Power Consumption

We’re using a current clamp to measure power consumption at the external PCIe power cable and, using a special PCB, directly at the PCIe slot. These measurements are recorded in parallel and in real time, added up for each second, and logged using multi-channel monitoring along with the respective voltages. All of this results in a representative curve over the span of 10 minutes. That's all we really need, since these cards reach their operating temperatures relatively quickly.
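
To make that bookkeeping concrete, here is a minimal sketch of the per-second logging, assuming hypothetical channel names and sample values rather than our actual monitoring software: each rail's current and voltage are sampled in parallel, multiplied into instantaneous power, averaged within each one-second bucket, and the rails are then summed into a single draw figure for the card.

    # Minimal sketch of per-rail power logging (hypothetical channels/values).
    # Each rail is sampled in parallel: current from the clamp or shunt,
    # voltage from the matching monitoring channel. Instantaneous power is
    # P = V * I; samples are averaged per second and the rails are summed.
    from collections import defaultdict

    # (timestamp_s, rail, volts, amps) -- one entry per sample from the logger
    samples = [
        (0.0, "pcie_slot_12v", 12.02,  4.8),
        (0.0, "pcie_slot_3v3",  3.31,  0.9),
        (0.0, "aux_8pin_12v",  12.05, 13.6),
        (0.5, "pcie_slot_12v", 11.98,  5.1),
        (0.5, "pcie_slot_3v3",  3.30,  0.9),
        (0.5, "aux_8pin_12v",  12.03, 14.2),
        # ...ten minutes of data in a real log
    ]

    def per_second_power(samples):
        """Average each rail's V*I readings within one-second buckets,
        then sum the rails to get the card's total draw per second."""
        buckets = defaultdict(lambda: defaultdict(list))
        for t, rail, volts, amps in samples:
            buckets[int(t)][rail].append(volts * amps)
        curve = {}
        for second, rails in sorted(buckets.items()):
            curve[second] = sum(sum(p) / len(p) for p in rails.values())
        return curve

    for second, watts in per_second_power(samples).items():
        print(f"t={second:4d}s  total draw = {watts:6.1f} W")

Averaging within each one-second bucket before summing the rails keeps the per-second figure independent of how many raw samples each channel happens to deliver.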

The curve isn't just representative; it's also exact. Measuring power at the system level, by contrast, introduces bias, since factors other than the graphics card can affect consumption. A faster GPU might cause the CPU's power consumption to go up as well, for example, since a limiting factor holding it back is gone.

We’re including three different GK110-based graphics cards in our measurements. Starting from scratch allows for a comparison that’s as objective as possible. We’re using the new GeForce GTX 780 Ti, the Titan, and Gigabyte's GTX 780 WindForce GHz Edition, which might be able to compete with the two other cards thanks to elevated clock rates.

Let's first take a detailed look at each of the three cards. We're benchmarking the two boards that use Nvidia's reference cooler twice: once with default settings and once at a 70 °C GPU temperature. The latter necessitates a manual fan speed increase.

GeForce GTX 780 Ti

We start with a look at the frequencies, which might help us explain the somewhat unexpected differences in power consumption later.

Even under full load, the GeForce GTX 780 Ti balances its frequencies well. Consequently, its power consumption is similar in the two scenarios. Nvidia has raised its temperature target from 80 to 83 °C, which results in a fan RPM that's a little bit higher. Still, the shape of the curve shows how the power consumption decreases once the card backs off of its GPU Boost clock rates.

Things look different when the fan RPM is pushed up. We sought to achieve a 70 °C GPU temperature by setting Nvidia's fan speed to 80% duty cycle, which yields additional performance. We’ll take a closer look at this difference a little later in our efficiency section. For now, here’s a nicely shaped curve:

GeForce GTX Titan

Next up: the former champion. With a temperature target of only 80 °C and a fan that spins only half as fast, the Titan faces an uphill battle. Let’s first take a look at the frequencies:

The difference is almost scary to behold, suggesting the Titan's fan could have probably been pushed a little harder. Aiming for a 70 °C GPU temperature using 80-percent fan speed, GeForce GTX Titan lives up to its name and can even show off its GPU Boost feature a bit. So, what does the card’s power consumption look like after its clock rates are uncorked by pushing a lot of air across its heat sink? First, a look at the stock settings:

Power consumption drops alongside clock rate, which also negatively impacts game performance. Again, we'll evaluate this phenomenon's effect on efficiency shortly.

How about when we dramatically ramp up cooling? GeForce GTX Titan puts the pedal to the metal and pulls quite a bit more power.

This is just a look at power, so all we can tell from these charts is that draw increases by 18 W. Our hope is that you get a corresponding performance boost, too. We'll see shortly.
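
Whether those extra watts are worth it comes down to performance per watt, which is exactly what the efficiency section weighs. As a quick illustration of the arithmetic only (the frame rates and wattages below are hypothetical placeholders, not measurements from this review), the cooler, faster run wins on efficiency only if its performance rises by a larger percentage than its power draw.

    # Performance-per-watt comparison of two runs of the same card.
    # All numbers are hypothetical placeholders, not measured results.
    def perf_per_watt(avg_fps: float, avg_watts: float) -> float:
        """Average frame rate divided by average power draw (fps per watt)."""
        return avg_fps / avg_watts

    stock  = {"fps": 60.0, "watts": 214.0}  # default fan curve (hypothetical)
    cooled = {"fps": 66.0, "watts": 232.0}  # 80% fan duty, +18 W (hypothetical)

    for label, run in (("stock ", stock), ("cooled", cooled)):
        print(f"{label}: {perf_per_watt(run['fps'], run['watts']):.3f} fps/W")

    # In this made-up case the frame rate rises by 10% while power rises by
    # roughly 8.4%, so the aggressively cooled run comes out slightly ahead.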

Gigabyte GTX 780 WindForce GHz Edition

The round-up of GK110-based boards is completed by Gigabyte's brand new GTX 780 WindForce GHz Edition. This card features fewer CUDA cores, but they're running at higher clock rates. Is that trade-off enough to keep a lower-cost, overclocked graphics card competitive? We've seen in the past that GK110's sweet spot is under 1000 MHz. However, there's also a new stepping of the chip available, and Gigabyte's offering sustains a completely consistent frequency, even under load, thanks to its excellent cooler. The card is naturally more expensive than other GTX 780 boards, so the company is counting on those elevated clock rates to carry it.

Gigabyte's GTX 780 WindForce GHz Edition manages to hold a core frequency of almost 1180 MHz. Naturally, this is reflected in our power consumption measurements.

We see an average power draw of 226 W, putting the Gigabyte card at the same level as our more aggressively cooled GeForce GTX 780 Ti, and 4 W beyond the 780 Ti's stock configuration.
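
Whether trading CUDA cores for clock rate pays off on paper can be sanity-checked with a back-of-the-envelope estimate: peak single-precision throughput scales with shader count times clock, at two FLOPs (one fused multiply-add) per core per cycle. The sketch below plugs in the published core counts, the roughly 1180 MHz we observed on the Gigabyte card, and the 780 Ti's rated boost clock; since GPU Boost pushes the 780 Ti well past that rated figure when thermals allow, its number is only a floor.

    # Rough peak FP32 throughput: CUDA cores x clock x 2 FLOPs per cycle.
    # Core counts are published specs; the Gigabyte clock is the ~1180 MHz
    # observed above, while the 780 Ti figure is its rated boost clock
    # rather than a measured value.
    def peak_gflops(cuda_cores: int, clock_mhz: float) -> float:
        return cuda_cores * clock_mhz * 2 / 1000.0  # MHz * 2 FLOPs -> GFLOPS

    cards = {
        "Gigabyte GTX 780 WindForce GHz Edition": (2304, 1180.0),
        "GeForce GTX 780 Ti (rated boost)":       (2880,  928.0),
    }

    for name, (cores, mhz) in cards.items():
        print(f"{name:40s} {peak_gflops(cores, mhz):7.0f} GFLOPS")

On paper, the elevated clock rate just about covers the missing shaders, which is why the Gigabyte card can stay in this comparison at all.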

Comments
  • Lord_Kitty, November 7, 2013 6:15 AM (+22)
    Can't wait for fanboy wars! Its going to be fun to watch.
  • tomc100, November 7, 2013 6:16 AM (+17)
    At $700, AMD has nothing to worry about other than the minority of enthusiast who are willing to pay $200 more for the absolute fastest. Also, when games like Battlefield 4 uses mantle the performance gains will be eroded or wiped out.
  • expl0itfinder, November 7, 2013 6:18 AM (+37)
    Keep up the competition. Performance per dollar is the name of the game, and the consumers are thriving in it right now.
  • alterecho, November 7, 2013 6:20 AM (+14)
    I want to see cooler as efficient as the 780 ti, on the 290X, and the benchmarks be run again. Something tells me 290X will perform similar or greater than 780ti, in that situation.
  • ohim, November 7, 2013 6:21 AM (+6)
    Price vs way too few more fps than the rival will say a lot no matter who gets the crown, but can't wonder to imagine the look on the face of the guys who got Titans for only few months of "fps supremacy" at insane price tags :)
  • bjaminnyc, November 7, 2013 6:22 AM (+9)
    2x R9 290's for $100 more will destroy the 780Ti. I don't really see where this logically fits in a competitively priced environment. Nice card, silly price point.
  • Innocent_Bystander, November 7, 2013 6:28 AM (-3)
    "Hawaii-based boards delivering frame rates separated by double-digit percentages, the real point is that this behavior is designed into the Radeon R9 290X. "

    It could also come down to production variance between the chips. Seen it before in manufacturing and it's not pretty. Sounds like we're starting to hit the ceiling with these GPUs... Makes me wonder what architectural magic they'll come up with next.

    IB
  • Deus Gladiorum, November 7, 2013 6:35 AM (-3)
    I'm going to build a rig for a friend and was planning on getting him the R9 290, but after the R9 290 review I'm quite hesitant. How can we know how the retail version of that card performs? Any chance you guys could pick one up and test it out? Furthermore, how can we know Nvidia isn't pulling the same trick: i.e. giving a press card that performs way above the retail version?
  • americanbrian, November 7, 2013 6:37 AM (+10)
    Hoping to get a response this time. I am wondering if AA has any place in the ultraHD gaming world. I suspect that it gets cranked to 16x on ultra settings and I wonder if this actually is discernible with the pixel density being so high. It is not like many people can spot a jagged edged curve when the "jag" is microns big.

    If it has a negligible impact on what it looks like I am wondering how performance is with single cards on ultraHD screens WITHOUT ANTI-ALIASING. Please could you investigate? or point me to somewhere that has. Cheers all!
  • catswold, November 7, 2013 6:39 AM (+1)
    EVGA already has the "SC" rated cards both with the default cooler and their ACX cooler.

    Apples to apples it looks like the 780 ti will remain faster than the 290x even after we begin to see custom cooling AMD cards . . . but at a high premium.
  • rolli59, November 7, 2013 6:40 AM (+3)
    Good buy compared to Titan but not the $500 cards is what I read out of this. Good performance but questionable value.
  • eklipz330, November 7, 2013 6:40 AM (+4)
    what i can't understand is how people can manage to stay loyal to the green team especially when they've been using monopolistic tendencies when it comes to pricing their cards... seriously, dropping a cards price point at the snap of a finger by hundreds of dollars, and they're still profiting like monsters i bet.

    and yet, people will continue to eat up their products like mindless sheep. guess a lot of people have disposable income.
  • aizatvader, November 7, 2013 6:44 AM (+15)
    sigh... it's too expensive compared to the 290x and the 290 for the performance. Slap a waterblock on the 290x and this card and overclock both of them to the limit and we will see which one is better. Still, I'm not gonna pay an extra $300 for this card over the 290x
  • aizatvader, November 7, 2013 6:46 AM (+3)
    sorry, I mean the r9 290
  • rmpumper, November 7, 2013 6:50 AM (-3)
    One thing is certain - 290X is completely irrelevant. Either get a lot cheaper 290 with the same performance or expensive 780Ti with better performance.
  • Anomandaris, November 7, 2013 6:55 AM (+8)
    This seems to be the bottom line... get the 780ti if you absolutely want the best and have money to burn (or wait a bit to see if they will indeed release the higher memory ones) and 290 for money vs performance.
  • bjaminnyc, November 7, 2013 6:56 AM (+27)
    Excluding the possibility of bias, it's important to note the various performance results from one site to another. Tom's has the 780Ti winning the majority of benches while others have the 290x on top for the same applications. I believe this is representative of real world end user scenarios. Individual cards and total system variances IMO will result in the 780ti and 290x performing pretty much on par at higher resolutions. Therefore it really comes down to prices or preference but I don't know too many smart people who choose to waste $s ever even 1%'s. Win for AMD.
  • ojas, November 7, 2013 6:59 AM (0)
    Quote:
    I'm going to build a rig for a friend and was planning on getting him the R9 290, but after the R9 290 review I'm quite hesitant. How can we know how the retail version of that card performs? Any chance you guys could pick one up and test it out? Furthermore, how can we know Nvidia isn't pulling the same trick: i.e. giving a press card that performs way above the retail version?

    Well, it is possible, but highly unlikely, given that Nvidia has a hard defined minimum clock rate, and a much narrower range...plus this is a reference board, it's fully possible that a retail card with a custom cooler will perform much higher (like the Gigabyte 780 in this article).

    And, it's not an issue with the Titan or the 780, which are both based on GK110...which has been out for months, and has stable drivers.

    Quote:
    what i can't understand is how people can manage to stay loyal to the green team especially when they've been using monopolistic tendencies when it comes to pricing their cards... seriously, dropping a cards price point at the snap of a finger by hundreds of dollars, and they're still profiting like monsters i bet.

    and yet, people will continue to eat up their products like mindless sheep. guess a lot of people have disposable income.

    What i can't understand is why this has to be a ****ing war.

    When they had no competition, they charged a lot of money for their top end cards. No one was forced to buy these overpriced cards. If no one bought them, they'd drop prices. When AMD released solid competition, they dropped prices.

    That's how the market works. If you think AMD wouldn't do the same, well, what can i say..

    AMD haven't been in a position to do that for a long time on either the GPU or CPU front, which is why they haven't.

    When they tried to release an $800 FX CPU (this is without a monopoly or lead in the market, btw), no one bought it, and AMD had to drop prices by more than half.