Measuring Power Consumption
We’re using a current clamp to measure power consumption at the external PCIe power cable and, using a special PCB, directly at the PCIe slot. These measurements are recorded in parallel and in real time, added up for each second, and logged using multi-channel monitoring along with the respective voltages. All of this results in a representative curve over the span of 10 minutes. That's all we really need, since these cards reach their operating temperatures relatively quickly.
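The per-second aggregation described above can be sketched roughly as follows. This is a minimal illustration, not our actual logging software; the sample counts and readings are made up, and real hardware would stream many more samples per channel per second.

```python
# Hedged sketch: combining current and voltage samples from two measurement
# points (PCIe slot and external power cable) into one per-second power figure.
# All numbers below are illustrative, not measured values from this review.

def average_power_watts(voltage_samples, current_samples):
    """Mean instantaneous power (V * I) over one second of paired samples."""
    assert len(voltage_samples) == len(current_samples)
    return sum(v * i for v, i in zip(voltage_samples, current_samples)) / len(voltage_samples)

# One second of simplified samples from each measurement point:
pcie_slot = average_power_watts([12.0] * 4, [3.0, 3.1, 2.9, 3.0])
external_cable = average_power_watts([12.1] * 4, [15.0, 15.2, 14.8, 15.0])

# Logged once per second; repeated for the full 10-minute run.
total = pcie_slot + external_cable
```

Summing the rails each second, rather than logging raw samples, keeps the 10-minute curve readable while still capturing throttling behavior.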
The curve isn’t just representative; it's also exact. Measuring power at the wall introduces bias, since factors other than the graphics card can affect total system consumption. A faster GPU might cause the CPU’s power consumption to go up as well, for example, since a limiting factor holding it back is gone.
We’re including three different GK110-based graphics cards in our measurements. Starting from scratch allows for a comparison that’s as objective as possible. We’re using the new GeForce GTX 780 Ti, the Titan, and Gigabyte's GTX 780 WindForce GHz Edition, which might be able to compete with the two other cards thanks to elevated clock rates.
Let’s first take a detailed look at each of the three cards. We’re benchmarking both boards with Nvidia's reference cooler twice: once with default settings and once at 70 °C GPU temperature. The latter necessitates a manual fan speed increase.
GeForce GTX 780 Ti
We start with a look at the frequencies, which might help us explain the somewhat unexpected differences in power consumption later.

Even under full load, the GeForce GTX 780 Ti balances its frequencies well. Consequently, its power consumption is similar in the two scenarios. Nvidia has raised its temperature target from 80 to 83 °C, which results in a fan RPM that's a little bit higher. Still, the shape of the curve shows how the power consumption decreases once the card backs off of its GPU Boost clock rates.

Things look different when the fan RPM is pushed up. We sought to achieve a 70 °C GPU temperature by setting Nvidia's fan speed to 80% duty cycle, which yields additional performance. We’ll take a closer look at this difference a little later in our efficiency section. For now, here’s a nicely shaped curve:

GeForce GTX Titan
Next up: the former champion. With a temperature target of only 80 °C and a fan that spins only half as fast, the Titan faces an uphill battle. Let’s first take a look at the frequencies:

The difference is almost scary to behold, suggesting the Titan's fan could have probably been pushed a little harder. Aiming for a 70 °C GPU temperature using 80-percent fan speed, GeForce GTX Titan lives up to its name and can even show off its GPU Boost feature a bit. So, what does the card’s power consumption look like after its clock rates are uncorked by pushing a lot of air across its heat sink? First, a look at the stock settings:

Power consumption drops alongside clock rate, which also negatively impacts game performance. Again, we'll evaluate this phenomenon's effect on efficiency shortly.
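Efficiency, in the sense used throughout this piece, is simply performance per watt. A minimal sketch of the comparison (the frame rates and power figures here are hypothetical placeholders, not our measured results):

```python
# Hedged sketch of the efficiency metric: frames per second per watt.
# Numbers are invented for illustration only.

def efficiency(fps, watts):
    """Performance per watt: average frame rate divided by board power."""
    return fps / watts

stock = efficiency(60.0, 250.0)       # hypothetical stock result
throttled = efficiency(54.0, 230.0)   # lower power, but fewer frames too
```

The interesting question is always whether the power saved by throttling outpaces the frames lost; if it doesn't, the cooler-running card is actually less efficient.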
How about when we dramatically ramp up cooling? GeForce GTX Titan puts its pedal to the metal and pulls quite a bit more power.

This is just a look at power, so all we can tell from these charts is that draw increases by 18 W. Our hope would be that performance rises correspondingly, too. We'll see shortly.
Gigabyte GTX 780 Windforce GHz Edition
The round-up of GK110-based boards is completed by Gigabyte's brand new GTX 780 WindForce GHz Edition. This card features fewer CUDA cores, but they're running at higher clock rates. Is that enough of a compromise to keep a lower-cost, overclocked graphics card competitive? We've seen in the past that GK110’s sweet spot is under 1000 MHz. However, there's also a new stepping of the chip available, and Gigabyte's offering does facilitate a completely consistent frequency, even under load, thanks to its excellent cooler. The card is naturally more expensive than other GTX 780 boards, so the company has to hope it does battle based on elevated clock rates.

Gigabyte's GTX 780 WindForce GHz Edition manages to hold a core frequency of almost 1180 MHz. This is reflected in our power consumption measurements, though.

We see an average power draw of 226 W, putting the Gigabyte card at the same level as our more aggressively cooled GeForce GTX 780 Ti, and 4 W beyond the 780 Ti's stock configuration.
It could also come down to production variance between the chips. Seen it before in manufacturing, and it's not pretty. Sounds like we're starting to hit the ceiling with these GPUs... Makes me wonder what architectural magic they'll come up with next.
If it has a negligible impact on image quality, I am wondering how performance is with single cards on ultra HD screens WITHOUT ANTI-ALIASING. Please could you investigate? Or point me to somewhere that has. Cheers all!
Apples to apples, it looks like the 780 Ti will remain faster than the 290X even after we begin to see custom-cooled AMD cards... but at a high premium.
and yet, people will continue to eat up their products like mindless sheep. guess a lot of people have disposable income.
Well, it is possible, but highly unlikely, given that Nvidia has a hard defined minimum clock rate, and a much narrower range...plus this is a reference board, it's fully possible that a retail card with a custom cooler will perform much higher (like the Gigabyte 780 in this article).
And, it's not an issue with the Titan or the 780, which are both based on GK110...which has been out for months, and has stable drivers.
What I can't understand is why this has to be a ****ing war.
When they had no competition, they charged a lot of money for their top end cards. No one was forced to buy these overpriced cards. If no one bought them, they'd drop prices. When AMD released solid competition, they dropped prices.
That's how the market works. If you think AMD wouldn't do the same, well, what can I say...
AMD haven't been in a position to do that for a long time on either the GPU or CPU front, which is why they haven't.
When they tried to release an $800 FX CPU (this is without a monopoly or lead in the market, btw), no one bought it, and AMD had to drop prices by more than half.