Overclocking: I Want More Than GPU Boost

GeForce GTX 680 2 GB Review: Kepler Sends Tahiti On Vacation

The implementation of GPU Boost does not preclude overclocking. But because you can’t disable GPU Boost like you might with Intel’s Turbo Boost, you have to operate within the technology’s parameters.

For example, overclocking is now achieved through an offset. You can easily push the base 3D clock up 100, 150, or even 200 MHz. However, if a game was already TDP-constrained at the default clock, it won’t run any faster. In apps that weren’t hitting the GTX 680’s power limit before, the offset pushes the performance curve closer to that ceiling.
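To make that interaction concrete, here is a minimal sketch of the behavior. It is a toy model rather than Nvidia’s actual algorithm: the 0.155 W/MHz factor and the assumption that board power scales linearly with clock are invented purely for illustration.

```python
# Toy model of how a clock offset interacts with GPU Boost's power ceiling.
# The watts-per-MHz factor and the linear clock-to-power relationship are
# invented assumptions for illustration, not Nvidia's implementation.

def boosted_clock_mhz(base_mhz, offset_mhz, power_target_w, watts_per_mhz=0.155):
    """Highest clock the board can hold without exceeding its power target."""
    requested_mhz = base_mhz + offset_mhz               # what the offset asks for
    power_limited_mhz = power_target_w / watts_per_mhz  # what the power budget allows
    return min(requested_mhz, power_limited_mhz)

print(boosted_clock_mhz(1006, 150, 170))  # ~1097 MHz: the stock power target clips the offset
print(boosted_clock_mhz(1006, 150, 196))  # 1156 MHz: a higher target (170 W * 1.15) lets it through
```

The point of the sketch: once a workload is pinned at the power target, a bigger offset alone changes nothing; only more power budget does.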

Because GPU Boost was designed to balance clock rate and voltage against the board’s power target, though, overclocking is most effective when you raise that target as well. EVGA’s Precision X tweaking tool includes built-in sliders for both the power target and the GPU clock offset.

Although the GeForce GTX 680’s TDP is 195 W, Nvidia says the card’s typical board power is closer to 170 W. Increasing the power slider raises that 170 W target. At +32%, Precision X’s highest setting is designed to get you right up to the 225 W that two six-pin power connectors and a PCI Express slot are specified to deliver.
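The slider arithmetic is simple enough to check directly; here is a quick sketch using the figures above (the 170 W baseline is Nvidia’s stated typical board power, not the 195 W TDP).

```python
# Convert Precision X power-target percentages into approximate watts,
# using the 170 W typical board power cited above as the baseline.
typical_board_power_w = 170
for pct in (100, 115, 130, 132):
    print(f"{pct}% target -> {typical_board_power_w * pct / 100:.0f} W")
# 132% of 170 W is roughly 224 W, right at the 225 W that two six-pin
# connectors (75 W each) plus the PCI Express slot (75 W) are rated to supply.
```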

Using Crysis 2 as a very consistent test case, we can measure the effect of each adjustment on performance.

First, we launch a single run of the Central Park level at 1920x1080 in DirectX 11 mode, without anti-aliasing. We get a 72.3 FPS result, and we observe GPU Boost pushing the GeForce GTX 680 between 1071 and 1124 MHz during the run (up from the 1006 MHz base).
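If you want to watch the same behavior on your own card, a simple polling loop is enough. This is our own sketch, not the instrumentation used for these charts: it assumes the pynvml Python bindings are installed, and not every GeForce board or driver exposes the clock and power queries.

```python
# Sample the graphics clock and board power once a second while a benchmark runs.
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetClockInfo, nvmlDeviceGetPowerUsage,
                    NVML_CLOCK_GRAPHICS)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)
try:
    for _ in range(60):                                  # one minute of samples
        clock_mhz = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)
        power_w = nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts
        print(f"{clock_mhz} MHz, {power_w:.1f} W")
        time.sleep(1)
finally:
    nvmlShutdown()
```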

The top chart shows that we’re bouncing around the upper end of GK104’s power ceiling. So, we increase the target board power by 15%. The result is a small jump to 74.2 FPS, along with clocks that vacillate between 1145 and 1197 MHz.

Figuring the higher power target likely freed up some headroom, we then increase the offset by 100 MHz, which yields even better performance: 76.1 FPS. This time, however, we get a constant 1215 MHz. Nvidia says this is basically as fast as the card will go given our workload and the power limit.

So why not up the target power again? At 130% (basically, the interface’s 225 W specification), performance actually drops to 75.6 FPS, and the graph over time shows a constant 1202 MHz. We expected more performance, not less. What gives? This is where folks are going to find a problem with GPU Boost. Because the outcome depends on variables that are continually monitored, performance changes over time. As a GPU heats up, current leakage increases, and GPU Boost pulls frequency and voltage back down to keep that from turning into a vicious cycle.

The effect is similar to heat soak in an engine. If you’re doing back-to-back pulls on a dynamometer, you expect to see a drop in horsepower if you don’t wait long enough between runs. Similarly, it’s easy to get consistently high numbers from a few minute-long benchmark runs. But if you’re gaming for hours, GPU Boost can’t be as effective.
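A toy simulation makes the heat-soak effect easier to see. Every constant below is invented for illustration; the only figure taken from our testing is the 1215 MHz peak, used here as the clock cap.

```python
# Toy feedback model: heat raises leakage, leakage eats into the power budget,
# and GPU Boost trims the clock to stay under the cap. Constants are invented.

def sustained_clock_mhz(power_target_w, minutes, cap_mhz=1215,
                        watts_per_mhz=0.17, leak_w_per_min=0.8, max_leak_w=20):
    """Clock the toy power budget can support after `minutes` of sustained load."""
    leakage_w = min(leak_w_per_min * minutes, max_leak_w)  # leakage grows with heat, then plateaus
    usable_w = power_target_w - leakage_w                  # budget left for switching power
    return min(cap_mhz, usable_w / watts_per_mhz)

for minutes in (1, 5, 30, 120):
    print(f"after {minutes:>3} min: ~{sustained_clock_mhz(221, minutes):.0f} MHz")
# Short runs hold the 1215 MHz cap; after the die heats up, the sustained clock sags.
```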

Our attempt to push a 200 MHz offset demonstrates that, even though this technology tries to keep you at the highest frequency a given power ceiling allows, raising both limits still makes it easy to exceed the board’s potential and lock up.

Sliding back a bit to a 150 MHz offset gives us stability, but performance isn’t any better than the 100 MHz setting. No doubt, it’ll take more tinkering to find the right overclock with GPU Boost in the mix and always on.

Comments
  • Anonymous, March 22, 2012 12:46 PM (+38)
    Hail to the new king.
  • borden5, March 22, 2012 12:55 PM (+44)
    oh man this's good news for consumer, hope to see a price war soon
  • johnners2981, March 22, 2012 12:58 PM (+26)
    Damn prices, in europe we have to pay the equivalent of $650-$700 to get one
  • outlw6669, March 22, 2012 12:59 PM (+33)
    Nice results, this is how the transition to 28nm should be.
    Now we just need prices to start dropping, although significant drops will probably not come until the GK110 is released :/
  • Anonymous, March 22, 2012 1:00 PM (+23)
    Finally we will see prices going down (either way :-) )
  • Scotty99, March 22, 2012 1:03 PM (-4)
    Its a midrange card, anyone who disagrees is plain wrong. Thats not to say its a bad card, what happened here is nvidia is so far ahead of AMD in tech that the mid range card purposed to fill the 560ti in the lineup actually competed with AMD's flagship. If you dont believe me that is fine, you will see in a couple months when the actual flagship comes out, the ones with the 384 bit interface.
  • Chainzsaw, March 22, 2012 1:04 PM (+26)
    Wow not too bad. Looks like the 680 is actually cheaper than the 7970 right now, about 50$, and generally beats the 7970, but obviously not at everything.

    Good going Nvidia...
  • SkyWalker1726, March 22, 2012 1:05 PM (+32)
    AMD will certainly Drop the price of the 7xxx series
  • rantoc, March 22, 2012 1:13 PM (+20)
    2x of thoose ordered and will be delivered tomorrow, will be a nice geeky weekend for sure =)
  • Scotty99, March 22, 2012 1:21 PM (+23)
    Quoting scrumworks: "Nothing surprising here. Little overclocking can put Tahiti right at the same level. Kepler is actually losing to Tahiti in really demanding games like Metro 2033 that uses the latest tech. Pointless to test ancient and low tech games like World of Warcrap that is ancient, uses dx9 and is not considered cutting edge in any meter."

    Sigh...

    WoW has had DX11 for quite a long time now. Also, go play in a 25 man raid with every detail setting on ultra with 8xAA and 16x AAF and tell me WoW is not taxing on a PC.
  • yougotjaked, March 22, 2012 1:21 PM (+16)
    Wait what does it mean by "if you’re interested in compute potential, you’ll have to keep waiting"?
  • dragonsqrrl, March 22, 2012 1:22 PM (0)
    Just throwing this out there now, but some AMD fanboy will find a way to discredit or marginalize these results.

    ...oh, wait.
  • klausey, March 22, 2012 1:24 PM (+24)
    Great to see nVidia jumping back into the game and forcing AMD to lower its prices accordingly. I was shocked to see the card actually available at the MSRP of $500 on launch day. I guess we'll see how long that lasts.

    For everyone suggesting that nVidia will release another true "flagship" beyond the 680, I think you are spot on, IF AMD gives them a reason to. There's no reason to push it at the moment as they already hold the crown. If, on the other hand, AMD goes out and makes a 7980, or 79070 SE card with higher clocks (more like what the 7970 can achieve when properly overclocked), I definitely see nVidia stepping their game up a bit.

    Either way, it's awesome to see both AMD and now nVidia taking power consumption into consideration. I'm tired of my computer room feeling like a toaster after an all nighter.
  • rantoc, March 22, 2012 1:24 PM (+18)
    Quoting yougotjaked: "Wait what does it mean by 'if you’re interested in compute potential, you’ll have to keep waiting'?"

    He means waiting for the GK110, that will be a more of a compute card while this GK104 is more equiped towards gaming.
  • EXT64, March 22, 2012 1:26 PM (+13)
    Really disappointing DP compute, but a tradeoff had to be made and this card is meant for gaming, so I can understand their position. Hopefully GK110 is a real card and will eventually come out.
  • Anonymous, March 22, 2012 1:27 PM (-7)
    but will it run tetris with multiple displays?
  • amk-aka-Phantom, March 22, 2012 1:27 PM (+7)
    Oh yeah, team green strikes back! :D  Now let's see what 660 Ti will be like, might suggest that to a friend.