GPU Boost 2.0 And Troubleshooting Overclocking
GPU Boost 2.0
I didn’t have a chance to do a ton of testing with Nvidia’s second-generation GPU Boost technology in my GeForce GTX Titan story, but the same capabilities carry over to GeForce GTX 780. Here’s the breakdown:
GPU Boost is Nvidia’s mechanism for adapting the performance of its graphics cards based on the workloads they encounter. As you probably already know, different games place different demands on a GPU’s resources. Historically, clock rates had to be set with the worst-case scenario in mind. But, under “light” loads, performance was left on the table. GPU Boost changes that by monitoring a number of different variables and adjusting clock rates up or down as the readings allow.
In its first iteration, GPU Boost operated within a defined power target—170 W in the case of Nvidia’s GeForce GTX 680. However, the company’s engineers figured out that they could safely exceed that power level, so long as the graphics processor’s temperature was low enough. Therefore, performance could be further optimized.
Practically, GPU Boost 2.0 is different only in that Nvidia is now speeding up its clock rate based on an 80-degree Celsius thermal target, rather than a power ceiling. That means you should see higher frequencies and voltages up to 80 degrees, within whatever fan profile you’re willing to tolerate (setting a higher fan speed pushes temperatures lower, yielding more benefit from GPU Boost). The technology still reacts within roughly 100 ms, so there’s plenty of room for Nvidia to make this feature more responsive in future implementations.
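To make that behavior concrete, here’s a toy sketch of a thermal-target boost loop. The step size, clock ceiling, and simulated sensor are my own placeholders, not Nvidia’s actual algorithm; only the 80-degree target and the roughly 100 ms cadence come from the description above.

```python
# Toy model of a thermal-target boost loop. Step size, boost ceiling, and the
# simulated sensor are hypothetical; only the 80 C target and ~100 ms cadence
# reflect GPU Boost 2.0's described behavior.
import random
import time

THERMAL_TARGET_C = 80      # default ceiling; user-adjustable (e.g., 85 or 90)
POLL_INTERVAL_S = 0.1      # GPU Boost reacts within roughly 100 ms
STEP_MHZ = 13              # hypothetical size of one clock bin
BASE_MHZ = 863             # GTX 780's base clock
MAX_BOOST_MHZ = 1006       # hypothetical per-chip boost ceiling

def read_gpu_temperature() -> float:
    """Stand-in for a real sensor read; returns a simulated temperature."""
    return random.uniform(70.0, 90.0)

def boost_step(clock_mhz: int, temp_c: float) -> int:
    """Move the clock one bin up or down based on the thermal target."""
    if temp_c < THERMAL_TARGET_C:
        return min(clock_mhz + STEP_MHZ, MAX_BOOST_MHZ)  # headroom: boost
    return max(clock_mhz - STEP_MHZ, BASE_MHZ)           # at target: back off

clock = BASE_MHZ
for _ in range(50):        # five simulated seconds
    temp = read_gpu_temperature()
    clock = boost_step(clock, temp)
    print(f"{temp:5.1f} C -> {clock} MHz")
    time.sleep(POLL_INTERVAL_S)
```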
Of course, thermally dependent adjustments complicate performance testing more than the first version of GPU Boost did. Anything able to nudge GK110’s temperature up or down alters the chip’s clock rate, so it’s difficult to achieve consistency from one benchmark run to the next. In a lab setting, the best you can hope for is a steady ambient temperature.
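If you want to quantify that drift yourself, polling the GPU’s sensors during each run makes it visible. Here’s a minimal sketch assuming the pynvml bindings for Nvidia’s NVML library; the sampling window and cadence are my own choices. Log a pass, rerun the benchmark, and compare the ranges:

```python
# A minimal sketch, assuming the pynvml bindings for Nvidia's NVML library
# (pip install nvidia-ml-py) and a driver that exposes these sensors.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

samples = []
for _ in range(600):                            # ~60 s at a 100 ms cadence
    temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    samples.append((temp, core))
    time.sleep(0.1)

pynvml.nvmlShutdown()

temps = [t for t, _ in samples]
clocks = [c for _, c in samples]
print(f"temperature: {min(temps)}-{max(temps)} C")
print(f"core clock:  {min(clocks)}-{max(clocks)} MHz")
```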
In addition to what I wrote for Titan, it should be noted that you can adjust the thermal target higher. So, for example, if you want GeForce GTX 780 to modulate clock rate and voltage based on an 85- or 90-degree ceiling, that’s a configurable setting.
Eager to keep GK110 as far away from your upper bound as possible? The 780’s fan curve is completely adjustable, letting you specify fan duty cycle as a function of temperature.
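Conceptually, such a curve is just a handful of (temperature, duty cycle) points with linear interpolation between them. The points in this sketch are hypothetical placeholders, not the card’s stock profile:

```python
# Toy model of a user-defined fan curve: (temperature C, duty cycle %)
# points with linear interpolation. These points are hypothetical, not
# the GTX 780's stock profile.
FAN_CURVE = [(30, 20), (60, 40), (80, 70), (95, 100)]

def duty_cycle(temp_c: float) -> float:
    """Map a GPU temperature onto the curve's fan duty cycle."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]

print(duty_cycle(70))   # 55.0: halfway between the 60- and 80-degree points
```

A more aggressive curve trades noise for thermal headroom, which GPU Boost 2.0 then converts back into clock rate.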
Troubleshooting Overclocking
Back when Nvidia briefed me on GeForce GTX Titan, company reps showed me an internal tool able to read the status of various sensors, which made it possible to diagnose problematic behavior. If an overclock was pushing GK110’s temperature too high, causing a throttle response, it’d log that information.
The company now enables that functionality in apps like Precision X, raising a “reasons” flag when one of those boundaries is crossed, telling you exactly what is holding your overclock back. This is very cool; you’re no longer left guessing about bottlenecks. There’s also an over-voltage (OV) max limit readout that tells you whether you’re pushing the GPU’s absolute peak voltage. If this flag pops, Nvidia says you risk frying your card. Consider that a good place to back off your overclocking effort.
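Precision X surfaces those flags in its interface, but if you’d rather read them programmatically, NVML exposes similar throttle-reason bits. Here’s a minimal sketch, again assuming the pynvml bindings and a driver recent enough to report the bitmask (this queries NVML directly, not Precision X itself):

```python
# Reads NVML's clocks-throttle-reasons bitmask and decodes the common bits.
# Requires the pynvml bindings and a GPU/driver that reports these reasons.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

reasons = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(handle)
if reasons == pynvml.nvmlClocksThrottleReasonNone:
    print("no throttling in effect")
if reasons & pynvml.nvmlClocksThrottleReasonGpuIdle:
    print("clocks low because the GPU is idle")
if reasons & pynvml.nvmlClocksThrottleReasonSwPowerCap:
    print("throttled: software power cap (power target) reached")
if reasons & pynvml.nvmlClocksThrottleReasonHwSlowdown:
    print("throttled: hardware slowdown (e.g., thermal protection)")

pynvml.nvmlShutdown()
```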