
GPU Boost 2.0: Changing A Technology’s Behavior

Nvidia GeForce GTX Titan 6 GB: GK110 On A Gaming Card

GPU Boost is Nvidia’s mechanism for adapting the performance of its graphics cards based on the workloads they encounter. As you probably already know, games place different demands on a GPU’s resources. Historically, clock rates had to be set with the worst-case scenario in mind. But under “light” loads, performance was left on the table. GPU Boost changes that by monitoring a number of different variables and adjusting clock rates up or down as the readings allow.

In its first iteration, GPU Boost operated within a defined power target—170 W in the case of Nvidia’s GeForce GTX 680. However, the company’s engineers figured out that they could safely exceed that power level, so long as the graphics processor’s temperature was low enough. Therefore, performance could be further optimized.

In practice, GPU Boost 2.0 differs only in that Nvidia now raises clock rates based on an 80 °C thermal target rather than a power ceiling. That means you should see higher frequencies and voltages up to that 80 °C limit, within the fan profile you’re willing to tolerate (setting a higher fan speed keeps temperatures lower, yielding more benefit from GPU Boost). The technology still reacts within roughly 100 ms, so there’s plenty of room for Nvidia to make this feature more responsive in future implementations.
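To make the difference concrete, here is a minimal, hypothetical sketch in Python of the two policies as described above. The function names, the 13 MHz step size, the 837 MHz base-clock floor, and the sensor callbacks are our own illustrative assumptions, not Nvidia’s actual firmware logic; only the 170 W, 80 °C, and roughly 100 ms figures come from the text.

    import time

    # Figures from the article; everything else below is an assumption for illustration.
    POWER_TARGET_W = 170       # GPU Boost 1.0 power ceiling on the GeForce GTX 680
    THERMAL_TARGET_C = 80      # GPU Boost 2.0 thermal target
    POLL_INTERVAL_S = 0.1      # the technology reacts within roughly 100 ms
    BASE_CLOCK_MHZ = 837       # assumed floor: never drop below the base clock
    STEP_MHZ = 13              # assumed size of a single boost step

    def boost_1_step(board_power_w, clock_mhz):
        """GPU Boost 1.0: raise the clock while board power stays under its target."""
        if board_power_w < POWER_TARGET_W:
            return clock_mhz + STEP_MHZ
        return max(clock_mhz - STEP_MHZ, BASE_CLOCK_MHZ)

    def boost_2_step(gpu_temp_c, clock_mhz):
        """GPU Boost 2.0: raise the clock (and voltage) while the die stays under 80 C.

        A more aggressive fan profile keeps the GPU cooler, so the card spends
        more time below the target and sustains higher boost clocks.
        """
        if gpu_temp_c < THERMAL_TARGET_C:
            return clock_mhz + STEP_MHZ
        return max(clock_mhz - STEP_MHZ, BASE_CLOCK_MHZ)

    def run_boost_2(read_temp_c, apply_clock_mhz, clock_mhz=BASE_CLOCK_MHZ):
        """Poll a (hypothetical) temperature sensor and nudge the clock every ~100 ms."""
        while True:
            clock_mhz = boost_2_step(read_temp_c(), clock_mhz)
            apply_clock_mhz(clock_mhz)
            time.sleep(POLL_INTERVAL_S)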

Of course, thermally dependent adjustments complicate performance testing more than the first version of GPU Boost did. Anything able to nudge GK110’s temperature up or down alters the chip’s clock rate, so it’s difficult to achieve consistency from one benchmark run to the next. In a lab setting, the best you can hope for is a steady ambient temperature.

Vendor-Sanctioned Overvoltage?

When Nvidia creates the specifications for a product, it targets five years of useful life. Choosing clock rates and voltages is a careful process that must take this period into account. Manually overriding a device’s voltage setting typically causes it to run hotter, which adversely affects longevity. As a result, overclocking is a sensitive subject for most companies; it’s standard practice to actively discourage enthusiasts from tuning hardware aggressively. Even if vendors know guys like us ignore those warnings anyway, they’re at least within their rights to deny support claims on components that fail prematurely due to overclocking.

Now that GPU Boost 2.0 is tied to thermal readings, the technology can make sure GK110 doesn’t venture into a condition that could hurt it. So, Nvidia now allows limited voltage increases to improve overclocking headroom, though add-in card manufacturers are free to narrow the range as they see fit. Our reference GeForce GTX Titans default to a 1,162 mV maximum, though EVGA’s Precision X software pushed them as high as 1,200 mV. You’re asked to acknowledge the increased risk of electromigration, but your warranty shouldn’t be voided.

Comments
  • jaquith, February 19, 2013 12:19 PM (+22)
    Hmm...$1K yeah there will be lines. I'm sure it's sweet.

    Better idea, lower all of the prices on the current GTX 600 series by 20%+ and I'd be a happy camper! ;) 

    Crysis 3 broke my SLI GTX 560's and I need new GPU's...
  • Trull, February 19, 2013 12:19 PM (+27)
    Dat price... I don't know what they were thinking, tbh.

    AMD really has a chance now to come strong in 1 month. We'll see.
  • firefyte, February 19, 2013 12:20 PM (+3)
    Anyone else having problems with the 7th page?
  • tlg, February 19, 2013 12:23 PM (+10)
    The high price OBVIOUSLY is related to low yields, if they could get thousands of those on the market at once then they would price it near the gtx680. This is more like a "nVidia collector's edition" model. Also gives nVidia the chance to claim "fastest single gpu on the planet" for some time.
  • tlg, February 19, 2013 12:27 PM (+26)
    AMD already said in (a leaked?) teleconference that they will not respond to the TITAN with any card. It's not worth the small market at £1000...
  • wavebossa, February 19, 2013 12:36 PM (-8)
    Quote:
    Twelve 2 Gb packages on the front of the card and 12 on the back add up to 6 GB of GDDR5 memory. The .33 ns Samsung parts are rated for up to 6,000 Mb/s, and Nvidia operates them at 1,502 MHz. On a 384-bit aggregate bus, that’s 288.4 GB/s of bandwidth.

    12x2 + 12x2 = 6? ...

    Quote:
    That card bears a 300 W TDP and consequently requires two eight-pin power leads.

    Shows a picture of a 6pin and an 8pin... I haven't even gotten past the first page, but mistakes like this bug me.

    Nevermind, the 2nd mistake wasn't a mistake. That was my own fail reading.
  • infernolink, February 19, 2013 12:46 PM (+21)
    Titan.. for those who want a Titan e-peen
  • ilysaml, February 19, 2013 12:47 PM (+5)
    Quote:
    The Titan isn’t worth $600 more than a Radeon HD 7970 GHz Edition. Two of AMD’s cards are going to be faster and cost less.

    My understanding from this is that Titan is just 40-50% faster than the HD 7970 GHz Ed., which doesn't justify the extra $1K.
  • battlecrymoderngearsolid, February 19, 2013 1:11 PM (+15)
    Can't it match GTX 670s in SLI? If yes, then I am sold on this card.

    What? Electricity is not cheap in the Philippines.
  • Fulgurant, February 19, 2013 1:20 PM (+14)
    Quote (Ninjawithagun):
    $1000 per Titan card is a bit hard for most mid-range gamers and even high-end gamers to afford.


    Titan is a luxury product. It's not supposed to offer a competitive price/performance ratio, just as a Ferrari's price isn't based on its horsepower or fuel efficiency. Titan is a statement more so than it is a bona-fide money maker for nVidia.

    The idea of status-symbol computer components strikes me as a little silly, of course, but I'm not in the target market. Neither are most gamers, whether high end or not.

    If you generally spend $1600 on the graphics subsystem of your computer, then I'm not even sure you fit in the so-called high-end. Super-high-end, maybe. You are the 1%. :)
  • azraa, February 19, 2013 1:22 PM (+2)
    Waaaay too much hype :/ 
    It's an engineering beauty, but what could make us wish for it? Most gamers already have enough with 7970Ghz or 670s so... not a smart choice.
  • mindless728, February 19, 2013 1:23 PM (+19)
    Quote:
    "Twelve 2 Gb packages on the front of the card and 12 on the back add up to 6 GB of GDDR5 memory. The .33 ns Samsung parts are rated for up to 6,000 Mb/s, and Nvidia operates them at 1,502 MHz. On a 384-bit aggregate bus, that’s 288.4 GB/s of bandwidth."

    12x2 + 12x2 = 6? ...


    the chips are Gb (Gigabit) not GB (Gigabyte) which is a difference of 8x

    so 12x2Gb+12x2Gb = 48 Gb = 6GB

    chips are commonly referred to in capacity as the bit size, not byte size
  • oxiide, February 19, 2013 1:23 PM (+13)
    Quote (wavebossa):
    12x2 + 12x2 = 6?

    Assuming proper notation is being observed (often it's not), "b" is a bit and "B" is a byte.

    6 Gigabytes = 48 Gigabits as 1 Byte = 8 bits.
  • renz496, February 19, 2013 1:31 PM (+9)
    to me this thing is in the same league as Asus ARES II. both product are not something you discuss about price/performance.

    btw very interested how far this 'beast' will overclock
  • bl1nds1de13, February 19, 2013 1:41 PM (+8)
    When compared to the GTX 690, I would have to differ on saying that "there's no real reason not to favor it over Titan"... Any SLI or CrossFire solution, including dual-board cards like the 690, will have microstutters when compared to a single-card setup. This has been thoroughly shown in several tests, and I have seen it myself. A single card will never have scaling issues or microstutters.

    BL1NDS1DE13
  • Au_equus, February 19, 2013 1:45 PM (+2)
    Quote:
    Unfortunately, Nvidia says the 690’s magnesium alloy fan housing was too expensive...
    o.O and $1000 is cheap? The 690 sold for around the same price and nothing was said then. Can they come up with a better excuse? Idk, like aliens stole our magnesium... smh.
  • hero1, February 19, 2013 1:48 PM (+11)
    Pass! I don't think it's worth forking out $2000 for 2 of these cards no matter how good or rare or awesome they are. $2000 gets you a nice i7 rig with 2x AMD Radeon 7970 GHz Ed./ GTX 680 SLI that are more than capable of handling anything you throw at them. Time to go ahead and place that order for GHz cards. Nice to see your face Titan but you ain't selling at the price we are looking for!
  • mayankleoboy1, February 19, 2013 2:01 PM (+2)
    If I really want ultra performance, I would get 2x HD 7970. I would get better gaming and compute performance.
  • aberkae, February 19, 2013 2:02 PM (+3)
    I can't wait for the review.