Graphics Overclocking: Getting The Most From Your GPU

Conclusion: It’s A Tie

ATI’s approach of offering an integrated overclocking feature (ATI Overdrive) right in the driver is a clever one, allowing users to squeeze additional performance out of their cards. As long as the fan speed profile isn’t completely off and the temperature thresholds are recognized correctly, driver-side overclocking should be sufficient. If you want to go beyond that, tools like RivaTuner and the ATI Tray Tools are for you. Offering practical features such as creating your own temperature-triggered fan speed profiles and a wider range of frequencies for overclocking, they will help you get the most out of your card’s GPU and memory.
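
If you’re curious what such a temperature-triggered fan profile amounts to under the hood, here is a minimal Python sketch of the idea; the temperature/duty-cycle points below are made up for illustration and are not defaults from ATI Tray Tools or RivaTuner.

    # A hypothetical fan curve: (temperature in deg C, fan duty cycle in %).
    FAN_CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

    def fan_duty(temp_c):
        """Return the fan duty cycle (%) for a given GPU temperature."""
        if temp_c <= FAN_CURVE[0][0]:
            return FAN_CURVE[0][1]
        if temp_c >= FAN_CURVE[-1][0]:
            return FAN_CURVE[-1][1]
        for (t_lo, d_lo), (t_hi, d_hi) in zip(FAN_CURVE, FAN_CURVE[1:]):
            if t_lo <= temp_c <= t_hi:
                # Interpolate linearly between the two surrounding points.
                frac = (temp_c - t_lo) / (t_hi - t_lo)
                return round(d_lo + frac * (d_hi - d_lo))

    print(fan_duty(68))  # -> 58 (percent duty cycle at 68 deg C on this curve)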

Nvidia, on the other hand, doesn’t grant users the option to overclock its cards directly through the driver (you’ll need to download the company’s System Tools add-on). Instead, several card makers such as EVGA, MSI, Zotac, Asus, and Gainward program higher clock speeds right into their cards’ BIOS. If you own a card that runs at Nvidia’s reference speeds, we recommend taking a look at EVGA’s Precision tool, since RivaTuner’s current 2.24 release no longer works with Nvidia’s latest reference drivers.

The results of our individual tests underscore the impact a current driver has on performance. Indeed, upgrading to the newest driver allowed our candidates to turn out better performance numbers at stock speeds than they managed when overclocked using their manufacturers’ modified (but outdated) driver versions! We keep mentioning this because, while a few board vendors do include some rather sophisticated O/C tools with their cards, these are often built into outdated driver releases, wasting their product’s potential where newer games are concerned. Even worse, some features may be unavailable as a result. For example, MSI’s modified driver 8.542 wouldn’t let us run Far Cry 2 at 1680 x 1050.

When we compare our candidates at their respective stock speeds, ATI scores 398.3 total frames per second, while Nvidia comes in at 378.9 fps. Overclocking both cards to their very limit bumps the scores to 434.2 fps for ATI and 436.8 fps on Nvidia’s side. That means that overclocking the Radeon HD 4870 gives it a performance boost of 9 percent overall, while the GeForce GTX 260 sees a 15.3 percent performance improvement. If you go by the numbers, the GTX offers slightly more overclocking potential. Additionally, it has 1,792 MB of video RAM. On the other hand, it takes more effort to get those results out of the card, since you have to increase the voltage for both the GPU and the memory. And the twin fans run full blast when the card is overclocked to its limit, making it very loud.
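
In case you want to check the math, those percentages follow directly from the fps totals quoted above; a quick Python snippet (just the arithmetic, nothing newly measured) reproduces them:

    # Overclocking gains computed from the fps totals quoted above.
    stock = {"Radeon HD 4870": 398.3, "GeForce GTX 260": 378.9}
    oc    = {"Radeon HD 4870": 434.2, "GeForce GTX 260": 436.8}

    for card in stock:
        gain = (oc[card] / stock[card] - 1) * 100
        print(f"{card}: +{gain:.1f}%")
    # Radeon HD 4870: +9.0%
    # GeForce GTX 260: +15.3%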

In the end, what we have here is really a tie. Even when we look at efficiency measured in frames per watt, the scores are basically identical, coming in at 1.487 for ATI and 1.486 for Nvidia. In absolute terms, the ATI platform draws 292 watts compared to Nvidia’s 294 watts. Thus, whichever card you pick to overclock comes down to personal preference or, perhaps, whether you prefer CUDA or Stream.
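
The frames-per-watt figures work out the same way, dividing each card’s overclocked fps total by its measured platform power draw:

    # Efficiency: overclocked fps total divided by platform power draw.
    fps   = {"ATI": 434.2, "Nvidia": 436.8}
    watts = {"ATI": 292.0, "Nvidia": 294.0}

    for vendor in fps:
        print(f"{vendor}: {fps[vendor] / watts[vendor]:.3f} frames per watt")
    # ATI: 1.487
    # Nvidia: 1.486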

34 comments
  • dingumf
    What the hell, I thought this was a guide to overclocking the GPU as the title reads "Graphics Overclocking: Getting The Most From Your GPU"

    Then at the end Tom's Hardware screws me over and writes "Conclusion: It’s A Tie"

    Isn't this a tutorial?
    20
  • Anonymous
    They tell you how to overclock using CCC, RivaTuner, or EVGA Precision. They also tell you overclocking = more performance at the cost of more power. What else do you want?
    9
  • dingumf
    joeman42: What is really needed is a "continuous" OC utility that can detect artifacts during actual use and adjust accordingly. I've noticed that my max OC tends to change each time I test, and depending on the tool I test with (e.g., ATITool, GPUTool, RivaTuner, and my favorite, ATI Tray Tools). Some games, L4D in particular, crash at the slightest error. Others such as COD and Dead Space are somewhat tolerant. Games like Far Cry 2 and Fear 2 don't seem to care at all. It would be nice if the utility could take this into account.

    As for the tools themselves, ATI Tray Tools has far and away the best fan speed adjuster; the dual-ladder temp/speed control is a model of simplicity. Plus, it can automatically sense a game and auto-OC just for the duration. Nothing like this exists on the NV side (you must explicitly specify each exe). Unfortunately, I am on an Nvidia card now, and RivaTuner is pretty much the only game in town for serious tweaking. IT IS A DESIGN DISASTER! Random design with no discernible structure. A help file that consists solely of the author bragging about his creation, without explanation as to where each feature is implemented or how to use it. And no, scattered tooltips are not an acceptable alternative. It took forever to figure out that I needed to create a fan profile, then a macro, then a rule to fire the macro which contains the fan profile, just to set one(!) fan speed/temp point (and repeat as needed). Sorry for the rant, but I really hate RivaTuner!


    Oh hello. That's what OCCT is for.
    -2
  • nitrium
    RivaTuner works just fine with the latest drivers (incl. 190.38). Just check the Power User tab and, under System, set Force Driver Version to 19038 (or, in the article's case, 18618) - no decimal point. Be sure the hexadecimal display at the bottom is unchecked. All of RivaTuner's usual features can then be accessed.
    -1
  • masterjaw
    I don't think this is intended to be an in-depth tutorial like dingumf perceives. It's just for people to realize that they could still get more from their GPUs using tools.

    On the other hand, I don't like the sound of "It's a tie". It looks like it is said just to show neutrality. ATI or Nvidia? It doesn't matter, as long as you're satisfied with it.
    0
  • quantumrand
    I must say, the HD 2900 is a great card. I picked up the 2900 Pro for $250 back in 2007 and flashed the BIOS to a modified XT BIOS with slightly higher clocks (850/1000). The memory is only GDDR3, but with the 512-bit interface, it really does rival the bandwidth of the 4870. I can get it to run Crysis at Very High, 1440x900, with moderately playable frame rates (about 25 fps, but the motion blur makes it seem quite smooth). Really quite amazing for any 2007 card, let alone one that cost $250.
    1
  • quantumrand
    Just a bit of extra info on the 2900 Pro...

    The Pros were basically binned XTs. Once ATI realized that the card was too difficult to manufacture cheaply (something about the high layer count it takes to make a 512-bit PCB), they clocked the excess cores lower and branded them as Pros in order to sell them. They also changed the heatsink specs, adding an extra heatpipe. Because of this, the Pros could often OC higher than the XTs, making them essentially the best deal on the market (assuming you got a decent core).
    3
  • Ramar
    Overclocking a GPU generally isn't worth it IMO, but sometimes can give that extra push into +60fps average. Or to make yourself feel better about a purchase, like myself: one week after I bought a 9800 GTX, they came out with the GTX+. A little tweaking in EVGA Precision brought an impressive 10% overclock up to GTX+ levels and left me satisfied.
    3
  • manitoublack
    I run two Palit GTX 295s and have had great success with Palit's "Vtune" overclocking software. I believe it works with cards from other vendors as well. Easy to use and driver independent.

    Cheers for the great article
    Jordan
    -1
  • KT_WASP
    Quote:
    Conclusion: It’s A Tie


    I didn't know they were in competition until I read that... I too thought this was about overclocking a GPU in general, not which card you should buy. Once again Tom's throws that little barb at the end to stoke the fires.

    I think they do this constantly to get more website hits. If they can get a good ol' fanboy war going on every article, they will get people coming back over and over again to add fuel to the fire. After all, the more hits they get, the more they get paid by their sponsors. Which, BTW, seem to be taking up more real estate than actual content on this site these days.
    2
  • kayvonjoon
    Are u serious, man??
    I have a 730 MHz core clock on my EVGA GTX 260, 24/7.
    -1
  • amnotanoobie
    Ramar: Overclocking a GPU generally isn't worth it IMO, but sometimes can give that extra push into +60fps average.


    I think it becomes important when the game you're playing hovers at 24+ fps; that minor/major OC might take your game from stutter hell to just about playable. I had a game before that I was able to set to mid-high settings, but the fps hovered at 25; a minor OC helped me push past 30 fps.

    At that time, that was the only game I was playing, and spending money on a new vid card to play this one game IMHO wasn't worth it. The viable cards I could buy back then were the 8600 GT (this thing is a joke), the 8800 GTS 320 MB (too expensive back then), the 2600 XT (a worse joke), and the 2900 (a heat monster).
    -1
  • unknown_2
    I think they are not using ATT correctly.

    As you can see here, I underclock my 4870 to 200/250 with just 0.5 V.

    -1
  • unknown_2
    sorry, copied the wrong link...
    here it is.......

    http://i474.photobucket.com/albums/rr107/agent47_1/att.jpg
    -1
  • wh3resmycar
    Quote:
    Back in its day, ATI’s Radeon HD 2900 XT was a very fast card.


    the author of the article had to be joking.
    -5
  • IzzyCraft
    Against the 8800 GT it was very fast, fast like a rocket (8800) vs. a turtle (2900).
    -5
  • KT_WASP
    wh3resmycar: the author of the article had to be joking.


    The HD 2900 XT was a fast card. Not as fast as Nvidia's flagship 8800 lineup, but not too far off. They were hot and power hungry, but not slow.

    I think that most people remember the disappointment of the real-world performance of the HD 2900 series after all the hype and the projected performance the card had on paper at the time. Going by the specs, it looked to be an 8800 killer, but then it came to market and did no such thing. This is what I think people remember the HD 2900 series by most.

    But if you look past that, it was still a fast card. At the time it was beating CrossFired X1950 XTXs and SLI'ed 7950 GTs, and even 7950 GX2s in SLI.

    So yes, in its day it was a fast card. Was it the fastest? No, but that doesn't mean it was a slow card by any means.

    I think that people need to remember that at that time Nvidia broke all the molds with their 8800 series. It is a credit to Nvidia that they were able to bring performance levels to a new high. But, don't disregard the 2900 series as slow cards.
    5
  • unknown_2
    The 2900 XT was a fast card. Just that it came a little too late. And during that time, Nvidia's 8800 lineup was really, really rocketing.

    I still consider the Nvidia 8800 GTX one of the finest, yet most expensive, cards that I never owned. The 8800 GTX is legendary. Even by today's standards, it still rocks.
    -1
  • bildo123
    I have an XFX 4870 1GB, and ATI Tray Tools DOES let me underclock AND undervolt my card just fine. In fact, I use the awesome auto-overclock feature: whenever it detects that a game is opened, it switches to a profile of your choice, my OC being 855/1000. Also, if you have an old game, like Diablo 2, that doesn't need a full OC, you can add its EXE to the exception list so it stays on the 'idle' profile. I don't know exactly off the top of my head, but at idle it's around 350/450 and the voltage is a tiny pinch above 1 V, around 1.083 or something like that. According to GPU-Z, the core draws about 2.5x less amperage: 11 A compared to 25.5 A.
    0