Graphics Overclocking: Getting The Most From Your GPU

Benchmarks: Nvidia And D.O.T.

Overclocking using the D.O.T. feature only works with MSI’s modified driver, which is based on the older 182.06 release. In this test, we were interested in seeing what kind of benefit driver-level overclocking offers. To this end, we used the card’s factory-overclocked default setting (655/1,404 MHz) as well as the Sergeant and Captain levels of the D.O.T. software. To see what kind of an impact CPU overclocking would have, we also ran all tests with our Core i7-920 once at 2.67 GHz and once at 3.8 GHz.

MSI N260GTX Lightning
| O/C Mode | Clock Speeds (GPU/Shader/Memory, MHz) | Power (3D, W) | Temperature (3D, °C) | Noise (3D, dB(A)) |
|---|---|---|---|---|
| Standard (default) | 655/1,404/999 | 277 | 67.0 | 43.0 |
| MSI D.O.T. Sergeant | 681/1,460/1,038 | 282 | 62.0 | 44.3 |
| MSI D.O.T. Captain | 694/1,488/1,058 | 285 | 59.0 | 47.3 |

One thing to bear in mind when looking at the minimum and maximum frame rates is that they represent the absolute peaks and troughs, which may last only as long as a single frame. The extra performance provided by the overclocked CPU actually gives our Nvidia card the greater boost, namely up to 5.2 fps (or 9 percent) without AA. This time around, overclocking the graphics card can’t compensate for a slower CPU in Far Cry 2. Our Nvidia card topped out at the “Captain” level of the D.O.T. overclocking scale, which isn’t too shabby considering it comes factory-overclocked already.

Average and minimum frame rates improve by between 1 and 2.5 fps. Overclocking raises the minimum frame rate at 1920x1200 with 8xAA and 16xAF to 29.5 fps, which should minimize the effect of performance dips. It’s interesting to note the comparatively small difference between the minimum and average frame rates.

Far Cry 2, GeForce GTX 260, 1920x1200, fps

| O/C Mode, Clocks (GPU/Shader/Memory), CPU, Driver | 0xAA/0xAF Min | Avg | Max | 4xAA/8xAF Min | Avg | Max | 8xAA/16xAF Min | Avg | Max |
|---|---|---|---|---|---|---|---|---|---|
| Standard, 655/1,404/999, i7 @ 2.67 GHz, 182.06 | 42.9 | 55.5 | 88.5 | 34.9 | 44.3 | 61.1 | 27.3 | 34.0 | 48.6 |
| Standard, 655/1,404/999, i7 @ 3.8 GHz, 182.06 | 48.5 | 60.1 | 80.1 | 37.5 | 46.7 | 64.1 | 27.7 | 35.2 | 52.1 |
| D.O.T. Sergeant, 681/1,460/1,038, i7 @ 2.67 GHz, 182.06 | 43.9 | 56.8 | 77.8 | 35.6 | 45.9 | 62.7 | 28.2 | 35.3 | 50.5 |
| D.O.T. Sergeant, 681/1,460/1,038, i7 @ 3.8 GHz, 182.06 | 49.6 | 62.7 | 81.7 | 38.8 | 48.4 | 66.5 | 28.8 | 36.7 | 54.0 |
| D.O.T. Captain, 694/1,488/1,058, i7 @ 2.67 GHz, 182.06 | 44.5 | 57.7 | 87.7 | 36.7 | 46.3 | 70.8 | 27.5 | 35.5 | 56.7 |
| D.O.T. Captain, 694/1,488/1,058, i7 @ 3.8 GHz, 182.06 | 50.4 | 62.3 | 83.3 | 38.5 | 48.7 | 66.8 | 29.5 | 36.9 | 53.8 |
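As a rough sanity check, the relative gains can be computed directly from the averages above (1920x1200 without AA); the helper below is just the standard percentage-change formula, with the fps values transcribed from the table:

```python
def percent_gain(baseline_fps: float, overclocked_fps: float) -> float:
    """Return the relative frame-rate gain as a percentage."""
    return (overclocked_fps - baseline_fps) / baseline_fps * 100

# Average fps with the CPU at stock 2.67 GHz: Standard vs. D.O.T. Captain
gpu_gain = percent_gain(55.5, 57.7)   # GPU overclock alone

# Average fps at Standard GPU clocks: i7 at 2.67 GHz vs. 3.8 GHz
cpu_gain = percent_gain(55.5, 60.1)   # CPU overclock alone

print(f"GPU OC gain: {gpu_gain:.1f}%")   # GPU OC gain: 4.0%
print(f"CPU OC gain: {cpu_gain:.1f}%")   # CPU OC gain: 8.3%
```

The comparison makes the article's point numerically: in Far Cry 2 at these settings, raising the CPU clock buys roughly twice as much average frame rate as the GPU overclock does.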

The highest gains in Left 4 Dead came in at 5.2 fps. The card remained completely stable at these settings and allowed the benchmark to complete without incident.

Left 4 Dead, GeForce GTX 260, 1920x1200, average fps

| O/C Mode, Clocks (GPU/Shader/Memory), CPU, Driver | 0xAA/0xAF | 4xAA/8xAF | 8xAA/16xAF |
|---|---|---|---|
| Standard, 655/1,404/999, i7 @ 2.67 GHz, 182.06 | 108.7 | 83.0 | 68.3 |
| Standard, 655/1,404/999, i7 @ 3.8 GHz, 182.06 | 110.2 | 84.2 | 68.9 |
| D.O.T. Sergeant, 681/1,460/1,038, i7 @ 2.67 GHz, 182.06 | 112.1 | 86.9 | 71.7 |
| D.O.T. Sergeant, 681/1,460/1,038, i7 @ 3.8 GHz, 182.06 | 115.0 | 88.2 | 71.6 |
| D.O.T. Captain, 694/1,488/1,058, i7 @ 2.67 GHz, 182.06 | 113.1 | 88.3 | 71.7 |
| D.O.T. Captain, 694/1,488/1,058, i7 @ 3.8 GHz, 182.06 | 115.4 | 89.5 | 72.7 |
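That 5.2 fps figure can be read straight out of the table: with the CPU at 3.8 GHz and no AA, the average climbs from 110.2 fps at stock clocks to 115.4 fps at the Captain setting. A trivial check (values transcribed from the table above):

```python
# Largest Left 4 Dead gain in the table: Standard vs. D.O.T. Captain,
# both with the Core i7 at 3.8 GHz, 1920x1200, no AA/AF.
standard_avg = 110.2   # average fps at 655/1,404/999 MHz
captain_avg = 115.4    # average fps at 694/1,488/1,058 MHz

gain_fps = captain_avg - standard_avg
gain_pct = gain_fps / standard_avg * 100

print(f"Gain: {gain_fps:.1f} fps ({gain_pct:.1f}%)")  # Gain: 5.2 fps (4.7%)
```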

MSI’s fan speed profile reacts to overclocking and audibly spins up the cooler when the higher, more demanding D.O.T. settings are selected. Even so, the performance gains are rather small. To corroborate this, we ran another quick benchmark at 1280 x 1024 and witnessed similarly small performance gains there as well. The low overall results may stem from the high quality settings we used for this benchmark. Reducing the image quality to attain higher frame rates doesn’t make much sense, though, since these results already represent playable frame rates.

Far Cry 2, GeForce GTX 260, 1280x1024, fps

| O/C Mode, Clocks (GPU/Shader/Memory), CPU, Driver | 0xAA/0xAF Min | Avg | Max | 8xAA/16xAF Min | Avg | Max |
|---|---|---|---|---|---|---|
| Standard, 655/1,404/999, i7 @ 3.8 GHz, 182.06 | 59.8 | 74.4 | 100.7 | 39.0 | 51.0 | 70.9 |
| D.O.T. Sergeant, 681/1,460/1,038, i7 @ 3.8 GHz, 182.06 | 60.6 | 76.6 | 118.1 | 41.4 | 52.7 | 72.9 |
| D.O.T. Captain, 694/1,488/1,058, i7 @ 3.8 GHz, 182.06 | 61.2 | 77.0 | 123.3 | 41.4 | 53.0 | 73.5 |
  • dingumf
    What the hell, I thought this was a guide to overclocking the GPU as the title reads "Graphics Overclocking: Getting The Most From Your GPU"

    Then at the end Tom's Hardware screws me over and writes "Conclusion: It’s A Tie"

    Isn't this a tutorial?
  • They tell you how to overclock using CCC, RivaTuner, or EVGA Precision. They also tell you: overclocking = more performance at the cost of more power. What else do you want?
  • dingumf
    joeman42: What is really needed is a "continuous" OC utility that can detect artifacts during actual use and adjust accordingly. I've noticed that my max OC tends to change each time I test, and depending on the tool I test with (e.g., ATITool, GPUTool, RivaTuner, and my favorite, ATI Tray Tools). Some games, L4D in particular, crash at the slightest error. Others, such as COD and Dead Space, are somewhat tolerant. Games like Far Cry 2 and Fear 2 don't seem to care at all. It would be nice if the utility could take this into account. As for the tools themselves, ATI Tray Tools has far and away the best fan speed adjuster; the dual-ladder temp/speed control is a model of simplicity. Plus, it can automatically sense a game and auto-OC just for the duration. Nothing like this exists on the NV side (you must explicitly specify each exe). Unfortunately, I am on an Nvidia card now and RivaTuner is pretty much the only game in town for serious tweaking. IT IS A DESIGN DISASTER! Random design with no discernible structure. A help file which consists solely of the author bragging about his creation, without explanation as to where each feature is implemented or how to use it. And no, scattered tooltips are not an acceptable alternative. It took forever to figure out that I needed to create a fan profile, then a macro, then a rule to fire the macro containing the fan profile, just to set one(!) fan speed/temp point (and repeat as needed). Sorry for the rant, but I really hate RivaTuner!

    Oh hello. That's what OCCT is for.
  • nitrium
    RivaTuner works just fine with the latest drivers (incl. 190.38). Just check the Power User tab and, under System, set Force Driver Version to 19038 (or, in the article's case, 18618), with no decimal point. Be sure that the hexadecimal display at the bottom is unchecked. All of RivaTuner's usual features can now be accessed.
  • masterjaw
    I don't think this is intended to be an in-depth tutorial like dingumf perceives. It's just for people to realize that they could still get more from their GPUs using tools.

    On the other hand, I don't like the sound of "It's a tie". It looks like it was said just to show neutrality. ATI or Nvidia? It doesn't matter, as long as you're satisfied with it.
  • quantumrand
    I must say, the HD 2900 is a great card. I picked up the 2900 Pro for $250 back in 2007 and flashed the bios to a modified XT bios with slightly higher clocks (850/1000). The memory is only GDDR3, but with the 512bit interface, it really does rival the bandwidth of the 4870. I can get it to run Crysis at Very High, 1440x900 with moderately playable framerates (about 25fps, but the motion blur makes it seem quite smooth). Really quite amazing for any 2007 card, let alone one for $250.
  • quantumrand
    Just a bit of extra info on the 2900 Pro...

    The Pros were basically binned XTs. Once ATI realized that the card was too difficult to manufacture cheaply (something about the high layer count it takes to make a 512-bit PCB), they clocked their excess cores lower and branded them as Pros in order to sell them. They changed the heatsink specs as well, adding an extra heatpipe. Because of this, the Pros could often OC higher than the XTs, making them essentially the best deal on the market (assuming you got a decent core).
  • Ramar
    Overclocking a GPU generally isn't worth it IMO, but sometimes can give that extra push into +60fps average. Or to make yourself feel better about a purchase like myself; one week after I bought a 9800GTX they came out with the GTX+. A little tweaking in EVGA Precision brought an impressive 10% overclock up to GTX+ levels and left me satisfied.
  • manitoublack
    I run two Palit GTX 295s and have had great success with Palit's "Vtune" overclocking software. I believe it works with cards from other vendors as well. Easy to use and driver-independent.

    Cheers for the great article
    Conclusion: It’s A Tie

    I didn't know they were in competition until I read that... I, too, thought this was about overclocking a GPU in general, not about which card you should buy. Once again, Tom's throws that little barb in at the end to stoke the fires.

    I think they do this constantly to get more website hits. If they can get a good ol' fanboy war going on every article, people will come back over and over again to add fuel to the fire. After all, the more hits they get, the more they get paid by their sponsors. Which, BTW, seem to be taking up more real estate than actual content on this site these days.