
Keeping Cool (Enough)

Graphics Overclocking: Getting The Most From Your GPU

Cooling on MSI’s GTX 260

Back in its day, ATI’s Radeon HD 2900 XT was a very fast card. It also happened to be a very loud one. ATI took another approach with the Radeon HD 4870, setting its fan to spin only very slowly in 2D mode, making the card much quieter.

The downside was that the GPU tended to idle at temperatures of around 80 degrees Celsius, while also heating up other components inside the computer case. Although ATI’s reference cooler design features an air duct to channel the warm air out of the case, the fan’s low speed meant that there wasn’t actually enough airflow to do this job. The resulting higher ambient temperature meant that the CPU, hard drive, memory, and other components were running a good deal warmer, especially in cramped or small towers with little airflow. Using a CrossFire setup with two cards directly adjacent to one another is practically impossible without another fan blowing onto the cards from the side, since they’ll be so close together that one of them will literally have no space to breathe.

Palit R700 HD 4870 X2

There has also been speculation that motherboard components which are closer to heat sources may suffer an increased failure rate. Since the reference coolers are very mature designs, thanks in great part to their use of heat pipes, and make very efficient use of the space at their disposal, there aren’t really a lot of options. MSI has opted to increase the stock fan speed even in 2D mode. If you own a card made by another company, you’d usually either have to simply put up with the higher temperatures or tinker with the fan speed profile in the card’s BIOS--at your own risk, obviously.

While they do have the advantage of getting a better handle on heat output, especially under full load, MSI’s fan speed profiles also result in a lot of noise. The dual-GPU Radeon HD 4870 X2 featured similar fan profiles, using the same formula: higher fan speeds for better cooling at the cost of higher noise output. However, this card was also plagued by stability issues. Depending on the manufacturer and the driver version, the card could produce crashes or system freezes when the graphics chips reached a temperature of 85 to 96 degrees Celsius. Palit sought to remedy that problem by equipping its Revolution 700 with a monster cooler that makes the card one of the few triple-slot add-in boards. On the upside, the cooler’s twin 80 mm fans ensure that both cooling and noise are no longer an issue.
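The trade-off these fan speed profiles make can be pictured as a curve mapping GPU temperature to fan duty cycle, with linear interpolation between the set points. Here is a minimal sketch in Python; the temperature/duty points are purely illustrative, not any vendor's actual profile:

```python
# Hypothetical fan speed profile: piecewise-linear mapping from GPU
# temperature (degrees C) to fan duty cycle (%). The curve points are
# illustrative only, not MSI's or ATI's actual values.

def fan_duty(temp_c, curve=((40, 30), (60, 45), (80, 70), (95, 100))):
    """Interpolate a fan duty cycle (%) from a (temp, duty) curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between neighboring curve points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(50))  # midway between the 30% and 45% points -> 37.5
```

Raising the duty values in such a curve is essentially what MSI's more aggressive profiles do: lower temperatures under load, at the cost of more noise.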

The cooler on MSI’s HD 4870

A very welcome side effect of the increased fan speeds is that they tend to allow higher clock speeds as well, which is why MSI calls its modified versions “OC Editions.” Of course, other companies also offer overclocked models with improved coolers. Indeed, the factory overclocked models are often used for marketing purposes, the implication being that they offer better quality in addition to higher performance, justifying their heftier price tag. For example, Gainward offers “Golden Samples” for a variety of models, while Asus improves on the standard models’ performance with its Smart Doctor tool. PowerColor equips its ATI-based cards with improved coolers and then raises their clock speeds, designating these tweaked variants by adding the letters PCS to their model numbers. Zotac takes a different tack, dialing up the frequencies while retaining the reference coolers, calling these models AMP Editions. EVGA calls such versions “Superclocked” and specializes in water-cooled dual-GPU cards with increased clock speeds. Sapphire’s cards use their own coolers, dubbed Vapor-X, while HIS adopted Arctic Cooling’s designs for its IceQ series of ATI cards with enhanced frequencies.

HIS HD 4670 IceQ air with exhaust vent

Several factors can affect a card’s cooling performance. An air duct that funnels the cooler’s warm air out the back of the case is certainly a plus, and most dual-slot reference designs take this approach. Aftermarket solutions, on the other hand, tend to recirculate the cooler’s exhaust back into the case. Their biggest upside is that they have a much larger cooling surface and usually employ heatpipes, offering the GPU optimal thermal performance (especially if there is good airflow through the case). The major drawback is that their exhaust heats up the other components in your chassis, making them less effective as the ambient air warms up. Even a heatsink armed with dual 80 mm fans will have to spin up to audible levels if it’s starved of fresh, cool air.

Memory Cooling

MSI’s models demonstrate both the advantages and the downsides of such altered cooling solutions. The ATI model built around the Radeon HD 4870 runs much cooler than the reference design and quieter than the versions using MSI’s fan speed profile. Its heatsink employs two large heatpipes to transfer the heat to a multitude of cooling fins, and the cooler’s 85 mm fan is nearly silent in 2D mode. Its main disadvantages are that it doesn’t cool the card’s memory modules and doesn’t expel its warm air out of the case, raising the temperature inside.

The Nvidia model using the GeForce GTX 260 doesn’t offer any improvements when the card operates at default settings. Once you overclock it, though, the dual fan solution begins to show its true potential. While the stock cooler hits a wall at a certain point, the Superpipe cooler can still spin its fans a little faster yet. It also helps that it is able to distribute its heat across five large heatpipes. This design already proved its worth on the GTX 285, and the GTX 260 Lightning adopts it to very good effect. It even cools the memory modules.

Cooler Comparison

Card                   | Temp 2D (°C) | dB(A) 2D | Temp 3D (°C) | dB(A) 3D
MSI R4870-MD1G         | 46           | 36.0     | 70           | 40.6
MSI R4870-T2D512       | 60           | 38.0     | 74           | 49.4
MSI N260GTX Lightning  | 43           | 36.5     | 67           | 43.0
Zotac GTX 260 216SPs   | 45           | 37.5     | 81           | 41.2

Comments
  • dingumf, July 27, 2009 6:20 AM (+20)
    What the hell, I thought this was a guide to overclocking the GPU as the title reads "Graphics Overclocking: Getting The Most From Your GPU"

    Then at the end Tom's Hardware screws me over and writes "Conclusion: It’s A Tie"

    Isn't this a tutorial?
  • Anonymous, July 27, 2009 6:32 AM (+9)
    They tell you how to overclock using CCC, RivaTuner, or EVGA Precision; they also tell you overclocking = more performance at the cost of more power. What else do you want?
  • dingumf, July 27, 2009 6:54 AM (-2)
    Quote (joeman42):
    What is really needed is a "continuous" OC utility that can detect artifacts during actual use and adjust accordingly. I've noticed that my max OC tends to change each time I test, and depending on the tool I test with (e.g., ATITool, GPUTool, RivaTuner, and my favorite, ATI Tray Tool). Some games, L4D in particular, crash at the slightest error. Others, such as COD and Dead Space, are somewhat tolerant. Games like Far Cry 2 and FEAR 2 don't seem to care at all. It would be nice if the utility could take this into account. As for the tools themselves, ATI Tray Tool has far and away the best fan speed adjuster; the dual-ladder Temp/Speed is a model of simplicity. Plus, it can automatically sense a game and auto-OC just for the duration. Nothing like this exists on the NV side (you must explicitly specify each exe). Unfortunately, I am on an Nvidia card now, and RivaTuner is pretty much the only game in town for serious tweaking. IT IS A DESIGN DISASTER! Random design with no discernible structure. A help file which consists solely of the author bragging about his creation, without explanation as to where each feature is implemented or how to use it. And no, scattered tooltips are not an acceptable alternative. It took forever to figure out that I needed to create a fan profile, then a macro, and then create a rule to fire the macro which contains the fan profile, just to set one(!) fan speed/temp point (and repeat as needed). Sorry for the rant, but I really hate RivaTuner!


    Oh hello. That's what OCCT is for.
  • nitrium, July 27, 2009 7:12 AM (-1)
    RivaTuner works just fine with the latest drivers (incl. 190.38). Just check the Power User tab and, under System, set Force Driver Version to 19038 (or in this article's case, 18618) -- no decimal point. Be sure that the hexadecimal display at the bottom is unchecked. All of RivaTuner's usual features can then be accessed.
  • masterjaw, July 27, 2009 8:08 AM (0)
    I don't think this is intended to be an in-depth tutorial like dingumf perceives. It's just for people to realize that they could still get more from their GPUs using tools.

    On the other hand, I don't like the sound of "It's a tie". It looks like it was said just to show neutrality. ATI or Nvidia? It doesn't matter, as long as you're satisfied with it.
  • quantumrand, July 27, 2009 8:43 AM (+1)
    I must say, the HD 2900 is a great card. I picked up the 2900 Pro for $250 back in 2007 and flashed the BIOS to a modified XT BIOS with slightly higher clocks (850/1000). The memory is only GDDR3, but with the 512-bit interface, it really does rival the bandwidth of the 4870. I can get it to run Crysis at Very High, 1440x900, with moderately playable frame rates (about 25 fps, but the motion blur makes it seem quite smooth). Really quite amazing for any 2007 card, let alone one that cost $250.
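The bandwidth comparison in the comment above is easy to sanity-check with peak-bandwidth arithmetic. The figures below are assumptions based on commonly cited specs (a 512-bit GDDR3 bus flashed to 1,000 MHz, two transfers per clock, versus the HD 4870's 256-bit GDDR5 at 900 MHz, four transfers per clock):

```python
# Back-of-the-envelope peak memory bandwidth. Specs are assumptions:
# GDDR3 transfers data twice per clock, GDDR5 four times per clock.

def bandwidth_gbs(bus_bits, mem_clock_mhz, transfers_per_clock):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfer rate."""
    return bus_bits / 8 * mem_clock_mhz * transfers_per_clock / 1000

hd2900_oc = bandwidth_gbs(512, 1000, 2)  # flashed 2900 Pro -> 128.0 GB/s
hd4870    = bandwidth_gbs(256, 900, 4)   # stock HD 4870    -> 115.2 GB/s
print(hd2900_oc, hd4870)
```

On those assumptions, the flashed 2900 Pro's ~128 GB/s does indeed edge out the HD 4870's ~115 GB/s, supporting the commenter's claim.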
  • quantumrand, July 27, 2009 8:46 AM (+3)
    Just a bit of extra info on the 2900 Pro...

    The Pros were basically binned XTs; once ATI realized that the card was too difficult to manufacture cheaply (something about the high layer count it takes to make a 512-bit PCB), they clocked their excess cores lower and branded them as Pros in order to sell them. Additionally, they changed the heatsink specs, adding an extra heatpipe. Because of this, the Pros could often OC higher than the XTs, making them essentially the best deal on the market (assuming you got a decent core).
  • Ramar, July 27, 2009 9:05 AM (+3)
    Overclocking a GPU generally isn't worth it IMO, but sometimes it can give that extra push past a 60 fps average. Or make you feel better about a purchase, as in my case: one week after I bought a 9800 GTX, they came out with the GTX+. A little tweaking in EVGA Precision brought an impressive 10% overclock up to GTX+ levels and left me satisfied.
  • manitoublack, July 27, 2009 9:34 AM (-1)
    I run two Palit GTX 295s and have had great success with Palit's "Vtune" overclocking software. I believe it works with cards from other vendors as well. Easy to use and driver independent.

    Cheers for the great article
    Jordan
  • KT_WASP, July 27, 2009 10:07 AM (+2)
    Quote:
    Conclusion: It’s A Tie


    I didn't know they were in competition until I read that... I too thought this was about overclocking a GPU in general, not which card you should buy. Once again, Tom's throws that little barb at the end to stoke the fires.

    I think they do this constantly to get more website hits. If they can get a good ol' fanboy war going on every article, people will come back over and over again to add fuel to the fire. After all, the more hits they get, the more they get paid by their sponsors. Which, BTW, seem to be taking up more real estate than actual content on this site these days.

  • kayvonjoon, July 27, 2009 10:16 AM (-1)
    Are you serious, man??
    I have a 730 MHz core clock on my EVGA GTX 260, 24/7.
  • amnotanoobie, July 27, 2009 10:37 AM (-1)
    Quote (Ramar):
    Overclocking a GPU generally isn't worth it IMO, but sometimes can give that extra push into +60fps average.


    I think it becomes important when the game you're playing hovers at 24+ fps; that minor/major OC might take your game from stutter hell to just about playable. I had a game before that I was able to set to mid-high settings, but the fps hovered at 25; the minor OC helped me push past 30 fps.

    At that time, that was the only game I was playing, and spending money on a video card to play this one game IMHO wasn't worth it. The viable cards I could buy at the time were the 8600 GT (this thing is a joke), the 8800 GTS 320 MB (too expensive back then), the 2600 XT (a worse joke), and the 2900 (a heat monster).
  • unknown_2, July 27, 2009 10:43 AM (-1)
    I think they are not using the ATT correctly.

    As you can see here, I underclock my 4870 to 200/250 with just 0.5v

  • unknown_2, July 27, 2009 10:44 AM (-1)
    http://i474.photobucket.com/albums/rr107/agent47_1/hd4890_027.jpg
  • unknown_2, July 27, 2009 10:45 AM (-1)
    sorry, copied the wrong link...
    here it is.......

    http://i474.photobucket.com/albums/rr107/agent47_1/att.jpg
  • wh3resmycar, July 27, 2009 1:47 PM (-5)
    Quote:
    Back in its day, ATI’s Radeon HD 2900 XT was a very fast card.


    the author of the article had to be joking.
  • IzzyCraft, July 27, 2009 1:55 PM (-5)
    Against the 8800 GT it was very fast, fast like a rocket (8800) vs. a turtle (2900).
  • KT_WASP, July 27, 2009 2:09 PM (+5)
    Quote (wh3resmycar):
    the author of the article had to be joking.


    The HD 2900 XT was a fast card. Not as fast as Nvidia's flagship 8800 lineup, but not too far off. They were hot and power hungry, but not slow.

    I think most people remember the disappointment of the HD 2900 series' real-world performance after all the hype and the projected performance the card had on paper. Going by the specs, it looked to be an 8800 killer, but then it came to market and did no such thing. This is what I think people remember the HD 2900 series by most.

    But if you look past that, it was still a fast card. At that time it was beating CrossFired X1950 XTXs and SLI'ed 7950 GTs, and even 7950 GX2s in SLI.

    So yes, in its day it was a fast card. Was it the fastest? No, but that doesn't mean it was a slow card by any means.

    I think that people need to remember that at that time Nvidia broke all the molds with their 8800 series. It is a credit to Nvidia that they were able to bring performance levels to a new high. But, don't disregard the 2900 series as slow cards.
  • unknown_2, July 27, 2009 3:40 PM (-1)
    The 2900 XT was a fast card. It just came a little too late. And during that time, Nvidia's 8800 lineup was really, really rocketing.

    I still consider the Nvidia 8800 GTX one of the finest, yet most expensive, cards that I never owned. The 8800 GTX is legendary. Even by today's standards, it still rocks.
  • bildo123, July 27, 2009 3:54 PM (0)
    I have an XFX 4870 1GB, and using ATI Tray Tools DOES let me underclock AND undervolt my card just fine. In fact, I use the awesome auto-overclock feature. Whenever it detects that a game has opened, it switches to a profile of your choice, my OC being 855/1000. Also, if you have an old game, like Diablo 2, that doesn't need a full OC, you can add its EXE to the exception list so it stays on the 'idle' profile. I don't know the exact numbers off the top of my head, but at idle it's around 350/450 and the voltage is a tiny pinch above 1 V, around 1.083 or something like that. According to GPU-Z, the core draws less than half the current: 11 A compared to 25.5 A.
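The auto-overclock behavior described in the comment above reduces to a simple decision rule: apply the OC profile when a known game executable is running, unless that executable is on the exception list. Below is a hypothetical sketch of just that logic; the clock values mirror those in the comment, and real tools such as ATI Tray Tools hook process creation rather than comparing name sets:

```python
# Hypothetical profile-switching logic: OC profile when a game exe runs,
# idle profile otherwise. Clock pairs are the values quoted in the comment.

IDLE = {"core": 350, "mem": 450}
OC   = {"core": 855, "mem": 1000}

def pick_profile(running_exes, game_exes, exceptions):
    """Return the clock profile to apply for the current set of processes."""
    # Games on the exception list don't trigger the overclock profile.
    active_games = (set(running_exes) & set(game_exes)) - set(exceptions)
    return OC if active_games else IDLE

# A low-demand title on the exception list stays at idle clocks:
print(pick_profile({"diablo2.exe", "explorer.exe"},
                   {"diablo2.exe", "crysis.exe"},
                   {"diablo2.exe"}))  # -> the idle profile
```

The exception list is the key design choice: it lets the tool default to overclocking for games while sparing titles that would only waste power at full clocks.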