
Radeon HD 5770, Radeon HD 4890, And GeForce GTX 275 Overclocked


Writing an article about overclocking is always a mixed blessing. On one hand, it gives us an excuse to push hardware to its limits and enjoy the feeling of garnering extra performance for free. On the other hand, we’re always very aware that our readers may not be able to achieve the same levels of success because there are so many factors involved.

Cooling is a good example. While the reference designs that grace the first cards to hit the market do their jobs well enough, after-market solutions tend to yield better results, as do models carrying a manufacturer’s own design. Heatpipes are good, and more heatpipes tend to be better. Along the same lines, a bigger heat sink is also beneficial, as the thermal load can be dissipated over a larger area. This is a very important point, since a reference cooler is designed to keep the GPU at a temperature of about 80 to 90 degrees Celsius at stock speeds while (hopefully) producing only moderate noise.
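
To put the surface-area argument in numbers: at steady state, the heat a sink can shed is roughly proportional to its fin area and to its temperature difference with the surrounding air (Newton's law of cooling). The sketch below is purely illustrative; the 150 W load and the heat-transfer coefficient are assumed values, not measurements from any card in this article.

```python
def delta_t(power_w, h, area_m2):
    """Steady-state temperature rise above ambient, from Newton's law of
    cooling: power = h * area * delta_T, so delta_T = power / (h * area)."""
    return power_w / (h * area_m2)

# Assumed values: a 150 W GPU and an effective heat-transfer coefficient
# of h = 50 W/(m^2*K) under forced airflow.
small_sink = delta_t(150, 50, 0.05)  # 0.05 m^2 of fin area -> 60 K above ambient
large_sink = delta_t(150, 50, 0.10)  # doubling the area halves the rise -> 30 K
```

Doubling the fin area halves the temperature rise at the same load, which is why oversized after-market coolers buy extra overclocking headroom.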

If you go and increase your GPU's core frequency, resulting in higher heat output, the fan will become progressively louder, spinning up until it reaches its maximum speed in an effort to keep the graphics chip from overheating. Once this point is reached, the card is running at the highest clock speed the cooler will enable. Usually, stock coolers only provide a little headroom. And while the fan will try to cope with the additional heat by blowing more air onto the cooler, it can only do so much with the surface area it has at its disposal.
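
The fan behavior described above amounts to a simple control loop: duty cycle climbs with temperature until it saturates at 100 percent, at which point the cooler has no headroom left. A minimal sketch (the target temperature, gain, and duty limits are invented for illustration, not taken from any real fan controller):

```python
def fan_duty(temp_c, target_c=80.0, min_duty=0.40, max_duty=1.00, gain=0.03):
    """Toy proportional fan controller: duty rises with the amount the GPU
    temperature exceeds the target, clamped to the fan's physical range."""
    duty = min_duty + gain * max(0.0, temp_c - target_c)
    return min(max_duty, duty)

fan_duty(75)   # at or below target: fan stays at its quiet minimum duty
fan_duty(110)  # far above target: duty is clamped at 100 percent
```

Once the controller returns 1.0, the fan is maxed out; any additional overclock only raises the GPU temperature.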

Something similar happens when you have two brawny graphics cards working together in an SLI or CrossFire setup. Although the cards' fans spin up from 85 to 100 percent duty cycle, they end up fighting a losing battle against the heat output of two cards running at full blast. The problem here is airflow. Too little cool air enters the case, so what little air there is inside the enclosure gets circulated again and again, heating up in the process. At some point, the air gets so warm that the cooler effectively stops cooling the GPU, resulting in a temperature buildup in the GPU, as well as other components.

To recap: the higher the ambient temperature in the case, the harder it is to keep the graphics card cool.

Two more points to consider are the card’s BIOS and its graphics drivers. Ideally, both should support overclocking and clock scaling (lowering the clock speeds for the GPU and memory when the card is in 2D mode) where possible. In some cases, card makers don’t do their homework. They overclock the GPU by five percent and sell the card as an OC Edition. While this does boost 3D performance under load, locking the graphics chip into a higher frequency means the board doesn't scale down when idle. The only winning party here is your power company. The other side effect of the GPU being stuck at max speed is that it constantly produces the same amount of heat, whether you’re browsing the Web or playing a graphically demanding 3D game. The cooler’s fans are never idle, either.

As a rule of thumb, frequencies should drop to 300/600/100 MHz (GPU/shader/memory) in 2D mode on Nvidia cards. For cards built around an ATI chip, scaling depends on the model. Although the graphics chip is usually down-clocked to 240 or 500 MHz, the GDDR5 memory on older cards tends to stay at its top speed. Thus, overclocking also has an immediate effect on idle power consumption for either vendor's products, because the cards keep running at the higher clocks in 2D mode.
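
To see why a locked clock hurts at idle, note that dynamic power scales roughly linearly with clock speed at a fixed voltage. A back-of-the-envelope sketch, using the 300 MHz 2D core clock mentioned above; the 633 MHz locked clock and the 30 W idle figure are assumptions for illustration, not measured values:

```python
def locked_idle_power(idle_clock_mhz, locked_clock_mhz, idle_power_w):
    """Rough estimate: at fixed voltage, dynamic power scales about linearly
    with clock, so a card stuck at its 3D clock draws proportionally more
    when idling than one that downclocks properly."""
    return idle_power_w * locked_clock_mhz / idle_clock_mhz

# A card that should idle at 300 MHz / 30 W, but stays locked at 633 MHz:
stuck = locked_idle_power(300, 633, 30.0)  # roughly double the idle draw
```

The exact wattage depends on the card, but the proportional penalty is the point: the power company wins every hour the machine sits at the desktop.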

The final factor, GPU voltage, is also controlled by the card maker. In the worst case, the card’s voltage is locked in place and overclocking such a card will result in very little margin. Although there are some ways to alter GPU voltage using BIOS modifications or utilities like ATI Tool, even these aren’t guaranteed to work for all cards. Also, don’t forget that overclocking your card almost certainly voids your warranty, since you’re running your hardware outside of specifications the manufacturer considers safe. Almost all current graphics cards have a mechanism to prevent overheating. Once they reach 100 degrees Celsius or so, they automatically reduce their clock speed. But no matter what, raising the GPU’s voltage significantly increases the risk of damaging your hardware.

MSI ventured into voltage-mod territory with its GeForce GTX 260 Lightning, a card that uses a special driver add-on allowing users to increase the GPU voltage at higher clock speeds to improve stability. MSI seems to have become more cautious with the card’s successor, the GeForce GTX 275 Lightning. Its software now displays a warning explaining that, while changes to certain parameters are possible, they can also result in damage to the hardware.

Our objective today is to attempt to reach the performance level of the next-highest class of graphics hardware through overclocking. We ordered two special models from MSI for this test that come with beefier cooling solutions and are sold as OC editions, namely the GTX 275 Lightning and the HD 4890 Cyclone SOC. As it turns out, reaching our goal was a snap on the GTX 275 Lightning, which can take on a reference GeForce GTX 285 once it is overclocked. ATI’s Radeon HD 4890 is already the fastest single-GPU card in the 4800-series, so we’ll only be comparing the improved overclocked performance to that of the reference card. Our final candidate is ATI’s brand new Radeon HD 5770. Without giving away too much, we can say that this card blew us away, amply demonstrating the scalability of the 40nm production process.

Top Comments
  • 16
    wickedsnow , November 24, 2009 9:52 AM
    While I normally refrain from ever commenting on video card reviews, I could not resist this.

    I agree with falchard (to a degree): while I don't think the review is biased, I do think something is not right about it. In most of the games listed, the 4890 (1024 version) is not only losing to the GTX 260 192 AND 216 versions, but losing by a huge margin. I own both cards myself in two machines that are the same (except the video cards), and 99% of the time, the 4890 spanks my other rig (with an EVGA SSC GTX 260 Core 216).

    I'm not saying anything is biased (just a reminder), I am saying something just is not right. PSU not big enough, wrong drivers... etc., no idea.
  • 12
    falchard , November 24, 2009 9:12 AM
    Benchmark suite is kind of gimped again. Every game selected except Fear 2 is designed to gimp ATI hardware. I guess it's OK if you are comparing ATI to ATI, but when you say the GTX 275 is a better buy over the HD 4890 based on this review, it's completely biased. You didn't even come close to portraying the HD 4890 in any sort of fair comparison.
  • 11
    quantumrand , November 24, 2009 6:12 AM
    I'm really disappointed that there aren't any benchmarks from the 5870 or 5850 included. Why even bother with the GTX 295 or 4870 X2 and such without the higher 5-series Radeons?

    I mean if I'm considering an ATI card, I'm going to want to compare the 5770 to the 5850 and 5870 just to see if that extra cost may be justified, not to mention the potential of a dual 5770 setup.
Other Comments
  • 1
    amdgamer666 , November 24, 2009 5:22 AM
    Nice article. Ever since the 5770 came out, I've been wondering how far someone could push the memory to relieve that bottleneck. Being able to push it to 1430 allows it to be competitive with its older sibling and makes it enticing (with the 5700 series' extra features, of course).
  • 1
    Onyx2291 , November 24, 2009 5:30 AM
    Damn, some of these cards run really well at 1920x1200, which is what I run at. Could pick up a lower one and run just about anything at a decent speed if I overclock well. Good ol' charts :)
  • 9
    skora , November 24, 2009 5:47 AM
    If you're trying to get to the next card's performance by OCing, shouldn't the 5850 be benched also? I know the 5770 isn't going to get there because of the memory bandwidth issue, but you missed the mark. One card is compared to its big brother, but the other two aren't.

    I am glad to see the 5770 produce playable frame rates at 1920x1200. Nice game selection also.
  • 5
    presidenteody , November 24, 2009 6:26 AM
    I don't care what this article says; when the 5870 or 5970 becomes available, I am going to buy a few.
  • 0
    kartu , November 24, 2009 6:27 AM
    Well, at least in Germany, the 4870 costs quite a bit less (30-40 Euros) than the 5770. It would take 2+ years of playing to make up the difference through lower power consumption.
  • -3
    kartu , November 24, 2009 6:30 AM
    "Power Consumption, Noise, And Temperature" charts are hard to comprehend. Show bars instead of numbers, maybe?
  • -3
    arkadi , November 24, 2009 7:08 AM
    Well, that puts things in perspective. I was really happy with the GTX 260 numbers, and I can push my EVGA card even higher easily. Too bad we didn't see the 5850 here; it looks like the optimal upgrade for gamers on a budget like myself. Great article overall.
  • 0
    B16CXHatch , November 24, 2009 7:08 AM
    I got lucky with my card. Before, I had a SuperClocked 8800GT from EVGA. A while back, I ordered a new EVGA GeForce GTX 275 (896MB), figuring the extra cash wasn't worth an overclocked model, particularly when I could do it myself. I get it, I try to register it, and the S/N on mine was a duplicate. They sent me an unused S/N to register with. I then check the speeds under one utility and it's showing GTX 275 SuperClocked speeds, not regular speeds. I check two more utilities and they all report the same. I had paid for a regular model and received a mislabeled SuperClocked. Flippin' sweet.

    Now they also sell an SSC model which is overclocked even more. I used the EVGA Precision tool to set those speeds and it gave me like 1 or 2 extra FPS in Crysis, and F.E.A.R. 2 already played so well without overclocking. So overclocking on these bad boys doesn't really do much. Oh well.

    One comment though: GTX 275s are HOT! Like, ridiculously hot. I open my window in 40-degree F weather and it'll still get warm in my room playing Team Fortress 2.
  • 3
    Anonymous , November 24, 2009 7:40 AM
    With the 5970 out, there seems to be nothing else about graphics cards that interests me anymore :D It's supposed to be the fastest card yet and it beats Crysis, too!
  • -3
    Anonymous , November 24, 2009 7:48 AM
    Excellent article [hindered by poor chart].
  • -8
    notty22 , November 24, 2009 10:21 AM
    The 5770 does not perform well. It's overpriced right now. All playable numbers, but the Nvidia cards spank it for the same money.
  • 0
    brisingamen , November 24, 2009 11:17 AM
    The 5770 has great overclocking potential with the stock cooler; with a good cooler the numbers could be phenomenal, and in a CrossFire setup it would really be nice. Also, it is DirectX 11 and can do things both the 4890 and 275 cannot. Deals will be available on the 5770 sooner than on any of the higher models. I'm considering getting two and overclocking the shenanigans out of them. The 275 spanks nothing with its old tech; IQ matters.
  • 0
    sparky13 , November 24, 2009 11:32 AM
    I think a better 4890 to use instead of the MSI would be the Gigabyte 4890 OC model I have in my system right now. That MSI cooler is decent, but the way it's secured to the GPU is just pitiful.

    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/24770-value-meets-performance-hd-4890-cards-gigabyte-msi-19.html

    The Gigabyte comes with a Zalman cooler and is factory OC'd to 900 MHz. I pushed it to 975 MHz and it didn't break a sweat. Idle temps hover around 30-34 C. Under load it rarely breaks 52 C. The Zalman is a beast. It stays quiet, too, barely audible under my TriCool LED fans on the low setting. I recommend it to anyone looking for a GPU in the $170 range.
  • 4
    scrumworks , November 24, 2009 12:00 PM
    falchard: Benchmark suite is kind of gimped again. Every game selected except Fear 2 is designed to gimp ATI hardware. I guess it's OK if you are comparing ATI to ATI, but when you say the GTX 275 is a better buy over the HD 4890 based on this review, it's completely biased. You didn't even come close to portraying the HD 4890 in any sort of fair comparison.

    I haven't seen a single review from the author that isn't somehow made selectively Nvidia-biased. The Last Remnant, HAWX DX10.0, no HD 5870/HD 5970 are just quick examples. Reviewers should stay absolutely neutral in these matters and arrange proper conditions for all parties.

    I won't analyze the results any deeper, but it seems like Radeons don't perform quite as well as they do in many other reviews.
  • 1
    cinergy , November 24, 2009 12:08 PM
    notty22: The 5770 does not perform well. It's overpriced right now. All playable numbers, but the Nvidia cards spank it for the same money.

    I guess they do, if you don't care about DX11 and lower power consumption. I think AMD can easily drop HD 5x00 prices after supply starts exceeding demand.
  • 7
    cknobman , November 24, 2009 12:44 PM
    Nice article... that is, if you're in the Nvidia camp. Gotta love your sponsors, right???

    Guess I need to go to Anand or TweakTown to get a non-Nvidia-biased review.

    Don't come back giving me some crap about how you can't help what games favor Nvidia, because you can... don't include them in a damn review of ATI cards!!!!!! How come you just so happened to exclude every game that favors ATI???

    You're a tool and a fool, Kreiss!!
  • -4
    siliconchampion , November 24, 2009 1:56 PM
    Whoa, people, people!

    Perhaps instead of flaming the author over his choice of Nvidia-favoring titles, you could make some helpful suggestions of game titles you would like to see benchmarked...

    Personally, I would love to see some CoD4 and MW2 benchies, but that's just me.