
ATI Radeon HD 5770: Memory With A Need For Speed

Radeon HD 5770, Radeon HD 4890, And GeForce GTX 275 Overclocked


Wearing a black shroud, the Radeon HD 5770 is a rather inconspicuous card. Don’t be fooled, though. Beneath that classy exterior lies a veritable panther with very sharp claws. Our test results are clear: the Radeon HD 5770 is perfectly capable of replacing the Radeon HD 4870. The new card is quieter than its predecessor and draws less power, too. Indeed, in both 2D and 3D mode, the HD 5770 consumes about 50 watts less than the 4870, while offering virtually the same performance in 3D games. Since it is manufactured on a 40 nm process, the GPU never runs at more than 69 degrees Celsius, which isn’t much of a challenge for the cooler. Consequently, the card purrs along at an audible but unobtrusive 39 dB(A).

Actually, we’d be hard pressed to identify a single drawback compared to the older card. What’s not to like? The new model consumes less power, produces less heat, runs quieter, supports DirectX 11, and offers performance that rivals that of the HD 4870, even with the first batch of drivers. Connectivity is good, too. The card sports two DVI connectors, one HDMI, and one DisplayPort output.

The real focus of this article is overclocking MSI’s GTX 275 and HD 4890. However, since we had a reference Radeon HD 5770 handy, we decided this would be a good opportunity to see how far we could push ATI’s newest generation of mainstream cards. At stock speeds, our reference card clocks the GPU at 850 MHz and the memory at 1,200 MHz. ATI opts for GDDR5 memory for all members of the Radeon HD 5000-series (so far), although the memory bus is only 128 bits wide on this model.

We used ATI's own Catalyst graphics driver for our overclocking experiments, since its Overdrive utility offers plenty of headroom. As it turned out, our sample’s GPU was already close to its limit. We started off with a GPU frequency of 920 MHz, which we had to reduce to 895 MHz to stabilize the card. The memory was a completely different matter, though, and offered exceptional potential. Back in the days of the 4000-series, you considered yourself lucky if you managed to squeeze out an extra 10 percent by reaching 1,200 MHz. Our Radeon HD 5770 allowed us to set the memory frequency to no less than 1,445 MHz, which we had to lower slightly to 1,430 MHz to ensure stability. An overclock from 1,200 MHz to 1,430 MHz means a clock speed increase of 19.2 percent. Not bad at all for a mainstream card.
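The clock-speed math above is easy to sanity-check. Here is a minimal Python sketch that reproduces the article's figures; the `oc_percent` helper is a hypothetical name for illustration, not part of any overclocking tool:

```python
# Compute an overclock as a percentage increase over the stock frequency.
def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Return the relative clock-speed increase in percent."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

# Figures from the article: GPU 850 -> 895 MHz, memory 1,200 -> 1,430 MHz.
gpu_gain = oc_percent(850, 895)
mem_gain = oc_percent(1200, 1430)

print(f"GPU overclock: {gpu_gain:.1f}%")     # ~5.3 percent
print(f"Memory overclock: {mem_gain:.1f}%")  # ~19.2 percent
```

The memory gain is nearly four times the GPU gain, which matches the article's point that the GDDR5 headroom, not the core, is where this card shines.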

Frequency in MHz                  GPU    Percent    Memory    Percent
ATI Radeon HD 5770 Overclocked    895    105.3      1,430     119.2
ATI Radeon HD 5770                850    100.0      1,200     100.0


As mentioned, the Radeon HD 5770 is already tied with the HD 4870 at stock speeds. Once overclocked, though, it gains another eight percent. That’s still a few percentage points short of the Radeon HD 4890's performance, but certainly not an insurmountable gap; it could conceivably be made up by driver-based improvements to the 5770.

However, the really good news here is that overclocking has no adverse effect on 2D power consumption or noise, since the card reverts to the same lowered clock speeds as before. Under load, its temperature only increases from 69 to 72 degrees Celsius, causing the fan's noise level to rise from 39 to 40.5 dB(A). But here’s the surprise: power consumption increases by a mere 6 watts, showcasing the benefits of the 40 nm production process.

Graphics Card and Chip Class                 FPS        Percent
Radeon HD 4890 (1,024 MB)                    1,523.6    114.8
ATI Radeon HD 5770 Overclocked (1,024 MB)    1,433.5    108.0
ATI Radeon HD 5770 (1,024 MB)                1,332.9    100.4
Radeon HD 4870 (512 MB)                      1,327.1    100.0
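The Percent column in this table is simply each card's cumulative FPS normalized against the Radeon HD 4870 baseline. A short Python sketch (assuming that normalization, which the table's 100.0 baseline row implies) reproduces the numbers:

```python
# Cumulative FPS figures taken from the table above.
cards = {
    "Radeon HD 4890 (1,024 MB)": 1523.6,
    "ATI Radeon HD 5770 Overclocked (1,024 MB)": 1433.5,
    "ATI Radeon HD 5770 (1,024 MB)": 1332.9,
    "Radeon HD 4870 (512 MB)": 1327.1,
}

# Normalize each result against the HD 4870 baseline (100.0 percent).
baseline = cards["Radeon HD 4870 (512 MB)"]
percent = {name: round(fps / baseline * 100, 1) for name, fps in cards.items()}

for name, pct in percent.items():
    print(f"{name}: {pct}%")
```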


ATI has outdone itself. The Radeon HD 5770 really is a fully-featured card, combining 1 GB of fast GDDR5 memory, much lower power consumption, low noise levels, an optimized driver with a real 2D mode, good overclocking capabilities, a well-balanced fan speed profile, HDMI and DisplayPort connectivity alongside DVI, and DirectX 11 support.

Top Comments
  • 16 Hide
    wickedsnow , November 24, 2009 9:52 AM
    While I normally refrain from ever commenting on video card reviews, I could not resist this.

    I agree with falchard (to a degree): while I don't think the review is biased, I do think something is not right about it. In most of the games listed, the 4890 (1,024 MB version) is not only losing to the GTX 260 192 AND 216 versions, but losing by a huge margin. I own both cards myself in 2 machines that are the same (except the video cards), and 99% of the time, the 4890 spanks my other rig (with an EVGA SSC GTX 260 Core 216).

    I'm not saying anything is biased (just a reminder), I am saying something just is not right. PSU not big enough, wrong drivers... etc., etc... no idea.
  • 12 Hide
    falchard , November 24, 2009 9:12 AM
    Benchmark suite is kind of gimped again. Every game selected except F.E.A.R. 2 is designed to gimp ATI hardware. I guess it's OK if you are comparing ATI to ATI, but when you say the GTX 275 is a better buy over the HD 4890 based on this review, it's completely biased. You didn't even come close to portraying the HD 4890 in any sort of fair comparison.
  • 11 Hide
    quantumrand , November 24, 2009 6:12 AM
    I'm really disappointed that there aren't any benchmarks from the 5870 or 5850 included. Why even bother with the GTX 295 or 4870 X2 and such without the higher 5-series Radeons?

    I mean if I'm considering an ATI card, I'm going to want to compare the 5770 to the 5850 and 5870 just to see if that extra cost may be justified, not to mention the potential of a dual 5770 setup.
Other Comments
  • 1 Hide
    amdgamer666 , November 24, 2009 5:22 AM
    Nice article. Ever since the 5770 came out, I've been wondering how far someone could push the memory to relieve that bottleneck. Being able to push it to 1,430 MHz allows it to be competitive with its older sibling and makes it enticing (with the 5700 series' extra features, of course).
  • 1 Hide
    Onyx2291 , November 24, 2009 5:30 AM
    Damn, some of these cards run really well at 1920x1200, which I run at. I could pick up a lower one and run just about anything at a decent speed if I overclock well. Good ol' charts :)
  • 9 Hide
    skora , November 24, 2009 5:47 AM
    If you're trying to get to the next card's performance by OCing, shouldn't the 5850 be benched also? I know the 5770 isn't going to get there because of the memory bandwidth issue, but you missed the mark. One card is compared to its big brother, but the other two aren't.

    I am glad to see the 5770 produce playable frame rates at 1920x1200. Nice game selection also.
  • 5 Hide
    presidenteody , November 24, 2009 6:26 AM
    I don't care what this article says, when the 5870 or 5970 become available i am going to buy a few.
  • 0 Hide
    kartu , November 24, 2009 6:27 AM
    Well, at least in Germany 4870 costs quite a bit less (30-40 Euros) compared to 5770. It would take 2+ years of playing to compensate for it with lower power consumption.
  • -3 Hide
    kartu , November 24, 2009 6:30 AM
    "Power Consumption, Noise, And Temperature" charts are hard to comprehend. Show bars instead of numbers, maybe?
  • -3 Hide
    arkadi , November 24, 2009 7:08 AM
    Well, that puts things in perspective. I was really happy with the GTX 260 numbers, and I can push my EVGA card even higher easily. Too bad we didn't see the 5850 here; it looks like the optimal upgrade for gamers on a budget like myself. Great article overall.
  • 0 Hide
    B16CXHatch , November 24, 2009 7:08 AM
    I got lucky with my card. Before, I had a SuperClocked 8800GT from EVGA. A while back, I ordered a new EVGA GeForce GTX 275 (896MB), figuring the extra cash wasn't worth an overclocked model, particularly when I could do it myself. I get it, I try to register it. The S/N on mine was a duplicate. They sent me an unused S/N to register with. I then check the speeds under one utility and it's showing GTX 275 SuperClocked speeds, not regular speeds. I check 2 more utilities and they all report the same. I had paid for a regular model and received a mislabeled SuperClocked. Flippin' sweet.

    Now they also sell an SSC model which is overclocked even more. I used the EVGA Precision tool to set those speeds, and it gave me like 1 or 2 extra FPS in Crysis, and F.E.A.R. 2 already played so well without overclocking. So overclocking on these bad boys doesn't really do much. Oh well.

    One comment though, GTX 275's are HOT! Like, ridiculously hot. I open my window in 40 degree F weather and it'll still get warm in my room playing Team Fortress 2.
  • 3 Hide
    Anonymous , November 24, 2009 7:40 AM
    With the 5970 out there seems to be nothing else about graphic cards that interests me anymore :D  Its supposed to be the fastest card yet and beats Crysis too!
  • -3 Hide
    Anonymous , November 24, 2009 7:48 AM
    Excellent article [hindered by poor chart].
  • -8 Hide
    notty22 , November 24, 2009 10:21 AM
    The 5770 does not perform well. It's overpriced right now. All playable numbers, but the Nvidia cards spank it for the same money.
  • 0 Hide
    brisingamen , November 24, 2009 11:17 AM
    The 5770 has great overclocking potential with the stock cooler; with a good cooler, the numbers could be phenomenal, and in a CrossFire situation it would really be nice. Not to mention it is DirectX 11 and can do things both the 4890 and 275 cannot. Deals will be available on the 5770 sooner than on any of the higher models. I'm considering getting two and overclocking the shenanigans out of them. The 275 spanks nothing with its old tech; IQ matters.
  • 0 Hide
    sparky13 , November 24, 2009 11:32 AM
    I think a better 4890 to use instead of the MSI would be the Gigabyte 4890 OC model I have in my system right now. That MSI cooler is decent but the way it's secured to the GPU is just pitiful.

    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/24770-value-meets-performance-hd-4890-cards-gigabyte-msi-19.html

    The Gigabyte comes w/a Zalman cooler and is factory OC'd to 900 MHz. I pushed it to 975 MHz and it didn't break a sweat. Idle temps hover around 30-34 C. Under load it rarely breaks 52 C. The Zalman is a beast. It stays quiet too, barely audible under my TriCool LED fans on the low setting. I recommend it to anyone looking for a GPU in the 170.00 range.
  • 4 Hide
    scrumworks , November 24, 2009 12:00 PM
    falchard: "Benchmark suite is kind of gimped again. Every game selected except Fear 2 is designed to gimp ATI hardware. I guess its ok if you are comparing ATI to ATI, but when you say the GTX275 is a better buy over the HD4890 based on this review its completely biased. You didn't even come close to portraying the HD4890 in any sort of fair comparison."


    I haven't seen a single review from this author that wasn't somehow selectively Nvidia-biased. The Last Remnant, HAWX DX10.0, no HD 5870/HD 5970 are just quick examples. Reviewers should stay absolutely neutral in these matters and arrange proper conditions for all parties.

    I won't analyze the results any deeper, but it seems like Radeons don't perform quite as well here as they do in many other reviews.
  • 1 Hide
    cinergy , November 24, 2009 12:08 PM
    notty22: "The 5770 does not perform well. Its overpriced right now. All playable numbers, but the Nvdia cards spank it for the same money."


    I guess they are if you don't care about DX11 and lower power consumption readings. I think AMD can easily drop HD5x00 prices after supply starts exceeding demand.
  • 7 Hide
    cknobman , November 24, 2009 12:44 PM
    Nice article............. that is, if you're in the Nvidia camp. Gotta love your sponsors, right???

    Guess I need to go to Anand or TweakTown to get a non-Nvidia-biased review.

    Don't come back giving me some crap about how you can't help what games favor Nvidia, because you can...... don't include them in a damn review of ATI cards!!!!!! How come you just so happened to exclude every game that favors ATI???

    You're a tool and a fool, Kreiss!!
  • -4 Hide
    siliconchampion , November 24, 2009 1:56 PM
    Whoa, people, people!

    Perhaps instead of flaming the author over his choice of Nvidia-favoring titles, you could make some helpful suggestions of game titles you would like to see benchmarked...

    Personally, I would love to see some CoD4 and MW2 benchies, but that's just me.