
Radeon HD 5770 And 5750 Review: Gentlemen, Start Your HTPCs

By Chris Angelini

The last 30 days have seen a ton of new technology, from Intel’s Lynnfield-based Core i5 and Core i7 processors (which we reviewed here, tested in a number of different games with CrossFire and SLI setups here, and measured the effect of integrated PCI Express 2.0 right here) to ATI’s Cypress graphics processor (manifest through the Radeon HD 5870 and Radeon HD 5850). Between those launch stories, I’ve run thousands of benchmark numbers and written tens of thousands of words. Thus, when I sat down to write this Radeon HD 5770/5750 review (after running another 500+ tests), I had to mix it up a bit and have a little fun with the intro. Feel free to read while listening to Biz Markie’s Just A Friend.

Have you ever seen a card that you wanted to buy?
Killer performance, but a price sky-high?
Let me tell you a story of my situation;
I game on PCs, forget Playstation.
The tech that I like is really high-end.
But I gotta get by with a couple Benjamins.
I upgrade once a year, whenever I can.
Processors, hard drives, graphics cards, RAM.
i7 looked great; I bought i5.
Now it’s time for new graphics; make my games look live.
I know of Nvidia; I know ATI.
So many boards between ‘em, makes me want to cry.
G92’s been around, and that’s a fact.
Couldn’t find 740; that launch was whack.
But I’ve pulled my wallet out and I’m ready to buy.
I want something new; no shrunken die.
Read Chris’ Cypress story; that card looked hot.
If I had four bones, it’d already be bought.

Come onnnnnn, I can’t even afford that.
I’m looking for something under $200, man.

And here’s where ATI chimes in…

We’ve…we’ve got what you need. And you say you have $160 to spend?
And you say you have $160 to spend? Oh gamer…
We’ve…we’ve got what you need. And you say you have $160 to spend?
And you say you have $160 to spend? Oh gamer…
We’ve…we’ve got what you need. And you say you have $160 to spend?
And you say you have $160 to spend?

Last Year’s Flagship Is This Year’s Mid-Range

Meet the Radeon HD 5770...

If the Radeon HD 5870 was characterized by roughly twice the computing resources of the Radeon HD 4870, then the Radeon HD 5770 represents a halving of the Radeon HD 5870. You’d think that’d yield something that looks a lot like the Radeon HD 4870 to which you’re already accustomed—and you’d be close to correct.

The Radeon HD 4870 is based on ATI’s 55nm RV770, sporting 956 million transistors on a 260 square millimeter die. It boasts 800 ALUs (shader processors), 40 texture units, a 256-bit memory interface armed with GDDR5 memory (cranking out 115.2 GB/s), and a depth/stencil rate of 64 pixels per clock.

...and the Radeon HD 5750

In contrast, ATI’s 40nm Juniper GPU is made up of 1.04 billion transistors. It also wields 800 shader processors, 40 texture units, and a depth/stencil rate of 64 pixels per clock. But its memory interface, being a halved version of Cypress’, is only 128 bits wide. Nevertheless, ATI arms it with GDDR5 memory able to move up to 76.8 GB/s.
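If you want to sanity-check those bandwidth figures, the arithmetic is simple: effective GDDR5 transfer rate times bus width in bits, divided by eight. Here's a minimal sketch in Python; the 3.6 GT/s and 4.8 GT/s effective rates are the stock memory speeds consistent with the figures above, not independently confirmed specs:

    # Peak memory bandwidth = effective transfer rate x bus width / 8.
    def peak_bandwidth_gbps(rate_mtps: float, bus_width_bits: int) -> float:
        """Return peak memory bandwidth in GB/s for a given rate (MT/s) and bus width (bits)."""
        return rate_mtps * bus_width_bits / 8 / 1000

    print(peak_bandwidth_gbps(3600, 256))  # Radeon HD 4870: 115.2 GB/s
    print(peak_bandwidth_gbps(4800, 128))  # Radeon HD 5770:  76.8 GB/s

Even with its faster GDDR5, halving the bus from 256 to 128 bits leaves Juniper with two-thirds of RV770's peak bandwidth.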

Right off the bat, we knew that this was going to be a very tough comparison—not only between ATI and Nvidia, but also between ATI and its own lineup of products. Yes, both of these new cards support DirectX 11. They both offer three digital display outputs split between DVI, HDMI, and DisplayPort connectors. And the pair can bitstream Dolby TrueHD and DTS-HD Master Audio from your home theater PC to a compatible receiver via HDMI 1.3.

But with specs that look roughly on par with the Radeon HD 4870 and Radeon HD 4770, anyone who recently purchased one of those previous-generation boards is bound to feel smug about the performance we see in this write-up—at least until DirectX 11 applications start emerging in greater numbers.

So, what’s the verdict? Is the Radeon HD 5770 worth paying $160 for amongst $145 Radeon HD 4870s? Is the 1GB Radeon HD 5750 worth its $129 price tag in comparison to the $120 Radeon HD 4770 (with 512MB) or even Nvidia’s GeForce GTS 250 at a similar price? Let’s dig into the speeds, feeds, numbers, and multimedia tests for more.

Comments
  • Summer Leigh Castle, October 13, 2009 4:54 AM
    Can we BOLD or change the color of the card that's being reviewed?
  • masterjaw, October 13, 2009 5:12 AM
    Nice one, but the charts are a bit cluttered without giving emphasis to the featured cards (bold fonts, etc.). A media card that can also game pretty well.

    I quite agree that Nvidia's G92 is still hanging around, but looking at their newly released cards (GT 220, 210), I don't know what to say anymore. Hopefully, they're making the right choices at the right time.
  • megamanx00, October 13, 2009 5:13 AM
    Looks to me like the 5770 really needs faster memory, and perhaps a higher core clock, though that would defeat the goal of making it cheaper. Perhaps we'll see some factory-overclocked cards with memory that can reach significantly higher speeds.
  • JohnnyLucky, October 13, 2009 5:17 AM
    Power consumption, temperature, and noise levels are very encouraging. I just finished reading other reviews where the 5700 cards are described as mid-level and mainstream cards.
  • buzznut, October 13, 2009 5:22 AM
    If I were building today (HTPC), I'd still go with an HD 4670. Who knows six months from now...
    Those other features are compelling. If I could afford two more monitors, that is.
  • cangelini, October 13, 2009 5:23 AM
    Quote:
    Summer Leigh Castle: Can we BOLD or change the color of the card that's being reviewed?

    For sure--I've looked into this and would be happy to implement it, but haven't had much luck. Any Excel gurus able to get only certain axis labels bolded without changing the entire series?
  • noob2222, October 13, 2009 5:28 AM
    Quote:
    and bitstreaming HD audio in an HTPC (a reason to buy a second card for the living room).


    Personally, I use my main computer as my HTPC; after all, I can't play games and watch movies in two different rooms at the same time, and all it takes is an HDMI cable (at least until they make it wireless).
  • cangelini, October 13, 2009 5:35 AM
    That works as well. But for someone with a triple-head setup *and* an HTPC, I can justify both usage models.
  • lashabane, October 13, 2009 5:52 AM
    I'm looking to upgrade from my dated 3850 and was thinking that these would really impress me for the price. I'm thinking I'll just spend a bit extra and get the 5850 when prices come down.

    Of course, I wouldn't have been able to make such an informed decision so early if it weren't for TH and columnists such as yourself.

    Thanks for another great article Chris.
  • ambientmf, October 13, 2009 5:53 AM
    What's the benefit of DirectX 11 capabilities if the cards perform worse than last-gen cards in DX9/10 games? Being a gamer myself, I'd rather get a 4800-series card for slightly better framerates.
    I can see the other benefits for the hardcore HTPC crowd though.
  • greglouganis, October 13, 2009 5:53 AM
    Question... Why are the power consumption values in comparison to the GTS 250 in this review so different from the ones posted here: http://www.tomshardware.com/reviews/geforce-gts-250,2172-10.html ? This 5770 review lists the system at load with GTS 250 within a handful of watts of the system with an HD 4870 or GTX 260, while the older review (and many other sources of information) seem to suggest at least 20-30 Watt gap.

    I'm running a GTS 250 1GB on my PC just fine at the moment (but cutting it close), and I was under the impression that I would need to upgrade my power supply as well if I changed to anything more powerful than it or an HD 4850... The main reason I was so interested in this article was to see if a 5770 would be a worthwhile upgrade (I don't intend to replace the PSU anytime soon), but the data here seems to suggest that I'd be fine jumping up to a 5850!
  • DjEaZy, October 13, 2009 6:06 AM
    ... if it's ATi, give some credit to AMD and do an AMD-based machine too... pretty please?
  • deadlockedworld, October 13, 2009 6:22 AM
    I would add a third group of potential buyers: people looking for low power consumption, or seeking to maximize performance on a 400-450W PSU.

    I would have liked to see the old 4850 in here too, even though it's similar to the 4770.
  • CoryInJapan, October 13, 2009 6:22 AM
    I got my 4870 OC'd to 4890 specs almost a month ago. I don't feel smug at all, because I got it for 112 bucks, open box, brand spankin' new, and it outperforms the 5750 and 5770, so I'm cool. ...for now...
  • cangelini, October 13, 2009 6:30 AM
    Quote:
    greglouganis: Why are the power consumption values in comparison to the GTS 250 in this review so different from the ones posted here: http://www.tomshardware.com/review [...] 72-10.html?

    Greg, we switched our power consumption testing methodology earlier in the year--I suspect that's where the gap comes from.

    The GTS 250 has a maximum board power of 150W. Given the 5850's revised board power of 151W, I suspect you'd be in great shape if you upgraded to that one at some point without a power supply problem (so long as you have something in the 450W range).
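    To put rough numbers on it, here's a quick back-of-the-envelope sketch in Python. The 150W and 151W board-power figures are the ones above; the ~200W rest-of-system draw and the 80% sustained-load ceiling are illustrative assumptions, not measurements:

        # Rough PSU headroom check. Board-power numbers come from the thread;
        # the rest-of-system draw and 80% sustained ceiling are assumptions.
        PSU_WATTS = 450          # hypothetical 450W power supply
        REST_OF_SYSTEM = 200     # assumed CPU + motherboard + drives under load (W)
        SUSTAINED_CEILING = 0.8  # avoid running a PSU above ~80% continuously

        for card, board_power in [("GeForce GTS 250", 150), ("Radeon HD 5850", 151)]:
            total = REST_OF_SYSTEM + board_power
            budget = PSU_WATTS * SUSTAINED_CEILING
            verdict = "OK" if total <= budget else "tight"
            print(f"{card}: ~{total}W draw vs. {budget:.0f}W budget -> {verdict}")

    Under those assumptions, both cards land just inside a 450W unit's comfortable operating range.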
  • cangelini, October 13, 2009 6:30 AM
    Quote:
    lashabane: Thanks for another great article Chris.

    Thanks Lash--glad you enjoyed the story!
  • Proximon, October 13, 2009 6:45 AM
    I continue to be in awe of your conclusion writing skills, Chris. You always observe something interesting and useful.

    One thing I haven't seen mentioned... can you double up Eyefinity with two cards, for 6 monitors? We get traders on the forums regularly looking for ways to get 5 or 6 monitors on a budget.
  • cangelini, October 13, 2009 7:47 AM
    Thanks much, Prox!
  • Sihastru, October 13, 2009 7:58 AM
    Quote:
    Proximon: Can you double up Eyefinity with two cards, for 6 monitors?

    No, and not even in CFX, at least not in a way that combines the resolution... you could run three of them independently of the other three, but where's the fun in that? I think they're doing it on purpose to protect their upcoming six-mini-DisplayPort card, which should carry a nice price premium for that software "functionality".
  • randomizer, October 13, 2009 8:15 AM
    This is the only review I've seen that shows Batman with PhysX enabled (HardOCP was the only other site I found that used Batman in their review, but without PhysX). That ~15FPS cap is very interesting. How did you go about enabling PhysX in this? Did you use the "hack" to run it on the CPU?