
Radeon HD 5770 And 5750 Review: Gentlemen, Start Your HTPCs
By Chris Angelini

Conclusion
I see two types of buyers considering these cards.

First, there are the value-oriented enthusiasts who try to keep their systems updated once a year or so. They’d love the fastest technology, but know that flagships always carry the largest pricing premiums. These are the folks who kept an eye on our Best Graphics Cards for the Money column, and when Radeon HD 4870s hit $140, they bought (and got a killer deal, even by today’s standards). If you belong to that group and are looking at the Radeon HD 5700 series expecting a big step up in performance, even the 5770 is a disappointment. After all, if you already own a 4870 or 4890, that card is faster in today’s games.

Of course, there’s an X factor in play: ATI’s value-adds. Eyefinity, the ability to run three concurrent display outputs, is still completely unique, even at the high end. It’s particularly exciting at the $159 and $129 price points represented here. Likewise, the ability to bitstream Dolby TrueHD and DTS-HD MA was previously available only through $200+ sound cards; now you can get that functionality from a DirectX 11 graphics card. Both extras are compelling enough on their own to sell these cards to the folks able to exploit their benefits today. Tellingly, I’ve already put one card in my desktop workstation to drive a trio of monitors, and another into my HTPC, driving a 55” Samsung LED display.

How could you NOT want this on your desk?

The second group of folks is upgrading from older graphics technology, or perhaps even building a first system. They don’t have a good point of reference, so they’re seeing Radeon HD 5770/5750, Radeon HD 4870, and GeForce GTX 260 on the shelf next to each other for the first time. Available for $145 online, and with consistently better performance than the 5770, ATI’s Radeon HD 4870 remains a good buy. But paying an extra $15 for Eyefinity, bitstreaming, and the promise of DirectX 11 should really be a no-brainer.

What about the Radeon HD 5750 versus GeForce GTS 250 grudge match? Again, Nvidia seems to have the faster GPU, but again, it’d be short-sighted to pass up ATI’s value-adds at the same price point for a few frames per second. As an aside (and this is going to get me crucified in the comments), props to Nvidia for designing a GPU that could hang around for more than two years, continue to do battle against modern architectures in modern games, and come out ahead.

I think it’s our second-to-last page that’s the most telling here, though. Stepping from a 2.66 GHz Core i5-750 to the same chip running at 3.8 GHz makes almost no difference for gamers running at 1920x1200. If that means saving a few bucks on a less expensive CPU so you can step up to a Radeon HD 5850, that’s the move I’d most likely make.

Comments
  • 31
    Summer Leigh Castle, October 13, 2009 4:54 AM
    Can we BOLD or change the color of the card that's being reviewed?
  • 2
    masterjaw, October 13, 2009 5:12 AM
    Nice one, but the charts are a bit cluttered without any emphasis on the featured cards (bold fonts, etc.). A media card that can also handle games pretty well.

    I quite agree about Nvidia's G92 still hanging around, but looking at their newly released cards (GT 220, 210), I don't know what to say anymore. Hopefully they're making the right choices at the right time.
  • 0
    megamanx00, October 13, 2009 5:13 AM
    Looks to me like the 5770 really needs faster memory, and perhaps a higher core clock, though that would defeat the goal of making it cheaper. Perhaps we'll see some factory-overclocked cards with memory that can reach significantly higher speeds.
  • 1
    JohnnyLucky, October 13, 2009 5:17 AM
    Power consumption, temperature, and noise levels are very encouraging. I just finished reading other reviews where the 5700 cards are described as mid-level and mainstream cards.
  • 1
    buzznut, October 13, 2009 5:22 AM
    If I were building an HTPC today, I would still go with an HD 4670. Who knows about six months from now...
    Those other features are compelling. If I could afford two more monitors, that is.
  • 1
    cangelini, October 13, 2009 5:23 AM
    Quote:
    Summer Leigh Castle: Can we BOLD or change the color of the card that's being reviewed?

    For sure--I've looked into this and would be happy to implement it, but haven't had much luck. Any Excel gurus able to get only certain axis labels bolded without changing the entire series?
  • 1
    noob2222, October 13, 2009 5:28 AM
    Quote:
    and bitstreaming HD audio in an HTPC (a reason to buy a second card for the living room).


    Personally, I use my main computer as my HTPC. After all, I can't play games and watch movies in two different rooms at the same time, and all it takes is an HDMI cable (at least until they make it wireless).
  • 1
    cangelini, October 13, 2009 5:35 AM
    That works as well. But for someone with a triple-head setup *and* an HTPC, I can justify both usage models.
  • 9
    lashabane, October 13, 2009 5:52 AM
    I'm looking to upgrade from my dated 3850 and was thinking these would really impress me for the price. I think I'll just spend a bit extra and get the 5850 when prices come down.

    Of course, I wouldn't have been able to make such an informed decision so early if it weren't for TH and columnists such as yourself.

    Thanks for another great article Chris.
  • -4
    ambientmf, October 13, 2009 5:53 AM
    What's the benefit of DirectX 11 capability if these cards perform worse than last-gen cards in DX9/10 games? Being a gamer myself, I'd rather get a 4800-series card for slightly better framerates.
    I can see the other benefits for the hardcore HTPC crowd, though.
  • 2
    greglouganis, October 13, 2009 5:53 AM
    Question... Why are the power consumption values in comparison to the GTS 250 in this review so different from the ones posted here: http://www.tomshardware.com/reviews/geforce-gts-250,2172-10.html ? This 5770 review lists the system at load with a GTS 250 within a handful of watts of the system with an HD 4870 or GTX 260, while the older review (and many other sources) seems to suggest at least a 20-30 W gap.

    I'm running a GTS 250 1 GB on my PC just fine at the moment (but cutting it close), and I was under the impression that I would need to upgrade my power supply if I changed to anything more powerful than it or an HD 4850... The main reason I was so interested in this article was to see if a 5770 would be a worthwhile upgrade (I don't intend to replace my PSU anytime soon), but the data here seems to suggest I would be fine jumping up to a 5850!
  • 4
    DjEaZy, October 13, 2009 6:06 AM
    ... if it's ATi, give some credit to AMD and do an AMD-based machine too... pretty please?
  • 4
    deadlockedworld, October 13, 2009 6:22 AM
    I would add a third group of potential buyers: people looking for low power consumption, or seeking to maximize performance on a 400-450 W PSU.

    I would have liked to see the old 4850 in here too, even though it's similar to the 4770.
  • 0
    CoryInJapan, October 13, 2009 6:22 AM
    I got my 4870 OC'd to 4890 specs almost a month ago. I don't feel smug at all, because I got it for $112, open box, brand spankin' new, and it outperforms the 5750 and 5770, so I'm cool... for now...
  • 1
    cangelini, October 13, 2009 6:30 AM
    Quote:
    greglouganis: Why are the power consumption values in comparison to the GTS 250 in this review so different from the ones posted here: http://www.tomshardware.com/review [...] 72-10.html ? This 5770 review lists the system at load with a GTS 250 within a handful of watts of the system with an HD 4870 or GTX 260, while the older review (and many other sources) seems to suggest at least a 20-30 W gap. I'm running a GTS 250 1 GB on my PC just fine at the moment (but cutting it close), and I was under the impression that I would need to upgrade my power supply if I changed to anything more powerful than it or an HD 4850... The main reason I was so interested in this article was to see if a 5770 would be a worthwhile upgrade (I don't intend to replace my PSU anytime soon), but the data here seems to suggest I would be fine jumping up to a 5850!


    Greg, we switched testing methodology for power consumption earlier in the year--I suspect this is where the gap comes from.

    The GTS 250 has a maximum board power of 150 W. Given the 5850's revised board power of 151 W, I suspect you'd be in great shape if you upgraded to that one at some point without a power supply problem (so long as you have something in the 450 W range).
  • 0
    cangelini, October 13, 2009 6:30 AM
    Quote:
    lashabane: I'm looking to upgrade from my dated 3850 and was thinking these would really impress me for the price. I think I'll just spend a bit extra and get the 5850 when prices come down. Of course, I wouldn't have been able to make such an informed decision so early if it weren't for TH and columnists such as yourself. Thanks for another great article Chris.


    Thanks Lash--glad you enjoyed the story!
  • 0
    Proximon, October 13, 2009 6:45 AM
    I continue to be in awe of your conclusion writing skills, Chris. You always observe something interesting and useful.

    One thing I haven't seen mentioned... can you double up Eyefinity with two cards, for 6 monitors? We get traders on the forums regularly looking for ways to get 5 or 6 monitors on a budget.
  • 0
    cangelini, October 13, 2009 7:47 AM
    Thanks much Prox
  • 0
    Sihastru, October 13, 2009 7:58 AM
    Quote:
    Proximon: One thing I haven't seen mentioned... can you double up Eyefinity with two cards, for 6 monitors? We get traders on the forums regularly looking for ways to get 5 or 6 monitors on a budget.
    No, and not even in CrossFire, at least not in a way that combines the resolution... you could run three of them independently of the other three, but where's the fun in that? I think they're doing it on purpose to protect their upcoming six-Mini DisplayPort card, which should carry a nice price premium for that software "functionality".
  • 0
    randomizer, October 13, 2009 8:15 AM
    This is the only review I've seen that shows Batman with PhysX enabled (HardOCP was the only other site I found that used Batman in their review, but without PhysX). That ~15 FPS cap is very interesting. How did you go about enabling PhysX in this? Did you use the "hack" to run it on the CPU?