GeForce GTX 570 Review: Hitting $349 With Nvidia's GF110

Conclusion

Our opinion of Nvidia's $349 GeForce GTX 570 changes depending on the frame of reference. Compared to Nvidia’s GeForce GTX 480—the short-lived flagship—the GeForce GTX 570 is a home run: it’s faster, cooler, quieter, and less expensive. Relative to the GeForce GTX 580, the 570 is positioned appropriately in both performance and price.

We know that the GTX 480 has already been dropped from Nvidia’s product lineup, and we have to imagine the company is anxious to discontinue anything still based on GF100. That means the 480 will need to drop more than $100 to even be a consideration; I can’t imagine value-seekers touching it anywhere north of $300. The 470 already saw its massive haircut ahead of AMD’s Radeon HD 6800-series launch, and at around $250, it’s priced fairly appropriately.

But then there’s the competition. A pair of Radeon HD 6850s can’t universally beat the GeForce GTX 580, but the duo certainly trades blows with the $500 card for a lot less money. That makes the extra $30 you’d pay for two 6850s over Nvidia’s GeForce GTX 570 worthwhile. At the same time, AMD shoots its own Radeon HD 5970 in the foot: if it’s not worth paying $500 for a GTX 580, then it’s not worth paying $500 (or more, since only one model sells for that price) for a 5970 when you can get these newer cards for $380.

Then again, this assumes CrossFire is an option for you. GeForce GTX 570 is perhaps most attractive to enthusiasts who don’t want to pay flagship prices, but only have room for a single dual-slot card. AMD doesn’t have anything that can compete in that price and performance range right now; you’re looking at either $500+ for the 5970 or $280 for the Radeon HD 5870. We’re still waiting on the Radeon HD 6900-series to see what ends up falling in between.

Do we recommend waiting for the Cayman-based parts? On principle, I want to say no. After all, we gave AMD a free pass in our GeForce GTX 580 review one month ago, recommending enthusiasts wait for the planned November 22nd launch before committing. After a confirmed delay and no new official word from AMD on when those parts will hit store shelves, the fairest advice would be to keep moving. But we have enough inside information to know when the cards should be arriving here in Bakersfield, and it still seems silly not to wait just a little bit longer. Unofficially, AMD is expected to make a pre-Christmas launch with shipping product, but availability is going to be tight, which means cards will sell out fast and probably be priced beyond the MSRP. This one has the potential to be a real circus, and we can’t say we envy anyone hoping to battle for a 6900-series board before the beginning of 2011.

The bottom line today is that Nvidia has a solid solution for anyone looking to run one high-performance graphics card. It incorporates the things we liked about GeForce GTX 580 for $150 less. AMD has a better value in two CrossFire’d Radeon HD 6850s, but you have to be willing to give up four expansion slots’ worth of space on a CrossFire-capable motherboard.

In response to the GeForce GTX 570, board partners still holding GTX 480 inventory will need to sweep that card under the rug. I’ve said it before: GF110 is what GF100 should have been. While Nvidia would undoubtedly have been in a better position with this graphics processor nine months ago, at least we have it now—and it’s undeniably a strong performer with significantly better thermal and acoustic properties than the GeForce GTX 480, all for $100 less.

The next two weeks will determine whether the two top GF110-based cards, GeForce GTX 580 and 570, arrived just in time to fill a hole left by AMD’s GPU lineup or just in time to get leapfrogged by the competition’s latest creation.

108 comments
  • Other Comments
  • thearm
    Grrrr... Every time I see these benchmarks, I'm hoping Nvidia has taken the lead. They'll come back. It's alllll a cycle.
    -9
  • xurwin
    at $350 beating the 6850 in xfire? i COULD say this would be a pretty good deal, but why no 6870 in xfire? but with a narrow margin and if you need cuda. this would be a pretty sweet deal, but i'd also wait for 6900's but for now. we have a winner?
    5
  • nevertell
    Yay, I got highlighted !
    32
  • verrul
    because 2 6850s is pretty equal in price to the 570
    5
  • sstym
    Quote (thearm):
    Grrrr... Every time I see these benchmarks, I'm hoping Nvidia has taken the lead. They'll come back. It's alllll a cycle.

    There is no need to root for either one. What you really want is a healthy and competitive Nvidia to drive prices down. With Intel shutting them off the chipset market and AMD beating them on their turf with the 5XXX cards, the future looked grim for NVidia.
    It looks like they still got it, and that's what counts for consumers. Let's leave fanboyism to 12 year old console owners.
    26
  • nevertell
    It's disappointing to see the freaky power/temperature parameters of the card when using two different displays. I was planing on using a display setup similar to that of the test, now I am in doubt.
    8
  • reggieray
    I always wonder why they use the overpriced Ultimate edition of Windows? I understand the 64 bit because of memory, that is what I bought but purchased the OEM home premium and saved some cash. For games the Ultimate does no extra value to them.
    Or am I missing something?
    -2
  • reggieray
    PS Excellent Review
    6
  • theholylancer
    hmmm more sexual innuendo today than usual, new GF there chris? :D

    EDIT:

    Love this gem:
    Quote:

    Before we shift away from HAWX 2 and onto another bit of laboratory drama, let me just say that Ubisoft’s mechanism for playing this game is perhaps the most invasive I’ve ever seen. If you’re going to require your customers to log in to a service every time they play a game, at least make that service somewhat responsive. Waiting a minute to authenticate over a 24 Mb/s connection is ridiculous, as is waiting another 45 seconds once the game shuts down for a sync. Ubi’s own version of Steam, this is not.


    When a reviewer of not your game, but of some hardware using your game comments on how bad it is for the DRM, you know it's time to not do that, or get your game else where warning.
    3
  • amk09
    Quote (nevertell):
    Yay, I got highlighted !

    So you gonna buy it? Huh huh huh?
    2
  • nevertell
    I was planning on doing so, but I didn't get enough money from the work I was doing, so I'll stick with just a new monitor. I will definitely get a new card during the next year, but not for now :( And by then, there might be new great cards out there.
    -1
  • lostandwandering
    Good looking performance numbers. Will be interesting to see what this does in the real world to NVidia's pricing of the GTX 400 series.
    1
  • phantomtrooper
    why is the dual-gpu 5970 in this comparison? why is the 6850 crossfire? u have two ati dual-gpu solutions, but no nvidia dual-gpu solutions. biased much?
    -20
  • SininStyle
    Wow, no sli 460s included? Yet you include 6850s in xfire? really? *facepalm* fail
    -5
  • darkchazz
    now I want gtx 560
    2
  • anacandor
    While the 5xx series is looking decent so far, it seems to me (pure speculation here) that Nvidia held back with this series and are possibly putting more resources into Kepler. I feel this because they aren't trying to kill AMD for market share, instead put up a perfectly resonable product that neither EXCELS vastly beyond last gen, but providing enough performance to justify a new product. That said i'm looking forward to their 2011 lineup.

    Also, it would have been interesting to see Metro 2033 tested with max instead of medium settings. All the cards are able to play medium at all resolutions with no AA... push them to their limits? :)

    Thoroughly enjoyable review though. Thanks, Chris!
    3
  • gxpbecker
    i LOVE seeing Nvidia and AMD trading blows back and forth. Keeps prices in check lol and gives more optiosn for buyers!!!
    4
  • kg2010
    You can see how the 460's in SLI did here vs the 580
    http://benchmarkreviews.com/index.php?option=com_content&task=view&id=614&Itemid=72

    But yeah, this review NEEDS 460's 1GB in SLI to be fair, as they are definitely an alternative to a 580, even a 570. There are quite a few cards at or below $199

    Dual Hawks for $320 AFTER MIR:
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814127518

    And these cards will overclock well.
    1
  • tronika
    Quote (ReggieRay):
    I always wonder why they use the overpriced Ultimate edition of Windows? I understand the 64 bit because of memory, that is what I bought but purchased the OEM home premium and saved some cash. For games the Ultimate does no extra value to them. Or am I missing something?

    noticed that too. i really can't think of any reason other than the language support for the Tom's engineers. 99% of the gamer market would be better off with home premium 64 bit. the other 1% that actually runs and maintains a domain in their house should get professional or the bloated "ultimate". i mean, who really uses bitlocker on their gaming machine anyway? great article though! i jumped on the 580 and haven't looked back. used to run on a 5850 but now that i've seen fermi in all of it's glory i'm really interested in nvidia's future "vision".
    0
  • cangelini
    Quote (theholylancer):
    hmmm more sexual innuendo today than usual, new GF there chris? EDIT: Love this gem: When a reviewer of not your game, but of some hardware using your game comments on how bad it is for the DRM, you know it's time to not do that, or get your game else where warning.

    Wait, innuendo? Where? :)
    1