
The BS Of Benchmarking

GeForce GTX 570 Review: Hitting $349 With Nvidia's GF110

There is a lot of jockeying for position that goes on behind the scenes in the graphics business. This isn’t new; it’s been going on for years. Things really escalated ahead of the Radeon HD 6800-series launch, though. Nvidia was playing around with pricing and individual overclocked cards, AMD tried to discredit upcoming discounts on GeForce GTX 400-series boards, Nvidia pushed the HAWX 2 demo super hard before the game was available to play, and, apparently, AMD started making some changes to its drivers.

As a general rule, I try to remove myself from anything that doesn’t directly impact the way you and I both play games. After all, do the results of a game that hasn’t launched yet mean anything when they come from a demo? Does it matter if price cuts are temporary if you can realize the value of a discount today? So much of the stupid crap that goes on doesn’t matter, at the end of the day.

An Issue Of Image Quality

Well, a while back, Nvidia approached me about an optimization that had found its way into AMD's driver: demoting FP16 render targets to a lower-precision surface format, which measurably improves performance. But after talking to both parties, there were three reasons I had a hard time getting too worked up: 1) the optimization affected a handful of games that nobody plays anymore, 2) identifying the quality difference required diffing the images; in other words, the changes were so slight that you had to use an application to compare the per-pixel differences, and 3) most important, AMD offered a check box in its driver to disable Catalyst AI, the mechanism by which the surface format optimization was enabled.
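If you're curious what "diffing the images" actually entails, here's a minimal sketch of the idea using Python and Pillow. It's my own illustration, not the tool either vendor uses, and the file names are hypothetical stand-ins for two captures of the same frame taken with the optimization on and off.

```python
# A minimal per-pixel screenshot comparison, assuming two same-sized captures
# of the same frame taken with different driver settings. The file names are
# hypothetical; this is an illustration, not AMD's or Nvidia's tooling.
from PIL import Image, ImageChops

reference = Image.open("cat_ai_disabled.png").convert("RGB")
optimized = Image.open("cat_ai_enabled.png").convert("RGB")

# Absolute per-channel difference; identical pixels come out black.
diff = ImageChops.difference(reference, optimized)

print("Differing region:", diff.getbbox())          # None means the frames match exactly
print("Per-channel (min, max) delta:", diff.getextrema())

# Deltas this small are invisible at 1:1, so amplify them before saving.
amplified = Image.eval(diff, lambda value: min(255, value * 16))
amplified.save("diff_amplified.png")
```

The amplification step is the telling part: without it, differences this slight simply don't register to the eye, which is exactly the point.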

Catalyst 10.10: Here's the option to disable Catalyst AI explicitly.

Now, you could say that the principle of changing the image from what the developer intended, in and of itself, is reason to take up arms against AMD. Indeed, altering image quality is a slippery slope, and what I might find inoffensive might immediately be noticeable to you. That just didn’t seem like a battle worth fighting, though.

And We’re Sliding…

But maybe it was, after all. In its most recent Catalyst 10.10e hotfix release, AMD removed the option to disable Catalyst AI. Instead, there's now an option to turn that specific Surface Format Optimization on or off. The Catalyst AI texture filtering slider only offers Performance, Quality, and High Quality settings, though, with no explanation of what's being tuned. Without the option to turn off Catalyst AI altogether, I was less comfortable with optimizations AMD might make in the background for the sake of performance (this works itself out in the end; keep reading).

What’s more, the change’s timing was suspect. I didn’t have time to dig too far into Nvidia’s recent accusation that AMD is tinkering with texture filtering quality in its newest drivers, mostly impacting the Radeon HD 6800-series cards. However, a number of German Web sites announced that they’d disable Catalyst AI entirely for testing Radeon HD 5800-series cards in order to achieve comparable image quality. And then the option to disable Catalyst AI disappears entirely? Things that make you go hmmm…

Catalyst 10.10e hotfix: The option to disable Cat AI isn't there, but High Quality does the same thing.

Perhaps even that wouldn’t have been so bad. But then AMD holds a conference call with the German Web sites that rode its back over quality concerns first brought to light after Radeon HD 6850 and 6870 launched. Fortunately, translating German is something we do pretty damn well (special thanks to Benjamin Kraft at Tom’s Hardware DE).

The recounting of the meeting published by ht4u.net said that the new High Quality setting in Catalyst AI purportedly disables all optimizations, akin to what the Disable Catalyst AI option did in the past, while the Quality option strikes a balance between what AMD considers performance and image quality. In other words, the company says it is not purposely doing anything in software that would diminish image quality. We asked AMD for comment on the new slider, and it confirmed that setting High Quality indeed turns off optimizations, addressing the first concern we brought up.

At the same time, though, AMD acknowledged to the Germans an image quality issue causing the shimmering originally reported by those sites on 6800-series cards, even with High Quality enabled. The conclusion the site drew was that it'd take a hardware revision to correct the issue. AMD tells us it can fix the problem via drivers, and that the fix will involve blurring the affected textures so they don't shimmer.
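AMD didn't describe how that driver fix would work, but the general principle behind "blurring the textures" is straightforward: shimmering is texture aliasing, and biasing sampling toward blurrier mipmap levels trades a little sharpness for temporal stability. The sketch below only illustrates that trade-off with made-up numbers; it is not AMD's driver code.

```python
import math

def mip_level(texels_per_pixel: float, lod_bias: float = 0.0) -> float:
    """Standard mipmap selection: log2 of the screen-space texel footprint,
    plus an optional bias. A positive bias selects blurrier mip levels."""
    return max(0.0, math.log2(max(texels_per_pixel, 1e-6)) + lod_bias)

# Hypothetical case: a surface at a grazing angle covering ~6 texels per pixel.
footprint = 6.0
print(round(mip_level(footprint), 2))        # ~2.58: sharper, but prone to shimmering in motion
print(round(mip_level(footprint, 0.5), 2))   # ~3.08: slightly blurrier, temporally more stable
```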

Much Ado?

With all of that said, I went through seven different modern DirectX 11 titles looking for problematic areas that'd make for easy demonstrations and, despite knowing about the issues and squinting at a 30" screen, came away with very little that was conclusive in the real world. These are subtle differences we're talking about here.

Representatives at Nvidia did shoot over a handful of different videos comparing GeForce GTX 570 to Radeon HD 5870 and Radeon HD 6870 using Crysis Warhead. In them, the shimmering is painfully clear. Additionally, we were directed to a multiplayer map in Battlefield: Bad Company 2 that’d show it. But running around the single-player campaign didn’t turn up anything blatant.

Moving forward, I'm inclined to use AMD's High Quality Catalyst AI setting for graphics card reviews, if only to get the best possible filtering quality. Today, though, Nvidia is going to have to compete against the Quality slider position, because it sent its GeForce GTX 570 with far too little time to retest everything we ran for the GeForce GTX 580 review less than a month ago. For the time being, we're going to have to leave everything at default and point out the observed and confirmed image quality issue currently affecting Radeon HD 6800-series cards. This may or may not become a factor in your buying decision, but right now, the bottom line is that Nvidia offers better texture filtering, whether or not you're one of the folks who can appreciate it.

Why do we care if the difference is so nuanced? Because we have to take any deviation in image quality seriously; we don't want to see Nvidia drop its own texture filtering defaults to help boost performance (yes, Nvidia has its own set of filtering-based quality settings it could toy with as well). Hopefully, that sort of tit-for-tat isn't on the table in Santa Clara.
