Nvidia Points Finger at AMD's Image Quality Cheat

Image quality is a somewhat subjective thing, but benchmark results are not. When comparing graphics card performance, you want the playing field to be as level as possible.

Nvidia is now pointing the finger at AMD, alleging that its competitor defaults to a different image quality setting — one that boosts benchmark results at the expense of image quality compared to GeForce parts.

Nvidia wrote on its nTersect blog:

NVIDIA’s own driver team has verified specific behaviors in AMD’s drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, like the AF Tester tool uses, and they enable their optimizations for larger window sizes. The definition of “larger” and “smaller” varies depending on the API and hardware used. For example with DX10 and 68xx boards, it seems they disable optimizations with window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than RV870, with optimizations performed across a larger range of LODs for the RV840/940.

FP16 Render Observations
In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications where FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When recently exposed publically, AMD finally provided a user visible control panel setting to enable/disable, but the demotion is enabled by default.  Reviewers and users testing DX9 applications such as Need for Speed Shift or Dawn of War 2, should uncheck the “Enable Surface Format Optimization” checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing. 

A Long and Winding Road
For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality.  During that time, the industry agreed that any optimization that improved performance, but did not alter IQ, was in fact a valid “optimization”, and any optimization that improved performance but lowered IQ, without letting the user know, was a “cheat”.  Special-casing of testing tools should also be considered a “cheat”.
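The window-size behavior Nvidia describes above can be sketched as a simple model. This is purely illustrative — not actual driver code — with the thresholds taken from the figures quoted in the post (roughly 500 pixels per side for DX10 on 68xx boards, roughly 1000 for DX9 apps like the AF Tester):

```python
# Illustrative model of the reported behavior: texture-filtering
# optimizations are said to be disabled when the window is smaller
# than an API-dependent threshold, which would make small test-tool
# windows (like the AF Tester's) show unoptimized, higher-quality output.

THRESHOLDS = {
    "DX10": 500,   # reported for DX10 on 68xx boards
    "DX9": 1000,   # reported for DX9 apps such as the AF Tester
}

def optimizations_enabled(api: str, width: int, height: int) -> bool:
    """True if filtering optimizations would be active for a window
    of this size, per the thresholds reported in Nvidia's post."""
    return min(width, height) >= THRESHOLDS[api]

# A small AF Tester-style window falls below the DX9 limit, so the
# optimizations (and their image-quality cost) would be switched off:
print(optimizations_enabled("DX9", 512, 512))    # False
print(optimizations_enabled("DX9", 1920, 1080))  # True
print(optimizations_enabled("DX10", 640, 480))   # False
```

If accurate, this is exactly the "special-casing of testing tools" the post calls a cheat: the tool sees full-quality filtering while full-screen games run with the optimizations on.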

Read the full post here.
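For context on the FP16 demotion mentioned in the quote, a quick back-of-the-envelope comparison shows why it helps benchmark numbers. The bit widths below follow the standard format definitions (an FP16 render target stores four 16-bit half floats per pixel; R11G11B10 packs three small unsigned floats into a single 32-bit word, with no alpha channel and fewer mantissa bits per channel):

```python
# Per-pixel storage of the two render-target formats: R11G11B10 is
# half the size of FP16 RGBA, which halves framebuffer bandwidth and
# memory use at the cost of precision and the alpha channel.

FP16_RGBA_BITS = 4 * 16          # 64 bits per pixel
R11G11B10_BITS = 11 + 11 + 10    # 32 bits per pixel

def framebuffer_megabytes(width: int, height: int, bits_per_pixel: int) -> float:
    """Size of a single render target of the given dimensions, in MB."""
    return width * height * bits_per_pixel / 8 / 1024 / 1024

w, h = 1920, 1200  # a common review resolution of the era
print(framebuffer_megabytes(w, h, FP16_RGBA_BITS))  # ~17.6 MB
print(framebuffer_megabytes(w, h, R11G11B10_BITS))  # ~8.8 MB
```

Halving every FP16 render target is a real bandwidth win, which is why the post argues it should be disabled (via the "Enable Surface Format Optimization" checkbox) for apples-to-apples benchmarking.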

    Top Comments
  • chaoski
    And Nvidia pays MILLIONS to developers to make their cards perform better in games....until the game comes out and AMD update drivers.

    I'm no fanboi and like whatever is best bang for the buck....which Nvidia has been winning lately (it seems)....but Nvidias deceptive business decisions/moves REALLY turn me off more than ANYTHING AMD can ever do.
    32
  • Anonymous
    Anti-Aliasing QC by nVidia is the biggest cheat. It reduces aliasing but makes the textures way blurrier.
    It's just a desperate cry because nVidia lost a lot since Ati/AMD released 5xxx series
    32
  • anacandor
    Nvidia should be spending their time improving their own products, not delving into AMD's "cheat" drivers. Nvidia is turning the GPU market into some pre election ad campaign...
    32
  • Other Comments
  • Stifle
    Haters gonna hate.
    20
  • fstarnella
    Who cares?? I just care about the end result. Whatever happens under the hood is fine by me.
    -23