Nvidia Points Finger at AMD's Image Quality Cheat
Image quality is a somewhat subjective thing, but benchmark results are not. When comparing graphics card performance, you want as level a playing field as possible.
Nvidia is now pointing the finger at AMD, claiming that its competitor uses a different image quality setting that boosts benchmark results at the expense of image quality when compared to GeForce parts.
Nvidia wrote in its nTersect blog:
NVIDIA’s own driver team has verified specific behaviors in AMD’s drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, like the AF Tester tool uses, and they enable their optimizations for larger window sizes. The definition of “larger” and “smaller” varies depending on the API and hardware used. For example with DX10 and 68xx boards, it seems they disable optimizations with window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than RV870, with optimizations performed across a larger range of LODs for the RV840/940.
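To make the alleged behavior concrete, here is a minimal sketch of the decision rule NVIDIA describes, assuming the thresholds quoted above. Every name in it is hypothetical, and it is of course not actual AMD driver code:

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical sketch of the heuristic NVIDIA alleges; NOT actual AMD
 * driver code. All names are invented; the thresholds come from the
 * figures quoted above. */

typedef enum { API_DX9, API_DX10 } Api;

static bool filtering_optimizations_enabled(Api api, int width, int height)
{
    /* Per the quote: roughly 500 px per side for DX10 on 68xx boards,
     * roughly 1000 px per side for DX9 apps like the AF Tester. */
    int threshold = (api == API_DX10) ? 500 : 1000;

    /* Small windows (where IQ test tools usually run) would get
     * full-quality filtering; larger windows (typical games) would get
     * the performance optimizations. */
    return width >= threshold && height >= threshold;
}

int main(void)
{
    /* The AF Tester's small window falls below the threshold... */
    printf("AF Tester (256x256, DX9):  %d\n",
           filtering_optimizations_enabled(API_DX9, 256, 256));
    /* ...while a full-screen game does not. */
    printf("Game (1920x1080, DX10):    %d\n",
           filtering_optimizations_enabled(API_DX10, 1920, 1080));
    return 0;
}
```

Under such a rule, the tools reviewers use to inspect filtering quality would see full-quality output, while games and benchmarks at normal resolutions would run with the optimizations on.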
FP16 Render Observations
In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications where FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When recently exposed publicly, AMD finally provided a user-visible control panel setting to enable/disable it, but the demotion is enabled by default. Reviewers and users testing DX9 applications such as Need for Speed: Shift or Dawn of War 2 should uncheck the “Enable Surface Format Optimization” checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
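To see why the demotion helps performance, a quick back-of-the-envelope calculation (using an assumed 1080p render target as the example) shows the memory and bandwidth savings: an FP16 RGBA target uses 64 bits per pixel, while R11G11B10 uses 32.

```c
#include <stdio.h>

/* Illustrative comparison of the two render target formats mentioned
 * above. RGBA FP16 stores four 16-bit float channels (64 bits per pixel);
 * R11G11B10 packs three small floats into 32 bits per pixel, which is why
 * the demoted target is half the size (and drops alpha plus precision). */
int main(void)
{
    const int width = 1920, height = 1080;  /* assumed 1080p example */
    const long fp16_bytes      = (long)width * height * 8;  /* 64 bpp */
    const long r11g11b10_bytes = (long)width * height * 4;  /* 32 bpp */

    printf("FP16 target:      %ld bytes (%.1f MB)\n",
           fp16_bytes, fp16_bytes / (1024.0 * 1024.0));
    printf("R11G11B10 target: %ld bytes (%.1f MB)\n",
           r11g11b10_bytes, r11g11b10_bytes / (1024.0 * 1024.0));
    return 0;
}
```

Halving the footprint cuts fill-rate and bandwidth costs, which is where the extra frames per second come from; the trade-off is the lost alpha channel and reduced precision.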
A Long and Winding Road
For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimizations gone bad, and vowed never again to perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance but did not alter IQ was in fact a valid “optimization”, and any optimization that improved performance but lowered IQ without letting the user know was a “cheat”. Special-casing of testing tools should also be considered a “cheat”.
I'm no fanboi and I like whatever is the best bang for the buck... which Nvidia has been winning lately (it seems)... but Nvidia's deceptive business decisions/moves REALLY turn me off more than ANYTHING AMD could ever do.
It's just a desperate cry because nVidia has lost a lot of ground since ATI/AMD released the 5xxx series.
And who in their right mind plays anything DX10 on a screen with fewer than 500 pixels on a side... And if you have the balls to buy a 68xx card, you most likely have a screen with more than 1000 pixels per side...
Lol, whenever Nvidia throws something at AMD, AMD always seems to throw something harder back at Nvidia... waiting to see the response.
If you care about the end result, then you should care about this. It is essentially lying to reviewers such as Tom's Hardware, who test the products and publish benchmark results to help buyers choose which card to go with.
What AMD is doing is lying in benchmarks to appear better. So when the benchmark says you get 5 FPS more than an Nvidia card, you would actually be getting the same FPS in a real situation. But if you are told that a certain card is better, which one are you going to buy? These image quality settings are only applied in benchmarks, which is straight-up wrong.
So far they have been busted cheating on 3DMark benchmarks, AA/AF filtering, thermal ratings for their GPUs, game benchmarks, and pretty much anything else they could get away with.
If NVidia is whining, then you can pretty much assume the following:
Their yields are low on their current line of GPUs, profitability is down as a result, and competitively they are being aced in a number of market niches.
These currently include:
Discrete low end
Chipsets (LOL)
Mid range (they don't have the same range currently available)
High end ... did I mention yields?
http://www.theinquirer.net/inquirer/news/1048824/nvidia-cheats-3dmark-177-drivers
http://www.geek.com/articles/games/nvidia-still-cheating-even-with-latest-3dmark-build-2003069/
http://www.nvnews.net/vbulletin/showthread.php?t=11826
He called me a name!
I have been a very content Nvidia owner in the past, but their PR is reaching Republican Party election campaign lows.
Quite untrue. They actually relate to real-world improvements in FPS performance when running a program in a window. It's actually quite important for some people who cannot run full-screen due to other processes going on. That testers don't specify that it's "artificially" improved windowed performance is where the real mistake is; they should run both types of tests.
Because benchmarking always tells the whole story...
Dude, it's been ages since I've seen a review talking about image quality. AGES. Tom's hardly does it nowadays (remember those gifs and high res jpegs comparing screenshots?).
Anyway, like yebornah said, CCC has a slider for texture and mipmapping quality (was it mipmapping? XD). I like it so much better than nVidia's complicated way of doing things. Have you seen the profiling in nVidia's drivers? Gosh! KISS, nVidia, KISS.
Cheers!
It's like saying, look, our car averages 120 mph while the competition only averages 110 mph... without telling you that your test ran on a straight road while the competition's car had to run on a mixed road with extra turns.
This IS wrong - negative one point to ATI for this!
Meh, whatever... both nVidia and ATI/AMD are making kick-a$s GPUs nowadays, and I'm not sure why nVidia is resorting to these attack tactics.