ATI caught cheating in Crysis to outperform GTX480

From what I'm getting out of Google Translate, I think the article is saying that ATI is using a jury-rigged version of AA/MSAA/etc. in Crysis, while the GTX 480 is using the actual AA, which looks much better...?
 

roofus

Distinguished
If there is any truth to the claim, the news will make its way around. Nvidia would make sure it did for their own sake. I have my doubts though.
 

Or they are just sitting back and giggling their socks off at the idea of the competition having to skew the results when faced with what is effectively the "broken" version of the GF100s. [:mousemonkey]
 

notty22

Distinguished


Exactly. If you look at Anand's launch review of the 5870 and the new numbers generated when compared with the GTX 480, many are within a frame or three, and that's after six driver revisions! They are able to claim (without lying) a 5 or 7% increase in the driver notes, as long as it holds at some unspecified resolution and quality setting. The next month they tweak for less AA and a higher resolution and can magically claim a new 5% increase. I'm not pointing fingers at either side, but if you added up these percentages across every driver release, the total would not make sense.
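A quick sketch of how those claimed gains would compound if taken at face value (the 5% per release and six releases are hypothetical figures from the point above, not measured data):

# Hypothetical illustration: compounding a claimed per-release driver gain.
claimed_gain_per_release = 0.05  # assumed 5% claim in each set of driver notes
releases = 6                     # assumed number of driver revisions since launch

cumulative = (1 + claimed_gain_per_release) ** releases - 1
print(f"Implied cumulative gain: {cumulative:.0%}")  # ~34%, far more than the benchmarks show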
 




:lol: That's why I love taking a poke at that little mantra whenever it gets trotted out, which (in a spooky kind of way) seems to coincide with every release of a new card from ATi.
 

randomizer

Champion
Moderator
I think the issue is more with the AF than the AA. Judging from the images nearer the bottom, it looks like some sort of bilinear filtering is being used. But yeah, translators don't do much for me on this page, so I didn't run it through one. There's some discussion on it at XS as well, which is where I got the link from: http://www.xtremesystems.org/forums/showthread.php?t=248755

I am not sure why people say they can't see a difference. It's small but it is there, and it will have some impact on framerate.
 

notty22

Distinguished
It takes some very 'smart', in-the-know people to catch this stuff, that's why.
Long story short, it allows the GPU to do less work, which increases performance. The best explanation seems to be this post:
http://www.xtremesystems.org/forums/showpost.php?p=4320571&postcount=42
For all the people saying there is no difference so it's not an issue: if Nvidia did this, everyone would go on and on about cheating...

Bilinear filtering does 4 texel lookups per pixel; trilinear does 8 (4 from each of two adjacent mip levels) and then linearly interpolates between the two results. Anisotropic filtering uses the view angle to vary the sampling range of the texels: each anisotropic level enlarges the sampling range (non-uniformly), so more texels are sampled, and trilinear filtering is then applied over the new ranges. This allows textures to look smooth even when moving through a scene, and as a result it heavily anti-aliases textures and reduces pixel swimming, but it comes at a heavy cost.

For example, in a standard 1680x1050 scene (about 1.76 million pixels), the texel lookup counts are roughly as follows:

bilinear: ~7.1 million texel lookups
trilinear: ~14.1 million texel lookups
anisotropic: potentially more than 4x the texels of trilinear, depending on the AF level


In a still image there is no problem; the major filtering artifacts are movement-based, such as moiré patterns, pixel swimming, and mip-level pops, so static images will never really show the full picture when it comes to filtering. I can't believe ATI would do this!
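Here's a minimal sketch of that texel-count arithmetic (assuming the standard 4 and 8 texels per pixel for bilinear and trilinear; the 4x anisotropic multiplier is just an assumption for illustration):

# Back-of-the-envelope texel-lookup counts per frame at 1680x1050.
WIDTH, HEIGHT = 1680, 1050
pixels = WIDTH * HEIGHT  # 1,764,000 pixels per frame

texels_per_pixel = {
    "bilinear": 4,         # 2x2 sample from a single mip level
    "trilinear": 8,        # 2x2 from each of two adjacent mip levels
    "anisotropic": 8 * 4,  # assumed: several trilinear-style probes along the axis of anisotropy
}

for mode, texels in texels_per_pixel.items():
    print(f"{mode:>11}: ~{pixels * texels / 1e6:.1f} million texel lookups")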

I'd love it if someone ran the 10.3s and the previous versions and checked whether there is a difference; that nice 10% increase across the board that the new drivers bring might be partly explained by this.
 
