Will GTX570 be caught in new app detection cheat?

November 28, 2010 4:54:29 AM

More fuel to the ATI/Nvidia fire:

"Over the years, there have been many famous ‘app detection’ revelations. Probably the most famous was revealed by Beyond3D when genius uber-geeks discovered that changing the name of the 3DMark 2003 executable file created a huge difference for the performance of nVidia cards. Looking at recent documents posted into the KitGuru forum, we have another cause for investigation on our hands. If it’s a lie, then it is a very clever one, and we will work hard to find out who perpetrated it. If it’s the truth, then it’s a very worrying development. KitGuru powers up a robotic Watson and takes it for a walk on the Image Quality moors to see if we can uncover any truth implicating a modern-day Moriarty (of either persuasion)."

http://www.kitguru.net/components/graphic-cards/jules/w...
November 28, 2010 6:13:03 AM

Did you read the comments? Not sure if that's what's really going on, but there probably isn't much truth to this story.
November 28, 2010 6:58:17 AM

Whether it's Nvidia or AMD, if the only way to show the difference in image quality is by magnifying the crap out of a still image (or the use of some sort of AF measuring software), it won't bother me, unless the difference is pretty obvious.

Whether you like it or not, drivers detect game executables for optimization. I've never been a fan of the whole "change the exe name to this and that" business.
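
For anyone wondering what "detecting game executables" actually looks like, here is a minimal, purely illustrative sketch of the idea: the driver keeps a table of per-executable profiles and falls back to plain defaults when the name isn't recognized, which is exactly why renaming the exe defeats the tweaks. The structure, the profile fields, and the entries below are hypothetical, not real driver code.

Code:
// Illustrative only: the general idea behind per-application driver profiles.
// Entries and fields are made up for the example.
#include <map>
#include <string>

struct AppProfile {
    bool overrideAAQuality;    // e.g. force standard 4xAA instead of a CSAA mode
    bool reduceFilteringWork;  // e.g. cheaper texture filtering
};

AppProfile LookupProfile(const std::string& exeName)
{
    static const std::map<std::string, AppProfile> profiles = {
        {"HawX.exe",     {true,  false}},   // hypothetical entry
        {"SomeGame.exe", {false, true}},    // hypothetical entry
    };
    auto it = profiles.find(exeName);
    if (it != profiles.end())
        return it->second;                  // known game: apply its tweaks
    return AppProfile{false, false};        // renamed/unknown exe: plain defaults
}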
Anonymous
November 28, 2010 7:43:47 AM

KitGuru got an official reply from nVidia saying that the screen grabs and performance increases are accurate and the result of application detection, but the full story is more interesting. Not sure if you can post URLs here, so I'll just say that the official reply from nVidia's Nick Stam is on the KitGuru.net home page.
November 28, 2010 8:25:12 AM

Quote:
if the only way to show the difference in image quality is by magnifying the crap out of a still image (or the use of some sort of AF measuring software), it won't bother me.


The problem comes from games that aren't optimized. If you buy card X over card Y because at the settings you use card X gets 40 FPS while card Y gets 35, then you made a good buy. But if card X has to use special settings that don't match card Y, and is in fact the slower card, then you have an issue when playing a game the drivers don't know about (like a new one). In this case card X might get only 30, and now card Y is the better deal (FPS > 30).

We have had this kind of thing happen before. It's never a good thing for the consumer. If you're going to be playing at X settings, then you need to have X settings happen. Not X.5, not Y. Both sides need to do this.
November 28, 2010 10:10:25 AM

4745454b said:
The problem comes from games that aren't optimized. If you buy card X over card Y because at the settings you use card X gets 40 FPS while card Y gets 35, then you made a good buy. But if card X has to use special settings that don't match card Y, and is in fact the slower card, then you have an issue when playing a game the drivers don't know about (like a new one). In this case card X might get only 30, and now card Y is the better deal (FPS > 30).


There will always be subtle differences between two different cards in how an image is rendered.

I had a 4870 before I got my old GTX 260. Imagine the outrage I caused on these very same message boards when I pinpointed the badly filtered textures in Crysis while criticizing my 4870, the stuttering and all. But there are games where it shone against my GTX 260, especially racing games, ahem, Race Driver: GRID, ahem.

In any case, some games will do better with Nvidia, some will do better with ATI. This incident, as well as the AMD/ATI one, is nothing but fanboy e-peen getting bent. Nvidia just took the opportunity to burn AMD, and I'm waiting to see how AMD will react to this one.

And if I decide to buy HAWX 2 or fire up the old HAWX, I'll make sure to up the AA ante a little bit, even though it's pointless, as 4xAA @ 1080p removes aliasing artifacts almost completely regardless of brand.
November 28, 2010 12:08:30 PM

I'm not talking about when a game engine prefers/performs better on one architecture compared to the other. I'm talking about when a company "tweaks" (I believe Nvidia prefers the term "optimizes") its driver so that a game performs better than it should. While the most famous example of this might be Nvidia with the 3DMark 03 program, it's happened before. Here is an article about Nvidia doing it with Crysis.

http://www.elitebastards.com/index.php?option=com_conte...

It's not good when either company does it. And as that article shows, it's not always the case that you have to blow something up to 20x its normal size.
November 28, 2010 12:48:06 PM

4745454b said:
Did you read the comments? Not sure if that's what's really going on, but there probably isn't much truth to this story.


I posted it more as an FYI for people's curiosity, not as "OMG, look what Nvidia may be up to!"

Here is the reply from Nvidia, copied from KitGuru's comment section; it seems to be a bug with the game.


Hi Everybody,

What is being experienced is not an “Antialiasing cheat” but rather a HawX bug that is fixed by our driver using an application specific profile.

In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.

You may remember that Geforce 8800 introduced Coverage Sampling AA (CSAA) technology, which added higher quality AA using little extra storage. Prior to 8800 GTX and CSAA, there was only one “sample quality level” for each AA level, so if an application requested four AA samples, the hardware performed standard 4xAA. However, with 8800 GTX GPUs onwards, our drivers expose additional sample quality levels for various standard AA levels which correspond to our CSAA modes at a given standard AA level.

The “sample quality level” feature was the outcome of discussions with Microsoft and game developers. It allowed CSAA to be exposed in the current DirectX framework without major changes. Game developers would be able to take advantage of CSAA with minor tweaks in their code.

Unfortunately, HawX requests the highest quality level for 4xAA, but does not give the user the explicit ability to set CSAA levels in their control panel. Without the driver profile fix, 16xCSAA is applied instead of standard 4xAA. Recall that 16xCSAA uses 4 color/Z samples like 4xAA, but also adds 12 coverage samples. (You can read more about CSAA in our GeForce 8800 Tech Briefs on our Website).

When you rename the HawX.exe to something else, the driver profile bits are ignored, and 16xCSAA is used. Thus the modest performance slowdown and higher quality AA as shown in the images.

To use “standard” 4xAA in a renamed HawX executable, you should select any level of anti-aliasing in the game, then go into the NVIDIA Control Panel and set 4xAA for “Antialiasing Setting” and turn on “Enhance the application setting” for the “Antialiasing mode”.

Nick Stam, NVIDIA
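
To make the mechanism Nick Stam describes a bit more concrete, here is a rough sketch of how a Direct3D 10/11-style renderer picks its multisample settings, and how asking for the highest reported "quality level" at 4 samples lands you in a CSAA mode on GeForce 8-series and later hardware. This is illustrative only and assumes a D3D11 device; it is not HawX's actual code.

Code:
// Sketch: choosing a render-target/swap-chain sample description in D3D11.
#include <d3d11.h>

DXGI_SAMPLE_DESC PickSampleDesc(ID3D11Device* device, UINT requestedSamples)
{
    UINT qualityLevels = 0;
    device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM,
                                          requestedSamples, &qualityLevels);

    DXGI_SAMPLE_DESC desc = {};
    desc.Count = requestedSamples;  // 4 when the user selects "4xAA" in-game

    // The pattern described above: requesting the *highest* available quality
    // level. On GeForce 8800-class and newer GPUs, the non-zero quality levels
    // for Count = 4 expose CSAA modes, so this silently selects 16xCSAA rather
    // than plain 4x MSAA.
    desc.Quality = (qualityLevels > 0) ? qualityLevels - 1 : 0;

    // Standard 4xAA (what the driver profile fix effectively restores) would
    // instead be: desc.Quality = 0;
    return desc;
}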
November 28, 2010 5:07:21 PM

I read that, but I'm not totally sure I believe it. Nvidia wants us to believe that HawX by default picks the highest level of AA? Can anyone name any other game that auto-applies the highest level of AA?

Anyone?
November 28, 2010 5:22:28 PM

The question then becomes: does this happen with AMD cards? If so, then that fix WILL mean that the nVidia card is cheating. Also, have we seen the developers comment on this bug?

That said, I actually don't care all that much.
November 28, 2010 5:46:53 PM

I only complain when games are either extremely slow while others are just fine, or are unstable, meaning they crash. Overall, if it is a big problem then just keep one card from each brand on hand <.<

To get back on topic, this isn't new if they are cheating.
November 28, 2010 7:50:39 PM

Whoa, I had no idea these companies could go that way... nobody lies, you know? *insert sarcasm*

It doesn't seem like a big deal for end users though.
November 28, 2010 11:41:47 PM

Quote:
Whoa, I had no idea these companies could go that way.


Seriously? You must be very new to computers then. Both companies have a long history of doing whatever they can to seem better than the other. The fanboy in me says Nvidia is worse about doing things.

Quote:
It doesn't seem like a big deal for end users though.


And as long as you sheeple think that way, the companies will continue to do what they can.
November 29, 2010 12:00:46 AM

4745454b said:
Quote:
Whoa, I had no idea these companies could go that way.


Seriously? You must be very new to computers then. Both companies have a long history of doing whatever they can to seem better than the other. The fanboy in me says Nvidia is worse about doing things.

Quote:
It doesn't seem like a big deal for end users though.


And as long as you sheeple think that way, the companies will continue to do what they can.



I think the same way about governments around the world. If people would for one moment get their heads out of their asses and stop watching TV, they would begin to notice what has been going on for years in plain sight. Take notice of WikiLeaks, and how the media is already starting to quiet down about North Korea even though Japan, China, and the US are getting sucked into the whole mess, let alone the potential of another war that will sink the whole system. CTD, circling the drain.
November 29, 2010 1:24:42 AM

4745454b said:
Quote:
Whoa, I had no idea these companies could go that way.


Seriously? You must be very new to computers then. Both companies have a long history of doing whatever they can to seem better than the other. The fanboy in me says Nvidia is worse about doing things.

Quote:
It doesn't seem like a big deal for end users though.


And as long as you sheeple think that way, the companies will continue to do what they can.


It seems I misplaced the "insert sarcasm"; it was meant for the whole "Whoa, I had no idea these companies could go that way."

And as for the last one, I still think it's not a big deal for end users, simply because I can't see a direct impact on my wallet. Still, this is only what one person thinks.
November 29, 2010 2:41:59 AM

First, you were lied to. Second, it's an impact on your wallet because you (possibly) bought the wrong card. Assuming you want the best bang for your buck, you need to buy the best-performing card for you. If you look at benchmarks that artificially inflate scores because the drivers know you're running benchmarkgame.exe, and you buy the "wrong" card, then you didn't get a good deal. Sometimes you can't even notice the errors, but as the link I provided shows, sometimes you can.
November 29, 2010 3:03:25 AM

Best bang for buck has almost always been ATI, and I bought according to that. And as I stated, I don't see any holes in my wallet; the card is exactly what I expected it to be at the exact same price.
November 29, 2010 4:03:55 AM

kiban said:
Best bang for buck has almost always been ATI,


...not really. They had some crappy series, even against Nvidia's high prices.
November 29, 2010 4:16:19 AM

I can see you're not going to understand what I'm trying to say. Oh well, you can lead a horse to water, but you can't make it drink.
November 29, 2010 5:19:12 PM

I do understand it, that's why I can say it's no big deal for me. I'm aware of what you are saying, and still...
Why? Well, even if that "fact" hadn't filtered into the news, we would be happy with our ignorance and with our cards. Now that you've learned that what we all know happens in the shadows, you get angry? The first step would be not to buy if you understand it and don't agree with it.

I do understand the companies' trickery, and I'm fully aware that something somewhere may not be what it seems; still, I bought the card. Do you understand me now?
November 30, 2010 12:15:04 AM

Quote:
I do understand it, that's why I can say it's no big deal for me.


Not as long as you write things like that.
November 30, 2010 12:35:49 AM

Let me know when AMD responds to being called out by Nvidia for cheating and dragging IQ into the gutter.
It's hilarious how 'some' turn AMD cheating into an Nvidia issue / oh the drama lol
http://blogs.nvidia.com/ntersect/2010/11/testing-nvidia...
Quote:
We have had internal discussions as to whether we should forego our position to not reduce image quality behind your back as AMD is doing. We believe our customers would rather we focus our resources to maximize performance and provide an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.


November 30, 2010 1:26:59 AM

Eh, the difference is that your example is purely asinine. AMD getting better performance by reducing the image quality by a completely unnoticeable amount, even in stills, is not cheating, but more of a feature in my book. I think the term is called optimization.

nVidia may be doing the same, but this time it actually changes the image quality slightly and tricks reviewers.

Note that I don't think that this is very major and I'm not exactly angry over it, but it is something to be considered by reviewers since it has a little bit of ground to stand on, while your example really doesn't.
November 30, 2010 1:39:52 AM

AMW1011 said:
Eh, the difference is that your example is purely asinine. AMD getting better performance by reducing the image quality by a completely unnoticeable amount, even in stills, is not cheating, but more of a feature in my book. I think the term is called optimization.

nVidia may be doing the same, but this time it actually changes the image quality slightly and tricks reviewers.

Note that I don't think that this is very major and I'm not exactly angry over it, but it is something to be considered by reviewers since it has a little bit of ground to stand on, while your example really doesn't.

What's asinine is noobs going on about stills, when every article clearly tells you, if you bother to read, that you can't see it (the IQ degradation) in stills; it shows up as shimmering.
They gave you the word "optimization" in their blog posting, and they clearly tell you why it's cheating. Cheating by ATI's own definition: reducing the amount of work the GPU does at the cost of IQ loss.


Quote:
Filter Tester Observations
Readers can observe AMD GPU texture shimmering very visibly in videos posted at TweakPC. The popular Filter Tester application from 3DCenter.org was used with its “ground2” texture (located in the Program Files/3DCenter Filter Tester/Textures directory), and texture movement parameters were set to -0.7 in both X and Y directions with 16xAF enabled. Each video shows the split-screen rendering mode of the Filter Tester application, where the GPU under test is on the left side, and the “perfect” software-based ALU rendering is on the right side. (Playing the videos with Firefox or Google Chrome is recommended). NVIDIA GPU anisotropic quality was also tested and more closely resembles the perfect ALU software-based filtering. Problems with AMD AF filtering are best seen when the textures are in motion, not in static AF tests, thus the “texture movement” settings need to be turned on in the Filter Tester. In our own testing with Filter Tester using similar parameters, we have seen that the newly released Catalyst 10.11 driver also has the same texture shimmering problems on the HD 5870. Cat 10.11 does not work with HD 6000 series boards as of this writing.

AF Tester Observations
ComputerBase also says that AMD drivers appear to treat games differently than the popular “AF Tester” (anisotropic filtering) benchmark tool from 3DCenter.org. They indicate that lower quality anisotropic filtering is used in actual games, but higher quality anisotropic filtering is displayed when the AF Tester tool is detected and run. Essentially, the anisotropic filtering quality highlighted by the AF Tester tool on AMD GPUs is not indicative of the lower quality of anisotropic filtering seen in real games on AMD GPUs.

NVIDIA’s own driver team has verified specific behaviors in AMD’s drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, like the AF Tester tool uses, and they enable their optimizations for larger window sizes. The definition of “larger” and “smaller” varies depending on the API and hardware used. For example with DX10 and 68xx boards, it seems they disable optimizations with window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than RV870, with optimizations performed across a larger range of LODs for the RV840/940.
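
Reading the window-size observations above, the alleged behaviour boils down to a simple threshold check. Here is a tiny sketch of that logic as described in the quote; it is purely illustrative pseudo-logic, not actual driver code, and the pixel thresholds are just the rough figures quoted.

Code:
// Illustrative only: the window-size heuristic described in the quoted post.
bool UseReducedTextureFiltering(int windowWidth, int windowHeight, bool isDX9)
{
    // Per the quoted observations: filtering optimizations are said to be
    // disabled for small windows (roughly under 500 px per side for DX10,
    // under 1000 px per side for DX9 on the hardware tested). Small AF-tester
    // windows would therefore show full-quality filtering, while full-screen
    // games would not.
    const int threshold = isDX9 ? 1000 : 500;
    return windowWidth >= threshold && windowHeight >= threshold;
}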
November 30, 2010 2:14:41 AM

AMW1011 said:
Eh, the difference is that your example is purely asinine. AMD getting better performance by reducing the image quality by a completely unnoticeable amount, even in stills, is not cheating, but more of a feature in my book. I think the term is called optimization.


I believe so too; hence, NVIDIA's 32xAA is somewhat retarded and pointless.
November 30, 2010 6:51:39 AM

notty22 said:
What's asinine is noobs going on about stills, when every article clearly tells you, if you bother to read, that you can't see it (the IQ degradation) in stills; it shows up as shimmering.
They gave you the word "optimization" in their blog posting, and they clearly tell you why it's cheating. Cheating by ATI's own definition: reducing the amount of work the GPU does at the cost of IQ loss.


I've seen the comparisons, and I've yet to see a difference. I have yet to hear of anyone, save you, who sees the difference. Where is AMD dragging the IQ into the gutter? Furthermore, what kind of biased, slimy comment is that anyway? Even nVidia's actually tangible "cheating", if you can even call it that, is difficult to detect.

Quote:
So the difference at default driver setting in-between AMD and NVIDIA is as far as we are concerned NIL.


http://www.guru3d.com/article/radeon-hd-6850-6870-revie...

But you know better, right?

As for what ATI has said, as far as I can tell it makes no real IQ difference, therefore it's a pass. If for some reason it doesn't quite fit their criteria, then congratulations, because they're both hypocrites, again.