
Apparently Nvidia is cheating...a little

Posted in Graphics & Displays
January 3, 2008 3:25:57 AM

http://www.pcgameshardware.de/?article_id=626622

28.12.2007, 09:10 - According to current investigations by PCGH, Nvidia's drivers don't always render what you expect them to. We compared older and brand-new drivers on a GeForce 7 and a GeForce 8, under Windows XP x86, Vista x86, and Vista x64.


Quote:
The Control Panel allows the user to disable all the "optimizations" concerning anisotropic filtering - but it is now obvious that these options only affect Direct3D, not OpenGL. Disabling the "optimizations" and setting "High Quality" still results in shimmering textures in our tested games (American McGee's Alice and Prey). RivaTuner shows why: the OpenGL filtering restrictions are still active, as shown in the pictures below.

After deeper testing we came to the conclusion that Forceware 93.71 WHQL is the last official driver whose Control Panel settings affect OpenGL. From version 101.09 (unofficial beta) up to 169.2x, the "optimizations" remain active no matter what is set in the Control Panel. If you want your GeForce to filter properly, you have to use third-party tools such as RivaTuner or nHancer.

Nvidia has been informed; we are currently waiting for a statement.




I can't confirm how much this affects frame rates, but it must be worth at least a few FPS in benchmarks. It was odd to notice that they tested these results using American McGee's Alice. I wonder how an updated OpenGL engine would be affected.


January 3, 2008 3:31:07 AM

I'd also like to add that this gives some reassurance to those who have been flamed for their "feelings" that AMD seemed to be offering better visuals.

It may also raise a question about Nvidia's dominance within OpenGL.

But honestly, I use an 8800GTS, and I feel any degradation in quality is due mostly to the fact that I'm using an NEC with an IPS panel vs. my old NEC CRT.
January 3, 2008 3:35:55 AM

My apologies...I was so focused on the first few pictures they supplied.


They also have comparable screenshots from Prey.
January 5, 2008 11:18:15 PM

I guess nobody cares these days :) 

Seemed like a hot topic not so long ago.
January 6, 2008 12:01:09 AM

I knew something was wrong with Nvidia... if it's too good to be true, it probably is!
January 6, 2008 12:44:53 AM

This isn't that big of a deal, and it isn't by any means new information. The only thing that's different is that they confirmed it is still happening in newer drivers. Their "optimizations" are nothing more than an attempt to get the best performance at the sacrifice of quality. ATI has done the same thing with Catalyst AI.
I remember this back when 3DMark 2001 was still big. Nvidia "optimized" their drivers to run 3DMark 01. A good concept, if it didn't produce false scores. With a Ti 4200 you could get numbers comparable to a Radeon 9800 XT.
I don't much care. If it looks good and plays well, then why bother? Who cares how they did it.
January 6, 2008 1:03:28 AM

Certainly, it isn't anything new.


However, there were several forum members debating whether or not these allegations were still credible. As it stands, some forum members, whom I saw being flamed for stating that ATI seemed to be doing a bit better at the moment, were correct.



"Best performance at the sacrifice of quality"


That's why the 8800 isn't a Quadro part. That's exactly what the consumer segment of graphics cards amounts to: how many corners they (ATI or Nvidia) can cut within the OpenGL and Direct3D APIs.
January 6, 2008 1:12:20 AM

As long as these "cuts" aren't largely noticeable, I'm fine with that.
January 6, 2008 1:45:48 AM

heh, nvidia... that's not surprising to say the least

My take is that they're doing this because most PC games are DirectX-based and OpenGL isn't as popular.

I still hate nvidia for removing stereovision support (3D vision) from the 8xxx line
grrr.... they don't understand my 7950GT is struggling
January 6, 2008 1:50:00 AM

haha... remember those... crap, I can't remember their name.

Elsa Razor? that came with 3D Glasses?


I didn't realize anybody had 3D glasses anymore.
January 6, 2008 1:52:06 AM

I vote for OGL over D3D any day.

Game engines used to be programmed quite well for the OpenGL API, even though it was never really meant for gaming. It still gave superior images, imo.
January 6, 2008 2:15:08 AM

Both companies do this....

99% of the time the differences are not noticeable (but I do see it in Prey).