Apparently Nvidia is cheating... a little

I800C0LLECT

Distinguished
May 16, 2007
262
0
18,780
http://www.pcgameshardware.de/?article_id=626622

Dec. 28, 2007, 09:10 - According to recent PCGH investigations, Nvidia's drivers don't always render what you expect them to. We compared older and brand-new drivers on a Geforce 7 and a Geforce 8, using Windows XP x86, Vista x86, and Vista x64.


The Control Panel allows the user to disable all the "optimizations" concerning the anisotropic filter - but it turns out these options only affect Direct3D, not OpenGL. Disabling the "optimizations" and setting "High Quality" still results in shimmering textures in our tested games (American McGee's Alice and Prey). RivaTuner shows why: the OpenGL filtering restrictions are still active, as shown in the pictures below.

After deeper testing we came to the conclusion that Forceware 93.71 WHQL is the last official driver whose Control Panel settings affect OpenGL. From version 101.09 (an unofficial beta) up to 169.2x, the "optimizations" stay active no matter what is set in the Control Panel. If you want your Geforce to filter properly, you have to use third-party tools like RivaTuner or nHancer.

Nvidia has been informed; we're currently waiting for a statement.



I can't confirm how much this affects frame rates, but it must be worth at least a few FPS in benchmarks. It was odd that they tested with American McGee's Alice. I wonder how an updated OpenGL engine would be affected.
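
For anyone curious what the driver is actually ignoring, here's a rough sketch (my own illustration in C, not PCGH's test code) of how an OpenGL game typically requests full trilinear plus anisotropic filtering through the EXT_texture_filter_anisotropic extension. The article's point is that the driver can quietly substitute cheaper filtering underneath, no matter what the application asks for here:

#include <GL/gl.h>
#include <GL/glext.h>  /* enums for EXT_texture_filter_anisotropic */

/* Assumes the extension is present (it is on any Geforce 7/8). */
void request_quality_filtering(GLuint texture)
{
    GLfloat max_aniso = 1.0f;

    glBindTexture(GL_TEXTURE_2D, texture);

    /* Full trilinear: linear filtering within and between mip levels.
       The shimmering PCGH saw suggests the driver falls back to a
       cheaper mip transition ("brilinear") here regardless. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Ask for the maximum anisotropy the hardware reports (16x on
       these cards); the driver may still cap or approximate it. */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    max_aniso);
}

Tools like RivaTuner and nHancer presumably get around this by flipping the driver's internal settings directly instead of going through the Control Panel.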
 

I800C0LLECT

Distinguished
May 16, 2007
262
0
18,780
I'd also like to add that this gives some reassurance to those who have been flamed for their "feelings" that AMD seemed to be offering better visuals.

Maybe it does raise a question about Nvidia's dominance in OpenGL.

But honestly, I use an 8800 GTS, and I feel any degradation in quality is due mostly to the fact that I'm using an NEC with an IPS panel vs. my old NEC CRT.
 

F34R1355

Distinguished
Sep 9, 2007
101
0
18,690
This isn't that big of a deal, and it isn't by any means new information. The only thing that's different is that they confirmed it is still happening in newer drivers. Their "optimizations" are nothing more than an attempt to get the best performance at the sacrifice of quality. ATI has done the same thing with Catalyst AI.
I remember this back when 3DMark 2001 was still big. Nvidia "optimized" their drivers to run 3DMark 01. Good concept, if it didn't produce false scores. With a Ti 4200 you could get numbers comparable to a Radeon 9800 XT.
I don't much care. If it looks good and plays well, then why bother? Who cares how they did it.
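
To make the "optimization" idea concrete: as far as anyone outside Nvidia or ATI can tell, these schemes boil down to the driver recognizing the running executable and switching to a cheaper code path. A purely hypothetical sketch in C (the function and executable names are my own, not actual driver code):

#include <stdio.h>
#include <string.h>

/* Hypothetical illustration of executable-name detection. */
static int is_known_benchmark(const char *exe_name)
{
    return strcmp(exe_name, "3DMark2001.exe") == 0;  /* example name */
}

static void select_filtering_path(const char *exe_name)
{
    if (is_known_benchmark(exe_name)) {
        /* trade image quality for score: e.g. reduce trilinear
           filtering, clamp LOD, lower anisotropy */
        puts("reduced-quality fast path");
    } else {
        puts("full-quality path");
    }
}

int main(void)
{
    select_filtering_path("3DMark2001.exe");
    select_filtering_path("Alice.exe");
    return 0;
}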
 

I800C0LLECT

Distinguished
May 16, 2007
262
0
18,780
Certainly, it isn't anything new.


However, several forum members were discussing whether or not these allegations were still credible. As it stands, the members whom I saw being flamed for stating that ATI seemed to be doing a bit better at the moment were correct.



"Best performance at the sacrifice of quality"


That's why the 8800 isn't a Quadro part. That's exactly what the consumer segment of graphics cards amounts to: how many corners can they (ATI or Nvidia) cut within the OpenGL and Direct3D APIs.
 

kolix

Distinguished
Nov 26, 2006
56
0
18,630
heh, nvidia... that's not surprising to say the least

My take is that they're doing this because most PC games are DirectX-based and OpenGL isn't as popular.

I still hate nvidia for removing stereovision support (3D vision) from the 8xxx line
grrr.... they don't understand my 7950GT is struggling
 

I800C0LLECT

Distinguished
May 16, 2007
262
0
18,780
haha... remember those... crap, I can't remember their name.

Elsa Razor? That came with 3D glasses?


I didn't realize anybody had 3D glasses anymore.
 

I800C0LLECT

Distinguished
May 16, 2007
262
0
18,780
I vote for OGL over D3D any day.

Game engines used to be programmed quite well for the OpenGL API, even though it was never really meant for gaming. But it still gave superior images, IMO.