nVidia duping consumers again?

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
I was wondering why my texture quality was so poor in FS 2004 (Microsoft's Flight Simulator 2004) on my nVidia 7800GTX 512MB card vs. my older ATI X800XT PE 256MB card. Increasing the mipmapping quality in FS 2004 only made matters worse, not better (shimmering).
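The shimmering is usually a mip/LOD problem. As a rough sketch (my own toy model, not nVidia's actual hardware logic), a GPU picks a mip level from the texel-to-pixel ratio; a negative LOD bias or "optimized" filtering selects a sharper mip than the scene calls for, which undersamples the texture and shimmers in motion:

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10):
    """Pick a mipmap level roughly the way a GPU does:
    level = log2(texel-to-pixel ratio) + LOD bias, clamped."""
    level = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(level, 0.0), max_level)

# A surface viewed at a steep angle covers ~8 texels per pixel.
print(mip_level(8.0))        # 3.0 -> properly filtered, slightly blurry
print(mip_level(8.0, -1.0))  # 2.0 -> sharper mip, undersampled: shimmering
```

This is why forcing the mipmap quality up (effectively pushing LOD negative) made the shimmering worse rather than better.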

I have two 7800GTX 512MB cards (one MSI and one eVGA), and SLI or not makes no difference in the AF quality -- something is up, and it doesn't feel good.

So I poked around and found these articles on nVidia going back to "duping" the consumers again:

http://www.theinquirer.net/?article=25807

http://www.3dcenter.org/artikel/g70_flimmern/index2_e.php

Since I don't like to judge on just a couple of sources, can anyone else confirm that nVidia is back at its old game of producing drivers that reduce image quality to improve performance?

Is AF16X on the nVidia 7800GTX really only equal to AF2X on the ATI X800XT PE??

I'm going to try to run some of my own tests to see if I can discover a potential issue with nVidia. If indeed they are over-optimizing their AF values in order to gain better frame rates/scores, then my $1600 worth of video cards will be returned.

Rob.
 

pickxx

Distinguished
Apr 20, 2004
3,262
0
20,780
V8VENOM said: "If indeed they are over optimizing their AF values in order to gain better frame rates/scores, then my $1600 worth of video cards will be returned."

Then return them, but I don't know what you're going to buy... ATI has similar optimizations for 'scores'. Why do you think the drivers that come out five months after the card does do so much better in 3DMark05 and other games? It's all about writing drivers that increase the frame rates without screwing up the game.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
It took some tweaking, but I was able to make it work well in FS2004.

Turning OFF optimizations has dramatically helped image quality but has cut my frame rates in half. I still got some frame stutters in COD2 maxed out at 1920 x 1200, even in SLI mode.

I wonder if ATI X1800XT crossfire can do any better in terms of quality AND performance?

Anyway, it seems benchmarking will have to be changed once again to eliminate the driver tricks -- maybe DX10 will resolve this once and for all. The 3DMark05 performance profile provided by nVidia is a joke -- it clearly tries to game the benchmark results.

Adjust the driver profile to get reasonable image quality and the 3DMark05 results go to crap on these 7800GTX 512MB cards -- no idea why none of the people reviewing these cards bothered to mention that.

I'll have to check out reviews of the ATI X1800XT CrossFire and see if the reviewers really have a clue in their testing process. These 7800GTX 512MB cards are good, but nowhere near as good as claimed, even in SLI, and many existing games still have problems with SLI mode.
 

pickxx

Distinguished
Apr 20, 2004
3,262
0
20,780
You make baby Jesus cry...

First you bitch about Nvidia optimizing their product to get higher 3DMark scores... then you bitch that your scores aren't high enough.

I am going to break it down for you... like I said before, they optimize the cards to get the best frame rate without messing with the quality. Yes, sometimes you get better quality by turning off the optimizations, but your cards will slow down... that's how it works. It's your choice.

Who cares about 3DMark scores? Yeah, they give a general idea about how fast cards are... but they don't actually do anything. People who tweak and mod can get scores of 10K+ with a single GTX... some even 12K+... the scores mean nothing.


ATI CrossFire isn't really available yet... find me a master card for an X1800XT, find me a master card for an X1800XL, or ANY card other than the X850. I haven't seen one yet, but I guess they could be out there somewhere...


Both ATI and Nvidia optimize their cards for 3DMark and all games... they put out drivers all the time that do this. Like it or not, that's how it works. And you just gave Nvidia a $1600 approval of their tactics... lol


Benchmarking is for bragging rights... DX10 won't stop it, nothing will... just play your games and quit worrying about benchmarks.


Does your car go 0-60 in the time it says it can? No. This happens everywhere.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
3DMark05 scores are meaningful when used to see what overclocking can do, what memory timings can do, what the various video card settings can do -- all on the same box. It is a great tool for understanding the implications of the many settings available on most video cards.

3DMark05 as a comparison tool against other PCs is less meaningful when there is not enough information provided about quality settings.

Actually, WHQL certification should require visual image quality testing for video card drivers. This is where Microsoft needs to step in, and it is also where DX10 can help control what a manufacturer (like nVidia or ATI) can "optimize".

I don't have a car, I have a truck, and yes, it will do 0-60 in the time it says, very consistently. If people knew how to drive, they too could be consistent 0-60 -- but that is a different topic, and your analogy doesn't translate well.
 

pickxx

Distinguished
Apr 20, 2004
3,262
0
20,780
But if they have the optimizations on, then within the same box they should act the same... and therefore you should be able to see what your OCs have done.

But again, they are meaningless numbers... if you get an extra fps you will NEVER notice. If you really want to know what your OC does, then build your own test.


Your car/truck going 0-60 is a perfect example, and you pointed out why yourself... if people learned how to do it, they could get those same 0-60 times. Same with you: if you learned how to optimize your settings and tweak, you could get some meaningless numbers too.

0-60 times are a perfect analogy for 3DMark05 scores... nice to brag about, but they mean nothing in the real world of driving on roads that aren't 100% straight and in games that require a different mix of system resources than that benchmark.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
0-60 times in my truck vs. magazine-posted times vs. manufacturer-posted times -- they are all identical. Why? Because it is a heavy vehicle going through an automatic transmission, where the human factor, environmental factors, etc. can't push the variance to a significant value.

Toss in a clutch and shifting and the 0-60 times start to vary because of the human factor. There is no human factor introduced when it comes to video card performance, and hence your analogy is NOT a good one.

Testing performance directly in a game (unless you run a demo script) is difficult because it is hard to reproduce the exact same scenario. So testing with 3DMark05 (or other tools) is an easier way to measure performance changes. Overclocking the CPU, memory, or video card and/or changing timings can produce significant changes in frame rates, especially at 1920 x 1200. Push the envelope and the graphics card settings start to become very significant (more than just one or two fps).
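For what it's worth, the repeatability point above can be sketched in a few lines -- a generic timing harness over a fixed, scripted workload (the function names and the dummy workload are mine, nothing vendor-specific), which is the same idea as replaying a demo script:

```python
import time
import statistics

def benchmark(render_frame, frames=200):
    """Time a fixed, scripted workload so runs are repeatable,
    then report average and worst-case frame rate."""
    times = []
    for i in range(frames):
        t0 = time.perf_counter()
        render_frame(i)          # same scene every run, like a demo script
        times.append(time.perf_counter() - t0)
    avg_fps = 1.0 / statistics.mean(times)
    worst_fps = 1.0 / max(times)   # the stutters show up here
    return avg_fps, worst_fps

# Stand-in workload; a real test would replay a recorded demo.
avg, worst = benchmark(lambda i: sum(j * j for j in range(5000)))
print(f"avg {avg:.0f} fps, worst {worst:.0f} fps")
```

Tracking the worst-case number as well as the average is what catches the frame stutters that an average-only score hides.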

As far as comparing 3DMark05 scores to other systems, it is good for ballpark testing -- i.e., if 10 systems with a similar video card/processor are all getting a 10500 score and my similar system is getting 7000, that can indicate a possible problem.
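That ballpark check is simple enough to write down -- a sketch (the numbers are illustrative, not real results) that flags a score sitting well below what similar systems get:

```python
import statistics

def looks_suspect(my_score, peer_scores, tolerance=0.15):
    """Flag a score more than `tolerance` below the peer median."""
    median = statistics.median(peer_scores)
    return my_score < (1 - tolerance) * median

peers = [10400, 10500, 10550, 10600, 10480]
print(looks_suspect(7000, peers))   # True  -> worth investigating
print(looks_suspect(10300, peers))  # False -> within normal variance
```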
 

pickxx

Distinguished
Apr 20, 2004
3,262
0
20,780
All the 3DMark05 scores I see for SLI'd 512s are 12-13K stock...


Have you read any reviews of the 7800GTX 512? Almost every one of the ones I read had image quality sections.


I just find it so interesting that you care so much about drivers optimized for higher scores when both ATI and Nvidia have done it. They are constantly putting out new drivers that get higher and higher scores in 3DMark05... with the same settings and hardware... they are finding new ways to manipulate code and take advantage of your resources, and that's what they SHOULD be doing.

What is in your rig -- the full system? The graphic shimmering is most likely an Nvidia issue, but if you are having slowdowns it could be other things. (I doubt it, because of the money you spent on the GPUs.)
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
Some of the reviews I read:
http://www.beyond3d.com/previews/nvidia/78512/
http://www.anandtech.com/video/showdoc.aspx?i=2607
http://graphics.tomshardware.com/graphic/20051114/

They cover basic AA, AF settings -- they don't talk about any of these settings:

Force Mipmaps
Conformant Texture clamp
Hardware acceleration (single display only)
Trilinear Optimization
Anisotropic mip filter optimization
Anisotropic sample optimization
Gamma correct antialiasing
Transparency antialiasing
Triple buffering
SLI rendering mode
Negative LOD Bias

For example, turn ON Transparency antialiasing and you get an immediate 33% frame rate hit.
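As a sanity check on that kind of figure, the percentage hit is just the relative drop between the two runs (the 90/60 numbers below are illustrative, not my actual results):

```python
def fps_hit(fps_off, fps_on):
    """Frame-rate cost of enabling a setting, as a percent of the baseline."""
    return 100.0 * (1.0 - fps_on / fps_off)

# e.g. 90 fps with the setting off, 60 fps with it on:
print(f"{fps_hit(90, 60):.0f}% hit")  # 33% hit
```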

My system is:
Asus A8N-SLI Premium
2GB Corsair
one MSI 7800GTX 512MB
one eVGA 7800GTX 512MB
AMD FX57
Creative Lab xFi Elite Pro
Dell 24" LCD
SilverStone Zeus 650 Watt SLI power supply
Zalman 9500 series CPU cooler, 4 case fans (Thermaltake)
Plextor 716A SATA DVD-R/RW
two WD Raptor 74GB SATA 10k rpm drives

I opted out of water cooling on this system -- that's my other rig.
 

pickxx

Distinguished
Apr 20, 2004
3,262
0
20,780
I know the FX57 is faster for games right now, but the X2s will soon take over... so that might give your GPUs some extra headroom performance-wise once the CPU can keep up... lol

I don't know, I have to jet off to work; I will look for a more in-depth review when I get home.
 

sheridan76

Distinguished
Dec 7, 2005
4
0
18,510
V8VENOM said:
[quoting the opening post about poor texture quality and AF on the 7800GTX 512MB vs. the X800XT PE]

Just as an FYI -- I found several articles that debunked the NVIDIA conspiracy/cheap-AF/shimmering claims as being a known and since-fixed driver issue. They do indicate that the filtering may be a bit cheaper, but generally speaking, the super-harsh critiques were not justified. In fact, the third article indicates that the SAME type of problem showed up on ATI cards.

http://www.bit-tech.net/news/2005/09/01/g70_texture_shimmer_fix/
http://www.bit-tech.net/news/2005/08/26/g70_texture_shimmer/
http://www.hardwareanalysis.com/content/article/1812/ (ATI shimmering)

Grass is always greener.
Sheridan
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
Yeah, I think you're correct on the shimmering issues -- some games don't exhibit the problem at all, while other games seem to have more trouble with it. I suspect it is a combination of coding to an nVidia specification to "resolve" the issue and working with nVidia drivers in a more efficient manner.

I'm sure developers have to jump through the same hoops for ATI drivers. I'm sure both stick to the DX spec, but I know from development that there are many ways to work within a spec to produce the best results.

FS2004 was probably developed on the 2002/2003 hardware of the time (and it worked well on that hardware); however, it seems the marriage of code and drivers didn't stand the test of time. I can't wait for FS2006, and I'm sure it will be jaw-droppingly good, with ALL these issues resolved.

As far as X2 goes, we shall see. I'm definitely NOT convinced X2 is anything more than marketing hype and more money for AMD/Intel, providing useless functionality to the end user. Since SLI ultimately has to go through a single bridge to the CPU, it doesn't matter how many processors you toss in a system -- you'll not see huge performance gains. Maybe X2 will have substance once motherboards/chipsets mature and have their own separate memory bus, PCI-E bus, etc. I'm hoping someone produces a motherboard that can truly and efficiently make use of multiple PCI-E slots and multiple CPU cores -- and then we have to hope developers (like myself) get the green light to code CPU-specific threads. That's a hard sell when you're talking about a $50 game that sells maybe 200,000-1,000,000 units (after everyone takes their share, you're looking at maybe $10 going to the development company).
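On the multi-core point: the kind of per-subsystem threading being described looks roughly like this (a toy sketch with made-up subsystem names and workloads; a real engine's job system is far more involved):

```python
import threading

# Toy sketch of splitting independent per-frame work (physics, AI, audio)
# across threads -- the "CPU-specific threads" idea from the post.
results = {}

def run_subsystem(name, work):
    results[name] = work()  # each thread writes only its own key

subsystems = {
    "physics": lambda: sum(i * i for i in range(10000)),
    "ai":      lambda: sum(i for i in range(10000)),
    "audio":   lambda: sum(1 for i in range(10000) if i % 3 == 0),
}

threads = [threading.Thread(target=run_subsystem, args=(name, work))
           for name, work in subsystems.items()]
for t in threads:
    t.start()
for t in threads:
    t.join()  # the frame is done only when every subsystem finishes

print(sorted(results))  # ['ai', 'audio', 'physics']
```

The catch the post alludes to is exactly this join: the frame is only as fast as its slowest subsystem, so the payoff depends on how evenly the work divides.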

Rob.