Confirmation of what was already known? Read the articles. They are conveniently dated for you.
Out of pretty much EVERYONE here, I AM the one who reads nearly all of the articles. Perhaps you should read them too. Sure, the REVIEWERS (especially THG) didn't focus on the poor PS 2.0 performance and ShaderMark results, but the reality is MOST of us here already knew about it. HL2/Shader Day was simply confirmation that it would prove an issue in ACTUAL games and not just synthetic benchmarks (the excuse of people wishing to promote nV).
You also seem not to acknowledge DX9 games that show the FX ahead of, or neck and neck with, the Radeons.
Show me one and I'll agree. But I doubt you can show me one that involves PS 2.0.
Doom 3 and X2 jump to mind.
Well, first, DOOM ]|[ uses OpenGL for its graphics and DirectX only for AUDIO and other peripheral control (perhaps you should check the DATES on your research); and second, those benchmarks were run on the nV-centric path. In fact Carmack (you may have heard of him) says:
<i>"Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec."</i>
But then again I guess that passed you by as well.
You can see his comment about the different paths for nV and ATI in D]|[ in <A HREF="http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm" target="_new">THIS article</A>, which may also help you understand a little more of the implications.
As for X2, read Lars' comment in HIS review here at THG: <i>'This demo, which includes a benchmark mode, gives you a pretty good feel for what Egosoft's next game "X2 - The Threat" will look like. <b>Although the engine doesn't use any pixel shaders</b>, the graphics are nonetheless impressive.'</i>
So there's your answer right there: no pixel shaders to stress, therefore the FXs do as well as they would in any DX7/8 game.
Not to mention Aquamark 3.
Why mention it? It's barely got any DX9 components, the FXs lose to the Radeons under normal conditions, and turn on AA/AF and the FXs get slaughtered. The only time the FXs pull a SLIGHT lead is when the benchmark is run without AA/AF, which is NOT the default. Can't play fair yet again.
The Tomb Raider numbers are not extreme.
No, they aren't extreme, but neither is the level of shader effects. Turn on the LOD and other extras and you'll see the FX5900U perform worse than the R9600P under the standard path; the FXs need either Cg or run-time recompiling into Cg in order to come close to the Radeons' performance.
.. schtoopid analogy edited... the only case for poor DX9 performance on FX cards is... Tomb Raider AOD, where ATI's lead is about 30%?
That's not the only case, just the only one YOU know about.
The lead for Nvidia in Doom 3 is 30%.
According to what? Oh yeah, that's right: your DATED, nV-centric benchmarks. I forgot you're not very current with your references.
Simply put, see above.
And I'm not even going to go into Open GL where...
You know even less? Your ignorance above makes me think we should thank you for keeping it to one subject you know nothing about. And even in OLD OGL games there is barely a difference between the two; as for newer games, well, I guess you already read what Carmack had to say about one of the most anticipated OGL titles, so I don't see a reason to recap that.
EDIT: and go look at <A HREF="http://www.hardocp.com/article.html?art=NTQwLDY=" target="_new">[H]'s Call of Duty benchmark</A> in today's FX5700 review. Guess the Radeons can handle OGL just fine! Guess nV needs to do some more optimizing for that game too.
The NEW Det 52.16 drivers help the FXs reach nearly the same framerates, but only once the drivers are 'optimized' for a given game, and even then they don't always deliver (a drawback of on-the-fly recompiling, IMO); just look at the Max Payne 2 benchies so far.
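To illustrate why that kind of 'optimization' only helps games the driver already knows about, here's a toy sketch (my own, purely hypothetical; the shader names and the table are made up, and this is NOT nV's actual driver code): recognized shaders get a hand-tuned low-precision replacement, anything new falls back to the slow standard path.

```python
# Toy model of per-game shader replacement in a driver (hypothetical).
# Shaders the driver team has already hand-tuned for known titles:
HAND_TUNED = {
    "hl2_water_ps20": "FP16 replacement, faster but lower precision",
    "aquamark3_glow": "FP16 replacement, faster but lower precision",
}

def compile_shader(shader_id: str) -> str:
    """Pick the back end a submitted shader would actually run on."""
    if shader_id in HAND_TUNED:
        # Recognized shader: swap in the pre-optimized version.
        return f"{shader_id} -> {HAND_TUNED[shader_id]}"
    # Unrecognized shader (say, a brand-new game like Max Payne 2):
    # no replacement exists yet, so it runs the standard
    # full-precision path and takes the full speed penalty.
    return f"{shader_id} -> standard FP32 path, slow"

print(compile_shader("hl2_water_ps20"))         # 'optimized' result
print(compile_shader("maxpayne2_mirror_ps20"))  # new title, no luck
```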
I think that about covers it. Go and buy/keep your FX card, no one's stopping you; they are pretty good cards in many areas. Just don't use your ignorance to try and convince others that the graphics card world is flat and all cards are created equal.
Do some more research and try to actually READ the articles, and not just look at the pretty pictures.
EDIT: I thought that covered it, but I hadn't read on to your other inane rambling (or seen others' equal dismissal of your arguments) before replying to your first post, so I'll comment on this other bit of ignorance you spew:
The 16-bit precision argument always cracks me up, because when Nvidia uses 16-bit precision against ATI's 24-bit, everyone cries foul play because the Radeons are rendering higher quality. But no one has a problem saying it's fair when Nvidia is at 32-bit and ATI is at 24-bit, even though the FX cards are then rendering superior images.
Beyond the DX9 standard that others mentioned, here's the other reason it matters: MOST people (almost everyone) can see the difference between 16-bit and 24-bit precision, whereas the difference between 24-bit and 32-bit is almost imperceptible. That may not hold for workstation cards doing DIFFERENT work, but in most gaming situations it does. So ATI's solution, while not the absolute maximum image quality, is the perfect balance of IQ and speed. The interesting thing is that even when the nV cards run at 32-bit precision, most reviewers say the ATI output still looks better, so what benefit is there at all to nV's method? None that I can see. You can harp on it all you want, but unless the IQ is the same (which it isn't at 16-bit), there's no comparison. To reach parity the nV cards have to run at 32-bit, and then the framerates suffer greatly; to boost FPS they use partial precision, which causes a lot of missing effects. So you choose which one you want: lesser framerates or lesser quality effects.
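To put some rough numbers on that (a back-of-the-envelope sketch of my own, not from any review; the test value is arbitrary): the mantissa width of each format sets its rounding error, and FP16's error is big enough to matter for things like texture coordinates, while FP24's already lands far below anything an 8-bit display can show.

```python
import math

# Mantissa widths: FP16 = 10 bits, FP24 (ATI R3xx) = 16, FP32 = 23.
def quantize(value: float, mantissa_bits: int) -> float:
    """Round value to the nearest number representable with the given
    number of fractional mantissa bits."""
    _, e = math.frexp(value)               # value = m * 2**e, 0.5 <= m < 1
    step = 2.0 ** (e - 1 - mantissa_bits)  # spacing of representable values
    return round(value / step) * step

x = 0.73456789  # an arbitrary normalized texture coordinate / color value
for name, bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    print(f"{name}: rounding error ~ {abs(quantize(x, bits) - x):.1e}")

# FP16 comes out around 2e-4 -- roughly half a texel in a 2048-wide
# texture (1/2048 ~ 4.9e-4), which is why dependent-texture math starts
# showing artifacts at 16-bit. FP24 (~2e-6) and FP32 (~1e-8) both sit
# far below the 1/255 (~3.9e-3) step of an 8-bit display, which is why
# 24-bit vs 32-bit is almost imperceptible on screen.
```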
- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK