Spud, I'll repeat what I said before about this:
-3dMark is designed for one thing only: showing how far a card can be stressed in the event of absolutely crap optimization. This is important for several reasons, namely the presence of monkey programmers who did the graphics for games like Comanche 4 and Splinter Cell. I hope they got fired, because they made games that even a 4GHz Pentium 4 can't pump 80 FPS from, with the best of graphics cards.
That is 3dMark's best use: to show you how much your card can withstand. Can I believe that one day, if I played a DX 8.1 game that looked like Battle of Proxycon but was badly programmed, I could get about 5 FPS, like I do with my Ti200 in that test?
YES! Because, as nVidia explained, the test uses a very taxing and useless number of light sources, among other things. It is a credible test for exactly that reason, BECAUSE IT SHOWS YOU WHAT THE WORST CAN BE.
And as long as graphics programming like that of Comanche 4 and Splinter Cell keeps pumping itself into our market, we'll need such programs.
I strongly believe that no optimizations are needed in this benchmark, because, as I said, it is there to test the rawest of the raw in performance. And yes, that kind of worst case is possible when you have bad programmers.
-The reason we have gripes about these optimizations or cheats, as I and many others have said several times, is that some of them LOWER your image quality (while you claim they raise it). And those are CHEAT-class codes.
Now, you did tell me in our conversation today that it's not always noticeable. I agreed; however, I will say it again: with the clear water-effect degradation in 3dMark03's Nature test on some GeForce FXs (which COULD be driver bugs, of course), and some other things, it can be noticeable, and I would not live with a $500 card that can't even display my textures properly.
Carmack himself says that nVidia's FX line will be faster in Doom III. This man has huge relevance because he's down in the source code, seeing the ticks it takes for code to execute and how many texture passes he can do with each card's hardware.
I strongly believe in Carmack and his credibility. But here the issue is nVidia's drivers. If they are putting in cheats to make Carmack think the cards work so well, then it's a whole new story. Judging image quality between cards in Doom III should become even easier due to the sharpness and advancement of detail.
In response to your SSE2-on-Pentium-4 point: the issue is that those optimizations affect speed, not visual output. A card can have an optimization built in to load faster or render faster (as the Radeon 9700 and the GeForce FX 5800 have been capable of with live rendering). But when the optimization touches the visuals, the output HAS to stay consistent. If the Pentium 4 rendered the scenes rather than the GPU, and SSE2 was used, but Intel actually pulled a trick to lower floating-point precision to 64-bit rather than the x87's 80-bit (or whatever precision it was supposed to use), could we still call it a fair trick? Not if the visuals really end up worse (yay for software rendering!).
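To make the precision point concrete, here's a minimal sketch in C (my own illustration, not anything from nVidia or Intel): the same accumulation done in 32-bit floats versus 64-bit doubles drifts apart noticeably, which is the numeric equivalent of a driver quietly trading output quality for speed.

/* Sketch: lower floating-point precision is faster/cheaper but drifts
 * from the full-precision result -- the same trade-off as a driver
 * silently reducing render precision for higher benchmark scores. */
#include <stdio.h>

int main(void)
{
    float  low  = 0.0f;   /* 32-bit accumulation */
    double high = 0.0;    /* 64-bit accumulation */
    const double step = 0.1; /* not exactly representable in binary */

    /* Accumulate the same value a million times at each precision. */
    for (int i = 0; i < 1000000; i++) {
        low  += (float)step;
        high += step;
    }

    printf("32-bit result: %f\n", low);
    printf("64-bit result: %f\n", high);
    printf("difference:    %f\n", high - (double)low);
    return 0;
}

Run it and the 32-bit result lands visibly off the 64-bit one; now imagine that error showing up as banding or blurred textures on screen.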
Spud, I hope you can see our POV. I believe what I said is fairly representative of a good portion of the disappointed people here. We do not want image-quality drops, and we do look at 3dMark as the test that sees what a card can do if left on a desert island with no tools but its own body (or circuits, lol).
--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue: