Oh boy. The long, drawn-out battle posts with Ape. Sometimes I think you just use the tactic of wearing someone out! LOL
Pretty much positive all around. The only reservation that gets me is that everyone was happy with the IQ of the G70 until the R520 came out, and then people said, "Oh! That's what's missing."
Oh, and this is because ATI wasn't ever the definitive IQ king... as G80 is today.
In the days of yore, NV typically had better AA and ATI had better AF.
Very selective memory you have there. When the GF7 came out the X850 still had better standard AA, and when the X1800 came out it continued the better AA with the added levels. The only time nV even had an edge was with the adoption of Transparency AA, and then that got negated too. nV's SSAA was the only thing comparable, but at a heavy performance cost, so it was never worth enabling versus upping the resolution.
I was referring to GF7 vs X1900. I'm not going to go over ancient history with you. Though you know I'm more than capable of doing so... what's the point?
All this conversation you posted above doesn't mean anything.
In the end, NV had better AA... ATI had better AF.
Most people went out on a limb and said the visual difference between ATI AA and NV AA wasn't nearly as large as the difference between ATI AF and NV AF... thus, ATI was crowned "IQ king" (by some). But this was no clear-cut debate.
Wasn't even a question of going out on a limb, since the difference was between a usable feature (HQAF) and an unusable feature (SSAA).
I used 8xS. It was very playable in certain games and resolutions.. also with SLI if one had it.
It's a limb. It broke and you fell (apparently on your head).
Especially when many sites that dissected ATI and NV drivers and IQ recognized that Nvidia drivers actually did more work on a scene than ATI's.
Many sites? Like who, Anand? It doesn't matter how much 'more work it does' if the results don't match that work.
The results did match... there was never a clear-cut winner in the GF7/X19 race for IQ. I'm really sorry. :roll:
This is all ancient history now, so hardly worth dredging up. But that was what came to mind when I read about your reservation.
If that's what comes to mind, then you really need to reconsider what you remember from the era. The other thing to remember about GF7 AA was no FP16 HDR + AA in hardware. So really, what am I missing about there being deserved reservation about what we don't know yet (like DX10 features, support, etc.)?
I will admit, at first I had the same "reservation," but the answer is quite clear to me: there never was a definitive, all-out, hands-down IQ king before G80.
Yeah, but you're missing what I'm saying. Prior to the release of the X1800, the GF7800 had the same angle-dependent AF as the X800, and similar AA with the option for extra, so at the time it too could be considered the hands-down king of IQ... until the X1800 showed that they still had work to do. The G80 beats the X1900, no argument there, and definitely the GF7 series, but whether it will keep the title, or whether there is still more to be learned about feature interaction, is far from determined at this point. Like I said, my reservation involves other possible issues that rarely get tested by reviewers, and usually wind up in either nVnews' forums (for nV issues) or Rage3D's (for ATi's issues). I suspect the next real test is when DX10 software and other compliant hardware makes it to market to truly test the new architecture.
To make a long story short: there really hasn't been a clear-cut IQ champ in the history of GPUs... at least not as clear-cut as this GeForce 8.
Possibly 3dfx vs early ATI (which wasn't too hot), or R300 vs FX. Even there, though, including the FX AF... there wasn't a head-and-shoulders winner.
So yes, people didn't say "oh, that's what we were missing!" before, because everyone knew what NV was missing...
No they didn't. Like I said, nV AF was fine at the time because no one knew that the AF on the R520 would be angle-independent, or that it would matter that much with the shimmering. Also, either no one knew or no one reported that there was the FP16 HDR + AA limitation.
NV AF was inferior. That's true, but what's the point? NV AA was still superior... and shimmering was -not- cured by either the X1900 or GF7 series. If any of the X800/X1800/X1900 had accomplished that, I think people would've come to a consensus that ATI was the crowned champ of IQ.
But shimmering still existed on both to an extent. Yes, far less on ATI, but it still existed, making it less of a "victory" celebration for ATI.
But it was hard to fault them to the point of not using Nvidia, because they had their IQ advantages as well.
It wasn't about not using nV; it's about the current hyperbole about how wicked everything is (including yours about perfect AF [BTW, did you not notice that the AF still shows signs of feathering?]), and not the potential limiting factors.
This is not specific to any one card maker, either; it's just about reviews in general. Most are more extensions of ads for the IHVs than investigations into the actual hardware and its features, functions, and limitations. That's why I still prefer B3D reviews: they're more about the features/functions than just FPS.
Perfect as in, as good as it probably is going to get. I'd be shocked to see a card with better AF and better performance... It will happen, but this is so close to being perfect that it's astounding.
Anywho, the G80 is undeniably the current leader in IQ. I'd like to see more tests for certain features (especially 2D video playback), but the true test will be once the intended DX10 features and hardware start getting full once-overs to compare. That's why I'd hold off calling anything perfect.
I think the D3D tester is proof enough of AF domination.
As far as testing everything out, such as video.. I'll be able to do that myself soon enough.
You should get one. Test it yourself.
The card is obviously spectacular, but I am interested in seeing what changes/improves with newer driver revisions. Since the driver is currently a separate download from the rest of the sets, it's clear this card is still cutting-edge software development at Nvidia. They will probably work on video later; right now they need to get the most games working well and improve performance as much as they can.