What I am seeing in the different reviews is that in older games, where framerates already exceed 60fps (usually well over 100fps), the 7900GTX is ahead (though not always with AA enabled).
Switch to high resolutions with AA in modern games (like FEAR) and it is the opposite.
A lot of this has to do with the "art of the review". I remember post-E3 2005 Nvidia had passed out some PowerPoint slides to the press, and in them they were specifically talking about how "True HD resolutions with AA are here!". Their point, seemingly in defense of RSX, was that AA had a minor impact on performance. To prove this, they noted that 1600x1200 places a similar demand on the GPU as 1080p, and then proceeded to list a dozen games with, and without, AA and its performance impact. Of course, the PROBLEM with this little exercise (which demonstrated a negligible impact on performance!) was that the three newest games at the time (Half-Life 2, Doom 3, and Far Cry) all suffered a 40%+ drop in performance!
In a nutshell, the exercise was meant to obscure the important fact: only games that were *CPU bound* or of older design (where every setting ran well over 100fps) were hardly impacted by AA.
I think the same critical eye needs to be applied to the X1900XTX and 7900GTX.
I have not read enough reviews to come to any conclusion, but I have a hard time calling the 7900GTX the performance king when I see numbers like this in a MODERN game:
51fps vs. 42fps @ 1600x1200 4xAA 16xAF
61fps vs. 38fps with soft shadows @ 1600x1200 16xAF
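To put those numbers in perspective, here is a quick sketch of the percentage lead they imply (assuming, as the comparison above suggests, that the first figure in each pair is the X1900XTX and the second the 7900GTX):

```python
def percent_lead(winner_fps: float, loser_fps: float) -> float:
    """Percentage by which the faster card leads the slower one."""
    return (winner_fps / loser_fps - 1.0) * 100.0

# 1600x1200 4xAA 16xAF
print(f"{percent_lead(51, 42):.0f}% lead")  # ~21%
# Soft shadows @ 1600x1200 16xAF
print(f"{percent_lead(61, 38):.0f}% lead")  # ~61%
```

A 21-61% lead at playable-vs-unplayable framerates is exactly the "push-and-shove" territory where the difference actually matters.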
Similar results in BF2 and CoD2. So far it seems the 7900GTX is winning some benchmarks (Q4, for example, and a number of older games), but it seems, in general (not always, of course), that when framerates get tight and you NEED to squeeze out performance in modern games to hit 60fps, the X1900XTX is the one pulling ahead.
I am interested to see some detailed benchmarks on the Dynamic Branching and Flow Control (SM3.0 stuff) in the 7900GTX. If it is on par with the 7800GTX 512MB OCed to 655MHz, then we are still looking at the X1900XTX beating the 7900GTX by 300%+ in such scenarios.
Hard for me to crown it "King of Performance" when it gets trashed in SM3.0 and struggles to win in shader-heavy modern games at higher resolutions with AA. Not that it does not win many, but on the whole a lot of the benchmarks I am seeing show the X1900XTX walking away in newer games.
And with shader-heavy games coming (that may begin to leverage the shader array advantage of the X1900 series) and more SM3.0 games coming (especially since both consoles are SM3.0 compliant), this gap could become further exacerbated.
Hopefully after reading more than a couple of reviews I can offer more thoughts, but I think this thread title is a little hasty.