Sorry to resurrect this topic, but I neglected to respond to some things directed at me. I’m not sure what the general attitude toward resurrecting topics on THG’s Forumz is, but usually one week isn’t all that bad if it’s brought up for good reason. Again, I apologize in advance.
Well, prices definitely have changed, at least in the USA. Unfortunately, the card in question is AGP, so the only improvement of note would be the new GeForce 6800GS, which isn’t all that badly priced at about $225 US. (link)
And on another note, how is it that some are taking me to be an nVidia fanboy? I do know that my post regarding the two cards perhaps gave the 6600GT more credit than it was due, but that would primarily be because the performance advantage of the X800pro over the 6600GT needed very little said about it. For any interested parties, I use a Radeon X800XT anyway.
The thing about "FP blending" is that it can be reproduced using other techniques; for the ATis it would require 3 passes to achieve the same effect, and I'm pretty certain the X800Pro could do 3 passes in nearly the time it takes the GF6600GT to render a scene in FartCry with HDR enabled. Of course, you still need to code for that too.
“FartCry...” I’ll have to remember that one. I was fairly certain that the X series cards could handle filtering and blending in floating-point mode, as I’ve seen what are clearly examples in their own tech demos. (being the video card junkie I am, I have all of ATi’s tech demos for the Radeon 7200 through the X850, the highest that will run on my X800XT) However, as I said, if no one bothers to program using a feature, it’s worthless, and thus far, I’ve seen no one use it.
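For what it’s worth, whether a card can actually filter and blend FP16 surfaces is something a game can simply ask Direct3D rather than guess about. Here’s a minimal sketch (C++/D3D9; the function names are mine, not from any shipping engine) of the two capability checks in question — on a GeForce 6 series card both typically succeed, while on an X800 both typically fail:

```cpp
#include <d3d9.h>

// Minimal sketch: probing D3D9 for the two FP16 caps at issue here.
// "d3d" is assumed to be an already-created IDirect3D9 interface.

bool SupportsFp16Blending(IDirect3D9* d3d)
{
    // Can we alpha-blend into an FP16 render target?
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
}

bool SupportsFp16Filtering(IDirect3D9* d3d)
{
    // Can the texture units bilinearly filter an FP16 texture?
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_FILTER,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
}
```

If either check fails, the renderer is stuck with exactly the kind of multi-pass emulation described above.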
However, recent information regarding the upcoming Elder Scrolls IV: Oblivion found at Beyond3D suggests that the game will be an exception. Apparently, it will not only use floating-point color for its HDR method, it may even work with MSAA on any SM 2.0-compliant card, though what was said makes it inconclusive. It was noted that, for the aforementioned reasons, “blending” was impossible to do without a significant performance hit, so what we’ll likely see is simply a failure of some blending effects, such as alpha blending... And honestly, when’s the last time anyone really cared that a game got alpha blending correct? If memory serves me correctly, it was the acceptance of this problem that led to the development of special AA techniques like nVidia’s “transparency supersampling” and ATi’s “adaptive AA.”
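Incidentally, whether MSAA can be combined with a floating-point render target is also just another queryable cap, so a renderer doesn’t have to guess there either. A hypothetical check (again C++/D3D9, names mine) might look like this:

```cpp
#include <d3d9.h>

// Hypothetical check (names mine): can MSAA be applied to an FP16
// surface? "samples" would be e.g. D3DMULTISAMPLE_4_SAMPLES.
bool SupportsMsaaOnFp16(IDirect3D9* d3d, D3DMULTISAMPLE_TYPE samples)
{
    DWORD qualityLevels = 0;
    return SUCCEEDED(d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F, TRUE /* windowed */,
        samples, &qualityLevels));
}
```

As far as I know, this check fails on the FP16 targets of the cards discussed here, which is exactly why HDR and MSAA have been mutually exclusive so far.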
FartCry does use SM2.0 extended; that's where you get geometric instancing support, and a lot of games are adding that since the benefits go to all R3xx series cards and above and all GF6 series cards and above. The edge to the GeForce in that case is negligible at best.
It does? I honestly wasn’t aware of that; the game looked the same on a Radeon 9600XT as it does on an X8x0 card. Then again, I’ve heard some suggestions that the “XT” 9 series cards were SM 2.0 extended as well... And I have no other truly SM 2.0 cards; I think the “PCI” part of my old GeForce FX 5200 kinda nullifies things, as it won’t even run Far Cry, Halo, or any other game with an SM 2.0 mode.
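For anyone curious what geometric instancing actually looks like at the API level, here’s a rough D3D9 sketch: one mesh drawn N times, with per-instance data fed from a second stream. Buffer creation and the matching vertex declaration are omitted, and all names are mine. (D3D9 formally ties this to SM 3.0 hardware; as I understand it, ATi exposes it on R3xx/R4xx parts through its drivers, which is why those cards benefit too.)

```cpp
#include <d3d9.h>

// Rough sketch of D3D9 geometry instancing: one mesh drawn
// numInstances times, with per-instance data (say, a world position)
// supplied by a second vertex stream.
void DrawInstanced(IDirect3DDevice9* device,
                   IDirect3DVertexBuffer9* meshVB,
                   IDirect3DIndexBuffer9* meshIB,
                   IDirect3DVertexBuffer9* instanceVB,
                   UINT numInstances, UINT numVertices, UINT numTris)
{
    // Stream 0: the mesh geometry, repeated numInstances times.
    device->SetStreamSource(0, meshVB, 0, 8 * sizeof(float));
    device->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);

    // Stream 1: per-instance data, stepped forward once per instance.
    device->SetStreamSource(1, instanceVB, 0, 4 * sizeof(float));
    device->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

    device->SetIndices(meshIB);
    device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                 numVertices, 0, numTris);

    // Reset stream frequencies so later draw calls behave normally.
    device->SetStreamSourceFreq(0, 1);
    device->SetStreamSourceFreq(1, 1);
}
```

The win is that one draw call replaces hundreds, which matters far more for CPU overhead than for which GPU brand is running it — hence the "negligible edge" above.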
Except when the Sapphire X800Pros were selling for the same price as the GF6600s in the middle of last year. Now they're rare as sin, but second-hand, OEM, or refurb deals may create the same scenario, so 'insane' is a little strong.
And lo and behold, we now have X800GTOs (the 128MB version) from Sapphire selling for close to the same price as 6600GTs!
I find that to be a pleasant surprise; now if only they'd become available for AGP...
Now that's insane talk!
Well, it is what I’ve seen at points. It’s much better now, but primarily because the 6600GT seems to be drying up and rising in price. But at some points, I noted that the X800pro would cost in the $240 US neighborhood while the 6600GT was around $120 US.
Yeah, but price/performance will likely still favour the X800Pro, and also give you the benefit of having playable framerates longer, IMO.
Of course like I originally said, the X800GTO or GF6800GS would be the better choices for price/performance.
Indeed; my very first line was the comment that, as far as performance goes, the X800pro can typically hand a 6600GT its rear end. I didn’t mean to put so much emphasis on the 6600GT’s other qualities, but in retrospect it seems that I did, considering how much space I devoted to them.
The X800 PRO will absolutely RUIN the 6600 GT.
It would still ruin it if the 6600 GT had 1024 megs of ram and the X800 PRO had 128 megs of ram.
The interface (AGP or PCI-E) makes no difference, more RAM or no.
SM 3.0 is maybe worth less than a half-point to the 6600 GT, especially since my 6800 Ultra is not fast enough to run the SM 3.0 HDR in Far Cry. The 6600 GT doesn't have the horsepower to do squat with SM 3.0.
It's a checkbox feature, nothing more.
The X800 PRO has 12 pipelines and a 256-bit memory interface, the 6600 GT has 8 pipes and a 128-bit interface. No contest. Even if you were a total Nvidia fanboy, you'd be supremely retarded to choose a 6600 GT over an X800 PRO. The X800 GTO would also kill a 6600 GT.
I'd even be hard pressed to choose the 6600 GT over an X800 GT, but at least that's a much closer race. The X800 GT has the 256-bit memory interface (good for AA & other eye candy), but the 6600 GT has very efficient architecture and, in such a close race, SM 3.0 might come into play as a factor.
Well, as I commented, I do not disagree that the X800pro would slaughter the 6600GT in a fair performance fight. As for the interface comment, it can make a difference where texture reading is concerned: AGP 8x peaks at roughly 2.1 GB/s one way, while a PCI-e x16 slot offers about 4 GB/s in each direction, and PCI-e's multi-lane serial nature makes it far more flexible about re-proportioning bandwidth on the fly to accommodate reading textures from main RAM, as happens when a game needs more texture data than the video card can hold.
The SM 3.0 is a wild card, heavily dependent on the game. As far as I’ve actually been able to find out, it really offers nothing truly improved over SM 2.0 extended, and it merely makes me bitter that so many game developers have slighted those with SM 2.0 extended cards by making an SM 3.0 path instead. (and it’s not just that I use an X800XT) That path often only offers things that could’ve been accomplished with SM 2.0; from all the information I’ve found, the aforementioned Oblivion seems to be a prime example: it will apparently incorporate all the soft-shadow and HDR technologies seen before and more, and it will all run in SM 2.0. (it will run much faster with SM 2.0b or SM 3.0 support, as plain SM 2.0 will require many effects to be done in multiple passes instead of one) And again, how much power is really necessary to do something with SM 3.0, just as with how much power is necessary to use so much video RAM, depends on the game, not the card.
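To put that in concrete terms, here’s a hypothetical path picker of the sort a renderer could use, assuming it ships separate PS 2.0 / 2.0b / 3.0 shader sets (all names are mine). D3D9 doesn’t report “2.0b” as a distinct version number, so the 2.0-extended check goes through the PS20Caps instruction-slot count, which is 96 on baseline 2.0 parts and far higher on the X800 series:

```cpp
#include <d3d9.h>

// Hypothetical shader-path picker, assuming the renderer ships
// separate PS 2.0 / 2.0b / 3.0 shader sets.
enum ShaderPath { PATH_PS20, PATH_PS20B, PATH_PS30 };

ShaderPath PickShaderPath(const D3DCAPS9& caps)
{
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return PATH_PS30;   // long single-pass shaders

    // "SM 2.0 extended" (2.0b on the X800 series) still reports
    // version 2.0, but with far more than the baseline 96 slots.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) &&
        caps.PS20Caps.NumInstructionSlots > 96)
        return PATH_PS20B;  // fewer passes than plain 2.0

    return PATH_PS20;       // multi-pass fallback
}
```

The one-pass-versus-many trade-off above then falls out of which path gets picked, not out of the raw power of the card.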