It still strikes me as odd that 500MHz was needed for 350M triangles/sec.
Seriously, the rule of thumb has always been one million triangles per MHz, but it seems this card either spends over 30% of its cycles on pipeline bubbles, or has some shared architecture that lowers its total triangle throughput.
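That 30% figure is just quick arithmetic, which can be sketched out like this (assuming the commonly quoted specs: a 500MHz core rated for 350M triangles/sec, measured against the one-triangle-per-clock rule of thumb):

```python
# Back-of-the-envelope check on the triangle setup rate.
# Assumed figures: NV30 core clock and quoted peak triangle rate.
core_clock_hz = 500e6    # GeForce FX 5800 Ultra core clock
rated_triangles = 350e6  # quoted peak triangles per second

per_clock = rated_triangles / core_clock_hz
shortfall = 1.0 - per_clock  # how far below one triangle per clock

print(f"triangles per clock: {per_clock:.2f}")   # 0.70
print(f"shortfall vs 1/clock: {shortfall:.0%}")  # 30%
```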
Not only that, but the 500MHz core clock was most likely chosen mainly for this reason, to squeeze the extra triangles out, yet that "quantum" 175MHz jump hardly justifies such a disastrous debut for 0.13µm video cards.
At 68°C and 75W, with expensive and noisy cooling, this is most certainly not recommended for any AMD system running a Thunderbird in a very closed, poorly ventilated case!
Not to mention it could probably transfer enough heat to burn a PCI card in slot 2!
If nVidia had really worked on it, they could have designed for more triangles per Hertz like the R300 (which does one per Hz), clocked it at or below the R300's core clock, dropped such extreme cooling (perhaps even gone passive, since it is built on 0.13µm), kept plenty of headroom for the FX line, and still ended up with a faster card. And let's not even get into the abuse of DDR-2: going for an overkill 1GHz-effective DDR-2 on a 128-bit bus (wow, and it runs hot at THIS 'outdated' bus width), and still managing less real bandwidth. 48GB/sec of theoretical bandwidth, yeah right!
No, nVidia went for raw clock speed over efficiency, an approach even more shameful than the Pentium 4's departure from high IPC. If nVidia had done it right, they'd have done what I said with the core clock, gone for much slower DDR-2 but used the real 4-bit prefetch, OR switched like ATi to a 256-bit bus, doubled their bandwidth, and eaten ATi alive.
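The bandwidth argument above is simple math. Here is a rough sketch, assuming the commonly quoted memory specs (1GHz-effective DDR-2 on a 128-bit bus for the 5800 Ultra, roughly 620MHz-effective DDR on a 256-bit bus for the R9700 PRO; the 48GB/sec figure is marketing, not this formula):

```python
# Theoretical peak memory bandwidth: effective rate x bus width in bytes.
def bandwidth_gb_s(effective_mhz, bus_bits):
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

fx5800u   = bandwidth_gb_s(1000, 128)  # assumed 5800 Ultra specs -> 16.0
r9700pro  = bandwidth_gb_s(620, 256)   # assumed R9700 PRO specs  -> 19.84
fx_at_256 = bandwidth_gb_s(1000, 256)  # same chips, 256-bit bus  -> 32.0

print(fx5800u, r9700pro, fx_at_256)
```

Doubling the bus width doubles the number at the same memory clock, which is the whole point of the 256-bit argument.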
Instead they went for such very questionable component speeds that it feels weird that nVidia is even doing this. Judging by the current comments, I feel this board will quite easily become a pro-ATi board for a good while. This is what happens when competition gets fierce, and surprisingly, head honchos like nVidia failed miserably. It kind of shows that maybe Intel could face the same scenario unexpectedly. (Though it won't; the Willamette P4 was a humiliation, yet it sold well thanks to Intel's OEM support, marketing, and competitive pricing.)
My friend threw two lines of HAHAs on MSN after he heard the MP3s. Seriously, no one in their right mind can survive THAT. Even my Volcano 7 in summer was OK compared to this!
Free Swiftech MCX462s for everyone!!!
Oh, and let's not even get into the number of glitches it comes with.
Look, I can understand that a new-generation card will come with problems; the geForce 3 was exactly that (and even worse at first). But unfortunately this is now a time of extreme competition. Back then ATi was barely competing and nVidia had the market! Now, since ATi has created and is still creating more and more fans and potential buyers, nVidia's low-performing and buggy card CANNOT afford such errors in this capitalistic market. This time, they CANNOT. But I like the title of the article; well said, THG.
Those who said nVidia drivers are fool-proof should reword their statements, because now even the geForce 4s have been found to have Z-buffer glitches.
As an owner of a geForce 3 Ti200, I must say this card of mine feels better than nVidia's new generation, and I am sorely disappointed.
However, the card is not without its highs. It is good to see it perform decently well, and it did have some nice spikes occasionally (not to mention a very surprising 30MT/sec score in the 8-light test of 3DMark 2001, a two-fold jump which is simply eye-opening). But most of them were in the synthetic benches, which is not what we want to look at.
Even with the current R9700 PRO's rather weak driver performance (as stated and seen), it still competed extremely well.
Can you guys imagine the cheaper plain 5800? Maybe that 200MT/sec spec found at a reseller was not wrong. Dear god, that card had better sell for less than $200, and, god forbid, with a more adequate cooler.
The thing that did surprise me, however, was the price tag. I thought it'd go for the 100% unsellable $499, but it turned out to be $399, which in a way improves its overall selling chances. But even then, only fanboys will be tempted, and even if it reaches a $200 price tag one day, the noise level alone is a huge turn-off, one that can sink even very low-priced cards.
How on earth nVidia went ahead with this card is beyond me. No, really, it frankly surprises me there was no improvement whatsoever! They PROMISED cooler-noise improvements. And the benches we saw on the web seemed to indicate potential that never materialized! I simply don't understand how nVidia can launch the card like this with no refining, thinking in their right mind that suckers will buy it, when it is already being bashed worldwide. It's like they are operating with Intel's mindset and marketing prowess.
I now look at Mr. Flamethrower, who has always praised nVidia. Please do share your thoughts, as you seemed to indicate this card could win you over.
I am really happy for ATi; I am sure they are already predicting even higher sales, as I bet over 30% of the market was holding out to see how the FX would perform and retail. Now that 30% has decided, and it is going the other way. Hurrah, the stock will rise, and my school's virtual stock simulation portfolio will finally rise too!
EDIT: After checking some Anandtech benchmarks, which featured the plain 5800, I am left wondering just how much it will cost. The performance difference is not huge, and if it can sell for less than $299 with a refined, silent cooler, it just might be worthy. Even then, though, the cards have so many driver performance problems and lose so often in many tests that it is still a very undecided situation.
--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden.
<P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 01/27/03 04:35 PM.</EM></FONT></P>