I've just read a very interesting article which I thought I would bring to the table and see what you guys/gals/aliens think about it:
Radeon X800 XT or GeForce 6800 Ultra? Tough choice
This is just one of those days when you have to pass judgment on two products that are both excellent but designed from two different perspectives. On the one hand you have ATi’s Radeon X800 XT: small, quiet, low power, but with plenty of processing power. On the other hand there’s Nvidia’s GeForce 6800 Ultra: big, powerful, with lots of raw processing power and plenty of new features, but also requiring more power. The Radeon X800 XT reminds me of a polished ruby: because it is based on a previous architecture, there’s already a solid base of drivers, so the Radeon X800 XT will be able to shine from the start, but it will also not be much more than that, a polished ruby.
The 6800 Ultra, however, is a diamond in the rough, freshly excavated and off to be cut and polished. That cutting and polishing process is what’s going to make or break the diamond; it can be either a true 24-carat diamond or one of lesser quality. What I’m referring to is that the success of the GeForce 6800 Ultra is largely determined by how Nvidia chooses to tap into the potential locked inside this new architecture. We’re talking about drivers and industry support for pixel shader 3.0 and 32-bit rendering; all these things have to fall into place to turn a rough diamond into a true 24-carat diamond. I’m quite sure Nvidia will go the distance to make sure their new architecture delivers on its promises, but it’ll take a while for drivers and industry support to mature.
So at the end of the day we have the Radeon X800 XT, a card that’s able to perform right out of the box, which has drivers that are mature enough to receive a WHQL certification soon, but also a performance level that’s basically set in stone. ATi has been working on similar drivers for the past two years, so don’t expect big leaps in performance; a marginal increase is about the best you can expect. And then there’s the GeForce 6800 Ultra, with release candidate drivers at best, drivers that show promise even though they’re still an early build. But the potential is there to boost performance significantly, offering performance levels far beyond what we’ve seen up till now.
And how about the clockspeed both these graphics processors run at? ATi used to have a lower clockspeed than Nvidia while offering better performance. Things seem to have reversed now: ATi needs a 520MHz graphics processor to keep pace with Nvidia’s 450MHz processor, and this means that Nvidia’s new architecture is faster and more efficient. Both use a 0.13-micron process to manufacture their processors; Nvidia cut a deal with IBM, whereas ATi turned to TSMC for their silicon. Both these manufacturers have what it takes, but neither will be able to get around the clockspeed limits imposed by the process used. About 600MHz is as far as you’ll be able to clock a >150 million transistor 0.13-micron part. This means that ATi is already giving itself very little headroom; Nvidia, however, has got some room to play, a good 150MHz, double what ATi has got left.
All things considered, ATi’s Radeon X800 XT is an excellent performer, but also one that has little headroom left. If Nvidia has good yields on their parts and decides to crank up the clockspeed to 500MHz tomorrow, ATi won’t be able to keep pace. If they also release a new driver which boosts performance by a considerable margin, ATi will be left biting the dust. ATi, however, has a part out today that has mature drivers and excellent performance; Nvidia has yet to ship their first card based on their NV4x architecture. I’d say the best card to buy today is ATi’s Radeon X800 XT, as it is a safe bet, with solid drivers and performance, and it'll be a top performer for a while. However tomorrow, I expect much more from Nvidia’s GeForce 6800 Ultra; ATi had to push real hard to keep pace with the first product of a new generation. To me that’s a telltale sign that Nvidia's new architecture holds promise.
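Just to put the article's headroom claim in concrete numbers, here's a quick back-of-envelope sketch in Python. It takes the article's own figures at face value (the ~600MHz ceiling for a >150M-transistor 0.13-micron part is the author's estimate, not a verified limit):

```python
# Headroom sketch using the article's assumed numbers:
# ~600MHz process ceiling, X800 XT at 520MHz, 6800 Ultra at 450MHz.
PROCESS_CEILING_MHZ = 600

def headroom(core_clock_mhz, ceiling_mhz=PROCESS_CEILING_MHZ):
    """Remaining clock headroom before the assumed process limit."""
    return ceiling_mhz - core_clock_mhz

ati_headroom = headroom(520)  # Radeon X800 XT
nv_headroom = headroom(450)   # GeForce 6800 Ultra

# 80MHz vs 150MHz -- roughly double, which is the article's point.
print(ati_headroom, nv_headroom)
```

Of course, as posters below point out, this assumes both chips actually share the same ceiling, which is itself debatable.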
Why downplay PS3.0? It's a big + for Nvidia right now.
Yeah, it's a good feature. But it's not yet in use in many titles, and the titles that come out for at least the next year will all support PS1.x and PS2.x.
No game studio will ship a game that requires PS3.0 for a long time, because they all know it will take time before the mass of gamers have PS3.0 cards. Even today's games like FarCry run with PS1.x, even though PS2.0 has been there for more than a year.
And by the time PS3.0 is required for games, the GeForce 6800 and X800 will be outdated products.
Lookin' to fill that <font color=blue>GOD</font> shaped hole!
When it's proven to be either a tremendous asset or a mere marketing feature, then we'll know.
Until then, it's a question mark.
As for this article, it kind of ignores the fact that the slower X800 Pro's shaders are more powerful than the 6800's when it comes to PS 2.0.
The 6800 is outclassed in raw power, but it remains to be seen if its PS 3.0 finesse will actually be able to provide the leverage it needs to be long-term viable.
Long term... I don't think that's gonna be good for the 6800 Ultra, since the NV50/R500 are coming out late this year. They'll both have PS3.0 support and will have moved to a 90nm process. I'm pretty sure those cards will tear the 6800 Ultra to pieces at PS3.0.
The 6800 is a great card. When they're released you can't heckle somebody who owns one.
However when choosing between the two cards I think you'd be hard pressed to make a decent argument in favor of the 6800 over the x800 unless you have some specific objective that the 6800 is a bit better at.
<font color=red>_______________________________________________</font>
Nov. 6, 1971: "I gave back, I can't remember, six, seven, eight, nine medals"
Hmm, cut and paste article. Why not just post a link?
Oh well, easier to quote I guess.
Not sure about the gemology comparisons (I guess fact checking isn't required), but a 24-carat diamond isn't necessarily better QUALITY than a 2-carat diamond. It depends on Clarity and Colour, two of the other three C's of diamonds (the four being Carat, Cut, Clarity and Colour). A 24-carat industrial-grade diamond will get nowhere near the money of a 2-carat Flawless D, or even a VS1 D. Once again, clarity and quality come into play.
BTW, cutting and polishing turns a 24-carat rough diamond into something less, like a 16-20 carat cut-and-polished stone (plus chips, etc.) if you're LUCKY, not simply another kind of 24-carat diamond.
On to the rest:
ATi has been working on similar drivers for the past two years, so don’t expect big leaps in performance; a marginal increase is about the best you can expect.
Not really; they just added a new compression strategy, 3Dc (which may be more useful near term than PS3.0), plus new AA and AF features, so a marginal increase is far from all you can expect.
And then there’s the GeForce 6800 Ultra, with release candidate drivers at best,
Except they got WHQL certification too, and if you're commenting on all the issues, then that describes the past and likely describes the future as well. Both companies have features to address in the new cards, and I doubt that either company will reach its performance apex for a while.
Early build or not, both have had time to tweak their drivers for launch; now they'll both be cleaning up the tweaks they missed.
But the potential is there to boost performance significantly, offering performance levels far beyond what we’ve seen up till now.
We'll see, but nothing's certain. 'Far beyond'? Yeah, I'll believe it when I see it. Some of the increases so far aren't ones I'd be interested in; performance above IQ is something I can tweak myself with sliders.
this means that Nvidia’s new architecture is faster and more efficient.
That's drawing a lot of conclusions in favour of something that is unproven. It doesn't mean either of those things. The reason is simple: nV couldn't get the same speeds with their non-low-k process that ATI could, in enough numbers to make that the standard.
Perhaps that will change when they switch to TSMC, but that's still an 'if'. The Gainward cards could give us a lot of insight into equal clock speeds.
And sometimes the X800 Pro performs better than the Ultra Platinum despite having fewer pipes. It's all relative: each card has its strengths and weaknesses, and a lot of it depends on the applications involved. In OpenGL alone, the higher core speeds of the ATIs do little to bring parity, and in many PS2.0 games the extra pipes of the NV40 Ultra don't help it against an X800 Pro. But the performance levels are close enough to make it completely application- and even setup-specific.
Both these manufacturers have what it takes, but neither will be able to get around the clockspeed limits imposed by the process used.
Except they don't use the same process. TSMC uses low-k dielectric silicon, and thus they have a much higher ceiling. nV balked at low-k because they said it was 'dangerous and reckless'. So it's not true that they both have the same limitations; the speed limit given is FOR the low-k process, and even then it's a guesstimate. For nVidia, knock off a few MHz to find their current speed limit until they go low-k or switch processes. So the headroom is about the same. We'll see exactly what it REALLY is once the Bungholio artists OC the crap out of them. And considering the size of the NV40 vs the R420, we may find that it overwhelms cooling solutions quicker than the ATIs do, but that's just conjecture at this point, albeit one a lot of reviewers seem to agree with.
If Nvidia has good yields on their parts
Which they don't. They don't have enough chips that'll reach UltraExtreme speeds to start wide distribution TO THEIR PARTNERS, let alone start selling them, or even speak of overclocking them further. Add to that the switch from IBM to TSMC, and the delay there, and I wouldn't expect great yields anytime soon.
and decides to crank up the clockspeed to 500MHz tomorrow ATi won’t be able to keep pace.
They are having trouble reaching 450MHz for the UltraExtreme on enough chips; what makes you think 500MHz is going to be a breeze? It's not a question of deciding. I'm sure they can decide to do so very easily; making it happen is something else entirely.
If they also release a new driver which boosts performance by a considerable margin ATi will be left biting the dust.
If. If.. If... A lot of blind faith in those words, and definitely far too early to make predictions like that. IF XGI decided to launch the DUO-X tomorrow and clocked it at 4GHz per chip (4 of them, of course), then there'd be world peace. Right?!?
However tomorrow, I expect much more from Nvidia’s GeForce 6800 Ultra; ATi had to push real hard to keep pace with the first product of a new generation.
Not really. If he knew enough about what's happened recently, he'd realize that ATI obviously didn't have to push that hard, because, as Crash and I said, if they did feel that heat we'd likely be seeing a whole different card/chip on the market.
To me that’s a tell tale sign that Nvidia's new architecture holds a promise.
It does hold promise, but whether it pays off or not depends on a lot of factors yet to be determined. Heck, like Cleeve said, 3Dc may have more impact than SM3.0 over the shelf life of these cards. It's nice to have future features, but not if, by the time they're useful, there's something else walking all over those numbers/cards.
Right now both are good cards, but neither is impressive enough to declare much of anything other than, "Hey, they're finally Here!" and "Now on to the NEXT Generation!".
- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font> <font color=red> GREEN</font> GA to SK