
R600 and ATI's Future - The cold hard facts

May 28, 2007 4:11:23 PM

and you are???

also the very first bit: "ATI was already the distant 2nd place finisher for 3 generations running (6800, 7800, 8800)"

Sooo what was the X1900XT doing?? I agree with the 8800 being superior... but the 7800 and 6800 vs. the X1900???

What planet are you living on?
May 28, 2007 4:19:27 PM

TBH I haven't yet read your whole post. I got far enough to read the incorrect statement about ATI being behind for 3 generations, and your credibility was lost at that point.
Yes I'm disappointed in ATI right now but I absolutely believe they can turn that around.
May 28, 2007 4:55:09 PM

I really doubt ATI/AMD would fall waaay behind the competition. They're still making sales in the mid and low-end segments of the market, where all the money's at. And I hear the reason for the development delays is cos they're too busy with the Xbox 360 graphics chips.
May 28, 2007 5:26:14 PM

i agree :lol:  this is a funny post hahahaha

i mean dude, this guy doesn't know shit about the computer industry hahahah

and oh yeah ..STFU AND DIE TOAD :lol:  :lol:  :lol: 
May 28, 2007 5:38:32 PM

ATi actually had the superior product in comparison to the Geforce 6 and 7 series, but they've never been as good at marketing their product as Nvidia is.
May 28, 2007 5:46:11 PM

Hmmm, okay whatever. Yawn....
May 28, 2007 5:59:11 PM

Quote:
The problem with competitive technology industries like GPUs is that when you fall behind consistently, you tend to keep falling behind and may even fall further behind. It's just not that easy to suddenly reverse course and switch gears; if you have a bad design, you might be stuck with it for a while. Just look at how long Intel was stuck with P4/Netburst; it took them years to follow up with Core 2. Intel was the dominant market leader and had all the OEMs to themselves, so they could afford to be screwed for years on end and still have the time to develop a competitive design.

DAAMIT does not have that luxury. ATI was already the distant 2nd place finisher for 3 generations running (6800, 7800, 8800), mainly because they were so late with a competitive part all 3 generations. Nvidia doesn't have any interest in slowing down and waiting for ATI to catch up, so they are happily developing G90, which is rumored to have 192 SPs and be in the 700-800MHz range in clock speed at 65nm. Nvidia might even stick with GDDR3 if they can get it up to 1200-1300 (2400-2600 DDR effective), as the new architecture that started with G80 is not particularly memory-bandwidth intensive.

ATI is now in a very deep hole. Their part is highly inefficient and provides very poor performance per watt, especially when you consider that G80 is still at the 90nm process node and still runs cooler and uses less power even though it is a bigger die than R600. Nvidia has delivered a part with a very innovative design that runs the SPs at a much higher clock than the core, resulting in needing fewer SPs, and therefore their transition to 65nm will result in a smaller die than ATI's, improving yields further. Nvidia's part has a very efficient render back end and ROP/TMU design that gives essentially free trilinear and anisotropic filtering, which ATI's part is struggling with, and at higher IQ. ATI also needs to fix whatever problems they are having with their ROP/TMU design to bring MSAA back on board where it should be, instead of sapping performance by doing it in shaders.

Overall Nvidia is in a very good position, as they pretty much hit every performance and IQ goal they had with the G80 design. ATI has a lot of work to do at the drawing board, which will result in further delays and put them further behind. At this point they might not have the R6x0 refresh part out before Nvidia has an entire new generation out in G90, and being a whole generation behind could be a deathblow for DAAMIT as they are also struggling to get Barcelona out and have it be competitive with Penryn.

The big advantage that R600 theoretically had over G80 was the enormous shader transistor budget. G90 is rumored to have a huge shader power increase over G80, if the 192 SP thing is to be believed, which will negate the only big theoretical advantage R600 had outside of the completely unnecessary and probably expensive excess memory bandwidth. R600 is so much weaker than G80 in basic rendering design, such as geometry setup, actual fillrate (even though the 512-bit memory bus and 1024-bit ring bus give it a massive theoretical advantage), texture filtering, and AA, that it's almost unbelievable DAAMIT would have released it in the state it got released in, except that we know DAAMIT was already 6 months late and had to get something, anything, out the door; plus, financially, DAAMIT is staring into the abyss.

Nvidia designed a powerful, efficient, balanced architecture with G80 and they can easily ride it for the next 2 years just with die shrinks and slapping more SPs onto it and clocking it into the stratosphere. ATI is in a big hole here, not unlike the big hole AMD is staring at with Barcelona versus Penryn, and they need to do something and it needs to be miraculous if they want to catch up with Nvidia now. Being a whole generation behind is the worst thing imaginable in the technology industry.

...And who the F.U.C.K. are you?
Let's take a look at previous generations:
X1900/X1950 series > 7900/7950 series (except for the 7950GX2 at high resolutions!)
X800/X850 series > 6800 series (BTW it's not a fair comparison, since the X800/X850 series doesn't support Shader Model 3.0, but I'd rather have an X850XT PE than a 6800 Ultra)
9800 series > FX series
Quote:
What planet are you living on?

On a planet called ROB & Friends!
Quote:
...This guy doesn't know **** about the computer industry...

Obviously he doesn't know the difference between popcorn & graphics cards!
May 28, 2007 6:05:18 PM

I seem to remember the Geforce 6 series being far and away better than the initial lineup of ATI cards. I remember the 6800GT wiped the floor with even the best ATI card in Doom 3, for example, and if you wanted to get good performance out of Doom 3 and Half-Life 2, which were the big games at the time, ATI seemed to be suggesting you should use different drivers for the different games, which, having just spent ages trying to install drivers for an X1950, might have been even more awkward than it sounds. And I think the 6600GT was also a lot better than anything ATI had competing with it for ages.

Of course, this was before ATI flooded the market with so many different models, with different and seemingly random combinations of letters, that I lost track. I believe they carried on developing and releasing that series of cards longer than Nvidia did with the 6 series, so they were probably better in the end (by which I mean essentially when they were no longer the latest generation), but certainly not to begin with, when I was researching what to buy (about 6 months after the GF6 launched).

And from what I have read I would agree that the Geforce 7900s were probably weaker than the X1900, but I seem to remember the 7800 pretty much got away without much competition for a long time.
May 28, 2007 6:10:26 PM

I believe the X850XT PE was released a long time after the 6800 Ultra, and the 6800U was better than the cards it launched alongside: the X800XT and the more limited (edition-wise) X800XT PE, which should perhaps be compared with the 6800U Extreme, if any of those were ever actually made in the end. And with a real comparison like this, even the 6800GT beat the best ATI card in some games (Doom 3, for example).

Comparing the 6800U to an X850XT PE is like comparing a 7800 to an X1900.

edit: I should probably add, though, that I also stopped really listening after the "statement of fact" wording about the 3 generations, even though I only really disagree with the middle generation, and that the post in general seems a bit... flamey (and a bit borrowed).
May 28, 2007 6:13:16 PM

Quote:
ATi actually had the superior product in comparison to the Geforce 6 and 7 series, but they've never been as good at marketing their product as Nvidia is.

I'll agree to that. Intel had their ding ding ding Intel Inside, which sold them a bundle of duff chips for a few years. Nvidia has their catchy out-of-breath "Nvidia" thingomebobwhatchamecallit, which also sells a bundle. ATI has a sticker for cases, and errr, AMD has errr... also a sticker.
Just goes to show that spending millions on TV advertising etc. can turn into good sales.
Can't see the average parent going to the PC shop with the ATI and AMD names stuck in their minds from seeing stickers on PC cases.
May 28, 2007 6:19:43 PM

Quote:
ATi actually had the superior product in comparison to the Geforce 6 and 7 series, but they've never been as good at marketing their product as Nvidia is.

Yup, totally correct. And being late doesn't help much either. In fact, I'd almost dare to say you'd be better off releasing a slightly inferior product that's 4-6 months ahead of the competition, because it's the best at the time, and that's what people buy on. Most people don't care if there will be a card that outperforms it by 10% five months after the competitor's release.
May 28, 2007 6:19:58 PM

I dunno though, ATI got their big red badge at the start of HL2, didn't they? I was thinking about all the HL2 fans who would have been impressed by that, but then I remembered all the CS nerds too :o 

j/k though, I generally agree about the marketing, especially Intel. Somehow they've taken it so far that other people stop right in the middle of their own adverts to advertise for Intel!
May 28, 2007 6:20:55 PM

Quote:
I seem to remember the Geforce 6 series being far and away better than the initial lineup of ATI cards. I remember the 6800GT wiped the floor with even the best ATI card in Doom 3, for example, and if you wanted to get good performance out of Doom 3 and Half-Life 2, which were the big games at the time, ATI seemed to be suggesting you should use different drivers for the different games, which, having just spent ages trying to install drivers for an X1950, might have been even more awkward than it sounds. And I think the 6600GT was also a lot better than anything ATI had competing with it for ages.

Of course, this was before ATI flooded the market with so many different models, with different and seemingly random combinations of letters, that I lost track. I believe they carried on developing and releasing that series of cards longer than Nvidia did with the 6 series, so they were probably better in the end (by which I mean essentially when they were no longer the latest generation), but certainly not to begin with, when I was researching what to buy (about 6 months after the GF6 launched).

And from what I have read I would agree that the Geforce 7900s were probably weaker than the X1900, but I seem to remember the 7800 pretty much got away without much competition for a long time.
The Geforce 6 series did outperform the X800 series in Doom 3 by a decent margin, but the X800 series took Half-Life 2, Far Cry, and just about every other DirectX 9 game.
May 28, 2007 6:26:52 PM

I didn't like how Far Cry performed on my 6800GT, actually, but I tried it again after the 4th patch had been released and it was a lot better. I think by that time it had made more use of the technologies that the ATI cards didn't have. Just an aside though, really, since I can easily believe the X800s were better in Far Cry for a good while. So it would be unfair for me to count the performance boost, which I think occurred when those series of cards were old anyway.

For me there were really 2 games that needed to be looked at when deciding. Nvidia was streets ahead in Doom 3 and a little behind in HL2. Wanting to play both, and the graphics being more important in Doom 3 since HL2 was the better game, I went Nvidia. The extra technologies were mostly just a small bonus.
May 28, 2007 6:41:49 PM

You are just running in circles around the same point about previous generations; why don't you discuss the point he was trying to make about the current generation? I don't know enough about this myself, so I don't have anything to add.
May 28, 2007 6:49:34 PM

Well, while I don't see ATI going down as fast as the opening poster claims, I do find myself discouraged about ATI. I currently work as a hardware analyst and reviewer here in my city. I work for 4 OEM resellers; well, not exactly work, but they give me the green light on whatever hardware I want to test, and I review it and then give them the rundown on price/performance, so that they can sell the cheapest, faster, more reliable hardware as mainstream and also sell the best, fastest hardware at a better price (and avoid selling a 9200 or MX440 at 80 bucks and ripping people off).

So yeah, I'm disappointed in the R600 as well; having followed its development, knowing they had to re-code things in order for them to work and resort to old technologies, it's noticeable.

What is worse, perhaps, is that since last year OEMs here in Mexico stopped using ATI products as mainstream; Nvidia products are cheaper than ATI down here, by a 10-20% margin, which makes OEMs buy Nvidia products even more...

I do hope ATI can pull itself together and release something soon. I know they won't go down and out of business, but it just feels like they had a whole 2 years and came up with a faster version of their last-gen cards.
May 28, 2007 8:00:13 PM

Quote:
I believe the X850XT PE was released a long time after the 6800 Ultra, and the 6800U was better than the cards it launched alongside: the X800XT and the more limited (edition-wise) X800XT PE, which should perhaps be compared with the 6800U Extreme, if any of those were ever actually made in the end. And with a real comparison like this, even the 6800GT beat the best ATI card in some games (Doom 3, for example).

Comparing the 6800U to an X850XT PE is like comparing a 7800 to an X1900.

Hey Steve, time has nothing to do with who the competitors are!
As you can see, the 8800GTS's competitor came 6 months after it, & the 8800GTX's competitor hasn't been born yet! (HD2900XTX or HD2950XTX, I guess)
So I guess comparing the X850 with the 6800 is not like comparing the 7800GT to the X1900. :roll:
And Doom 3 was designed for the 6800 series, so there is nothing special about it outperforming the X800/X850 series there, because the X800/X850 series owned the 6800 series in other titles such as Far Cry, Half-Life 2, Need for Speed Underground 2, etc.
Quote:
and you joined here just so you could post crap, we do not need your kind on these forums. take your trash elsewhere.

Agree with SS. :trophy:
Spartas needs to go to ROB's Forums!
Quote:
...However, what I was saying about how things are now (in the present) and how things could turn out in the future if ATI keeps slipping is 100% correct.

We don't need some "The-Future" thread in these forums. :roll:
BTW, there is no freedom of speech here!!!!!!!!!! :p 
May 28, 2007 8:14:19 PM

Well, sorry, I didn't see the similarities in the names there. Faceless Rebel... Spartas... easily mixed up. Don't know what I was thinking there :p 
May 28, 2007 8:18:14 PM

"ATI was already the distant 2nd place finisher for 3 generations running"...? These aren't cold, hard facts, it's just nVidiocy.
May 28, 2007 8:30:07 PM

Quote:
and you joined here just so you could post crap, we do not need your kind on these forums. take your trash elsewhere.



I joined here to post my view. Freedom of speech, anyone? :roll: If you can't handle the obvious facts and it brings you to such lows as telling me to PLEASE STFU AND DIE!!!!!! then I suggest you stop reading.


Some of what I said about the past between Nvidia and ATI may not be 100% accurate, as others have pointed out. However, what I was saying about how things are now (in the present) and how things could turn out in the future if ATI keeps slipping is 100% correct.

Only a die-hard fanboy would say otherwise.

After lurking for so long, this thread finally brings out a post from me.

You titled the thread "the cold hard facts" and went on to post a lot of disinformation. Now you're trying to defend yourself by saying it's your "opinion". Freedom of speech does not apply to slander. Did you honestly think that on a board for hardware enthusiasts you wouldn't get called out on that, especially when you've provided nothing to back yourself up?
May 28, 2007 8:32:58 PM

Yawn. For a post that claims to have the hard facts, there's a whole lot of propaganda and spin going on there, lad.

If your flagrant assumptions were true, Nvidia would never have recovered from the dismal FX series with the successful 6x00 series. I could point out more inconsistencies, but why bother? From what I can see that was the crux of your argument, and I just put it in the crapper. Besides, people seem to have you pegged pretty accurately...
May 28, 2007 8:34:51 PM

You're confusing the R600, which is the entire series based on this core, with the HD2900XTX, which was going to be the top card for this series. The HD2900XTX was meant to compete with the 8800GTX, while the currently released HD2900XT (note the missing X) is competing with the 8800GTS.
The XTX was apparently a failure and might be released at a later date, likely on a 65nm core. Only then will ATI/AMD have released their top card, and not before.
May 28, 2007 8:57:48 PM

I know this is only a really minor part of the OP, but time has everything to do with it. The 6800s had direct competitors at the time, and the Nvidia cards were better. The ATI cards didn't own in HL2; they were just a little better, until HDR came along, and then the 6 series was better in HL2 games too. However, the 6 series was LOTS better in Doom 3. A Geforce 6800GT allowed me to play both games at 1280x1024 with 4xAA/8xAF and be smooth; even the top ATI card wouldn't have allowed me to do this. Later on, when Nvidia was making Geforce 7 cards, ATI released some cards that would beat the Geforce 6 cards, but of course they would; they should have been competing with the Geforce 7 cards, because like I said, the Nvidia cards already had direct competitors.

While the Geforce 6 cards were the latest, they were better than the competition, but then later on ATI made some more cards and called them X850s or whatever. Thing is, by this time the Geforce 6 series was old. This is a big difference; otherwise you might as well compare things with 3dfx cards and say how crappy they were. When the cards were new, Nvidia > ATI, but ATI extended the lifetime of that range of cards, so today most X850s you can find will be better than 6800s. So what? So the X series lasted longer than the 6 series; when they were both current, the 6 series was better. Judging cards when they are relevant is a better measure of which series is better than taking a snapshot a year later, when one side has just beefed up its range because its initial offerings were poor and the other side has moved on already.

If Nvidia had belatedly released a 5980 or something that's, say, slightly better than the best Radeon 9800 was, while ATI was releasing the X800s and Nvidia was about to release the Geforce 6 series, would the Geforce 5s suddenly have been better than the 9000s as a series? Or for that matter, Nvidia could just release an 8600, call it a 7600XTXLSGTO-PE++ or something, and all of a sudden the Geforce 7 series was amazing!
May 28, 2007 9:10:23 PM

Spartas,

Can you produce a document from ATI saying the R600 architecture isn't supposed to compete with the 8800GTX? To this point I have not read anywhere that ATI said it wasn't meant to. The XTX, if it were to be released, would still be R600 and would be meant to compete with the GTX. The R600 XT is the card that is supposed to compete with the GTS, which it does, and at times it competes with the GTX as well.

wes
May 28, 2007 9:13:36 PM

Quote:

Hey Steve, time has nothing to do with who the competitors are!
As you can see, the 8800GTS's competitor came 6 months after it, & the 8800GTX's competitor hasn't been born yet! (HD2900XTX or HD2950XTX, I guess)




I was just waiting on someone to say this :roll:


You know what I find funny? Back around the beginning of this year, up until around April, all we heard from the ATI fanboys was that the R600 was gonna kill the 8800GTX, R600 this and R600 that :roll:

You fanboys can scream and cry all you want about the R600 only being meant to compete with the 8800GTS, but you all know that that is utter and complete bullsh1t.

The R600 was originally meant to compete with Nvidia's best, which happens to be the 8800GTX, not the GTS. Since the release of the 8800GTX, ATI has been trying drastically and desperately to get the R600 up to or beyond what the 8800GTX is capable of (why else do you think ATI delayed the card for so long?) :roll:

Finally, after delay after delay, ATI realized they had to bite the bullet on this one. They underestimated the competition and lost the high-end crown, because no matter what, their R600 was not going to be able to compete with Nvidia's best, despite their best efforts at pushing up clock speeds... etc... delays... etc... whatever they thought would help.

It wasn't until right at the release of the R600 that ATI finally announced it would only be competing with the 8800GTS (their way of having to EAT CROW), after all those months of giving the ATI fanboys false hopes and dreams that the wait would be worth it.

So just remember, FANBOYS: the next time you scream about the R600 only being meant to compete with the 8800GTS, make sure you don't forget those same screams and cries over the past few months about how the R600 was going to kill the 8800GTX and how all of us 8800GTX owners were so dumb for not waiting :roll:

Maybe after the 8800GTX has already been out for a year, ATI will finally have a card that can match it :lol: 

HAVE FUN CHOKING ON YOUR CROW, FANBOYS, BECAUSE I KNOW I'M HAVING FUN WATCHING IT :lol: 

Repeat after me, 'cause you still don't get it into your thick head:
R600 = A FULL LINE OF GPUs, just like G80 covers all of Nvidia's series: 8800GTS, GTX, Ultra, & similar...

*nevermind*
After reading his entire bunch of posts... I'd say this guy is ROBSLI again; his trollish pro-Nvidia behavior is very easy to spot.
May 28, 2007 9:23:26 PM

Take your CPU mindset and go home. If you knew anything about GPUs you'd shut up. GPUs aren't like CPUs, as we will soon see in the midrange; it's not a clock-for-clock scenario here like with a CPU. I'm not sure whether to call you an nVidiot or an Intellidiot. Someone has some serious hate going on. People here aren't lemmings. As Cleeve said, your crap belongs in the crapper. When you can post anything that's backed up thoroughly with facts, then you'll get a discussion; otherwise, you can keep being a fan, keep spinning out hot air, and keep making noise.
May 28, 2007 9:27:07 PM

It's like we lose one ROBSLI and whatnot, and another just sprouts up to take his place... :roll:
May 28, 2007 9:30:30 PM

Looks like there are a bunch of grumpy old men in here. That's what happens when you become fanboys: not smart enough to do a perf/buck buy. And even if you are not, or think you are not, you can always ignore them instead of turning into sub-human apes.
May 28, 2007 9:30:49 PM

Quote:
The problem with competitive technology industries like GPUs is that when you fall behind consistently, you tend to keep falling behind and may even fall further behind. It's just not that easy to suddenly reverse course and switch gears; if you have a bad design, you might be stuck with it for a while. Just look at how long Intel was stuck with P4/Netburst; it took them years to follow up with Core 2. Intel was the dominant market leader and had all the OEMs to themselves, so they could afford to be screwed for years on end and still have the time to develop a competitive design.

DAAMIT does not have that luxury. ATI was already the distant 2nd place finisher for 3 generations running (6800, 7800, 8800), mainly because they were so late with a competitive part all 3 generations. Nvidia doesn't have any interest in slowing down and waiting for ATI to catch up, so they are happily developing G90, which is rumored to have 192 SPs and be in the 700-800MHz range in clock speed at 65nm. Nvidia might even stick with GDDR3 if they can get it up to 1200-1300 (2400-2600 DDR effective), as the new architecture that started with G80 is not particularly memory-bandwidth intensive.

ATI is now in a very deep hole. Their part is highly inefficient and provides very poor performance per watt, especially when you consider that G80 is still at the 90nm process node and still runs cooler and uses less power even though it is a bigger die than R600. Nvidia has delivered a part with a very innovative design that runs the SPs at a much higher clock than the core, resulting in needing fewer SPs, and therefore their transition to 65nm will result in a smaller die than ATI's, improving yields further. Nvidia's part has a very efficient render back end and ROP/TMU design that gives essentially free trilinear and anisotropic filtering, which ATI's part is struggling with, and at higher IQ. ATI also needs to fix whatever problems they are having with their ROP/TMU design to bring MSAA back on board where it should be, instead of sapping performance by doing it in shaders.

Overall Nvidia is in a very good position, as they pretty much hit every performance and IQ goal they had with the G80 design. ATI has a lot of work to do at the drawing board, which will result in further delays and put them further behind. At this point they might not have the R6x0 refresh part out before Nvidia has an entire new generation out in G90, and being a whole generation behind could be a deathblow for DAAMIT as they are also struggling to get Barcelona out and have it be competitive with Penryn.

The big advantage that R600 theoretically had over G80 was the enormous shader transistor budget. G90 is rumored to have a huge shader power increase over G80, if the 192 SP thing is to be believed, which will negate the only big theoretical advantage R600 had outside of the completely unnecessary and probably expensive excess memory bandwidth. R600 is so much weaker than G80 in basic rendering design, such as geometry setup, actual fillrate (even though the 512-bit memory bus and 1024-bit ring bus give it a massive theoretical advantage), texture filtering, and AA, that it's almost unbelievable DAAMIT would have released it in the state it got released in, except that we know DAAMIT was already 6 months late and had to get something, anything, out the door; plus, financially, DAAMIT is staring into the abyss.

Nvidia designed a powerful, efficient, balanced architecture with G80 and they can easily ride it for the next 2 years just with die shrinks and slapping more SPs onto it and clocking it into the stratosphere. ATI is in a big hole here, not unlike the big hole AMD is staring at with Barcelona versus Penryn, and they need to do something and it needs to be miraculous if they want to catch up with Nvidia now. Being a whole generation behind is the worst thing imaginable in the technology industry.




Damn, where can I find one of them crystal balls :roll:
May 28, 2007 9:32:35 PM

Just to add to what has already been posted: The R600 is a G80 killer since the HD2900XT is clearly a better-performing card than the 8800GTS, right? The statement was too vague to argue; all the fanboys claimed the R600 would be a GTX killer. Hell, the R600 series might come out (purely speculation) with an HD2900XL, so if that falls behind the 8800GTS and it's priced at, let's say, $300, then it is a failure as well, because the R600 couldn't even kill the GTS, let alone the GTX?

Intel had OEMs, but so did AMD. Another non-fact posted in the OP.

Saying the GTX doesn't run as hot is another non-fact; it's just as hot as the HD2900XT. But again, they are not meant to compete; the HD2900XT competes with the GTS, which it does run hotter than.

And about the whole timing statement: one manufacturer will be later than the competition no matter what. It's not like they talk to each other and agree on what date they will release their products so they can have a same-day release. Be it a week or a few months, as long as they don't miss a generation, there's still going to be competition. Releasing later definitely has its advantages as well as its disadvantages, and vice versa.
May 28, 2007 9:43:55 PM

What I find funny about this whole thing is that after all the BS stating ATI has always been behind nVidia, as he goes through his spin and ilk, the conversation ("his conversation") turns to the GTX. Hmmm. So let's see: I'll bet anyone here that the GTX makes less money for nVidia, as fine a card as it is, than the midrange will make for ATI. The OP has to get out his hatred for a company, that's obvious. Just as obvious is the fact that he sounds like a fool when it comes to sales, and yes, even marketing. Let's all call this thread entertainment, shall we? heheh
May 28, 2007 9:49:34 PM

Pretty much 90% of the posts here can be purely defined as entertainment. :lol: 
May 28, 2007 9:50:18 PM

Quote:
Just to add to what has already been posted: The R600 is a G80 killer since the HD2900XT is clearly a better-performing card than the 8800GTS, right?

I think that depends on which review you read, what they test, etc. The 8800GTS still frequently beats the 2900XT once you turn AA on. Who really buys a high-end card to not turn AA on? So I would say the answer is a definite "maybe, maybe not".
May 28, 2007 9:58:42 PM

Quote:
I think that depends on which review you read, what they test, etc. The 8800GTS still frequently beats the 2900XT once you turn AA on. Who really buys a high-end card to not turn AA on? So I would say the answer is a definite "maybe, maybe not".


I'm referring to the latest review from xbitlabs. It is the most current review, so until more current reviews start unfolding, I will base my assumption on that. But you are correct, it is still a maybe/maybe-not scenario. I'm going to go out on a limb here and predict that when all is said and done, the HD2900XT will sit equidistant between the GTX and GTS on average (since we see that the XT has taken a couple of titles in a small handful of games from the GTX).
May 28, 2007 10:18:49 PM

I own a 2900XT, and the original poster's analysis is well reasoned. I've had ATi cards since the 9700 Pro and I don't see anything that I disagree with here. Obviously these are not facts; this is discussion and analysis, but it does reflect my experience with the 2900 up to this point. It takes horrible performance hits with AA and AF enabled, even in older games.

If you want to really get an idea of how the card performs, look ONLY at the benchmarks that have anti-aliasing engaged. If you look at the raw frames-per-second benchmarks with no anti-aliasing, you will be misled. The 8800 series generally, across the board, allows for more image-quality settings at playable frame rates than the 2900XT.

I used to think that if a card was getting 90FPS in a game then it had "enough power left over for more AA", but that isn't really the way it works. There is, or should be, dedicated hardware onboard for the AA resolve; this is what is known as the "back end". The back end on the 8800 cards is better, and that is why they perform better with AA enabled, and that in my mind means that they perform better, period.
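To spell out what that resolve step actually does: here's a toy Python sketch (made-up names and data layout, nothing like real shader or driver code) of the box-filter averaging a 4xAA resolve boils down to. On cards with a strong back end the ROPs do this in fixed-function hardware; run it on the shader ALUs instead and every AA'd frame costs shader cycles and bandwidth.

def shader_resolve(samples_per_pixel, framebuffer):
    # Box-filter resolve: average the subsamples of each pixel.
    # framebuffer is a list of pixels; each pixel is a list of
    # (r, g, b) subsamples. Purely illustrative.
    resolved = []
    for pixel_samples in framebuffer:
        r = sum(s[0] for s in pixel_samples) / samples_per_pixel
        g = sum(s[1] for s in pixel_samples) / samples_per_pixel
        b = sum(s[2] for s in pixel_samples) / samples_per_pixel
        resolved.append((r, g, b))
    return resolved

# 4xAA on one pixel straddling a red/green edge:
fb = [[(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]]
print(shader_resolve(4, fb))  # [(0.5, 0.5, 0.0)]

Trivial math per pixel, but at 1600x1200 with 4xAA that's nearly 8 million subsample reads and averages every frame, which is why moving it off dedicated hardware shows up so clearly in the AA benchmarks.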
May 28, 2007 10:21:40 PM

Even with AF?
May 28, 2007 10:28:59 PM

Quote:
*nevermind*
After reading his entire bunch of posts... I'd say this guy is ROBSLI again; his trollish pro-Nvidia behavior is very easy to spot.


:lol:  I was just thinking the same thing.
May 28, 2007 10:39:55 PM

Without AA and AF the 2900 kicks a lot of tail, and I mean a lot. And that includes against the GTX and the GTS with no AA or AF as well. This issue is the debacle we all want to see answered, be it drivers or failure. I've read that there is improvement in the 7.4 drivers with no loss of IQ; now whether this will hold true with the 2900 and the new drivers, we have, what, 2 days to wait?
May 29, 2007 8:31:37 AM

Quote:
I think that depends on which review you read, what they test, etc. The 8800GTS still frequently beats the 2900XT once you turn AA on. Who really buys a high-end card to not turn AA on? So I would say the answer is a definite "maybe, maybe not".

Agreed.
The HD2900XT does not clearly outperform the GTS, since the GTS has better AA performance, & I bought the GTS to play ALL of my games with 16xAA and believe me, I can! (at least for now!) (even with my years-old single-core OC'ed 4000+ San Diego)
Quote:
*nevermind*
After reading his entire bunch of posts... I'd say this guy is ROBSLI again; his trollish pro-Nvidia behavior is very easy to spot.

:lol:  I was just thinking the same thing.
Thirded! :lol: 
Quote:
Without AA and AF the 2900 kicks a lot of tail, and I mean a lot. And that includes against the GTX and the GTS with no AA or AF as well. This issue is the debacle we all want to see answered, be it drivers or failure. I've read that there is improvement in the 7.4 drivers with no loss of IQ; now whether this will hold true with the 2900 and the new drivers, we have, what, 2 days to wait?

Come on John! You are not gonna spend lots of cash on your graphics card to play without AA/AF, are you?
Hope the new drivers bring on the true power of the HD2900XT. :wink:
May 29, 2007 8:34:39 AM

My main gripe with the HD 2900XT is that in spite of the fact that it hit the streets a full six months after the 8800GTX, it still feels rushed.

Drivers are really bad. Buggy. Afterimages when closing or minimizing windows. Solid black lines in the margins of Internet Explorer. Some really bizarre texture errors in games.

The Sapphire card came bundled with Futuremark's 3DMark06, but the app doesn't run properly due to a known problem with a .dll file. Easy to fix, but still: it's just cheesy for hardware to ship with software that doesn't work with it.

Performance is actually pretty good. Once I got 3DMark06 to run, I scored 9740 marks running it stock with an Abit AW9D (i975X), E6600 @ 2.4GHz, 2 x 1GB Crucial Ballistix DDR2-800, and a WD 150GB Raptor. With the CPU at 3.2GHz (400FSB x 8), the score increased to 10830 marks. I haven't tried overclocking the GPU yet, but I'm pretty hopeful my single-card setup will top 13,000. I think the overall performance (even with the crappy drivers) is about 90% of my 8800GTX. That's very good considering the HD 2900XT costs $100 less.
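Quick math on that CPU scaling, sketched from the scores above:

stock_score, oc_score = 9740, 10830
cpu_gain = (3.2 - 2.4) / 2.4                          # ~33% higher CPU clock
score_gain = (oc_score - stock_score) / stock_score   # ~11% higher score
print(f"{cpu_gain:.0%} CPU overclock -> {score_gain:.1%} more marks")

A 33% CPU overclock buying about 11% more marks says a decent chunk of the 3DMark06 composite tracks the CPU, which is worth remembering when comparing cards by total score.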

Overall, the card seems like it has a lot of potential, but it's ruined by crappy driver support. And Catalyst Control Center still sucks. No hardware monitoring (voltages, temps). How lame is that?
May 29, 2007 9:20:41 AM

Quote:
Come on John! You are not gonna spend lots of cash on your graphics card to play without AA/AF, are you?
Hope the new drivers bring on the true power of the HD2900XT.
I agree with you. My statement was more about the supposed lack of performance/fps from the 2900. Without AA it's a monster, but as you said, who wants to go without? All these reviews, done in so many ways (showing the 2900's strengths only in some, mostly the weaknesses in others, and an overall picture in but a few), have been confusing to a lot of people. It needs a driver upgrade, just for IQ and to level out the consistency of this card. It will get better, but the question everyone is asking is: how much?
May 29, 2007 9:33:35 AM

Quote:
and you joined here just so you could post crap, we do not need your kind on these forums. take your trash elsewhere.



I joined here to post my view. Freedom of speech, anyone?



Hrmm. Apparently you slept through government class too. Free speech applies to public speaking, not to membership-approved audiences. So long as you do not breach the terms of your membership to these forums, you are as free as the rest to spread your ignorance.

You obviously have a pair. However, don't expect too many members to give you a "good game" or buy you lunch later on. Not too many of these members are keen on stroking each other for voicing their opinions.
May 29, 2007 10:12:03 AM

Can anyone remember back in the days of the AMD64 vs the P4? Clearly the AMD64 was better, but did that stop people getting a P4? No. Fact is, the majority of consumers are idiots, and if they are told something is good they will believe it. Yes, Nvidia will probably get a better market share, but don't count out AMD's ATI just yet (if you think they can't recover/catch up, then you haven't been looking at AMD vs Intel's past).

Also, it has yet to be seen which mainstream product is better (and that's where most of the money is made).
May 29, 2007 10:31:36 AM

Clever... perhaps consumerism is dictated by the large group of stupid that makes up the majority of the market.

I would like to point out that although stupid governs the herd, that stupid is instigated through marketing skills that AMD lacks. Let's hope their reputation saves enough face with stupid that they aren't put through financial ruin. :( 

Maybe stupid will choose personality over looks?...

I think AMD can continue playing the game if they're savvy enough to keep Intel from castrating them financially, and to win over stupid at the low end. Maybe that's why they go after the server market first and foremost ;) 
May 29, 2007 1:43:09 PM

Quote:
Without AA and AF the 2900 kicks a lot of tail, and I mean a lot. And that includes against the GTX and the GTS with no AA or AF as well. This issue is the debacle we all want to see answered, be it drivers or failure. I've read that there is improvement in the 7.4 drivers with no loss of IQ; now whether this will hold true with the 2900 and the new drivers, we have, what, 2 days to wait?


What?!? The GTX still kicks 2900XT tail even without AA/AF. Look at the 512-bit interface and 750/1650 clock speeds vs. 384-bit and 575/1800; you would expect more from the 2900XT. Blaming its failures on "drivers" is still a poor excuse for ATI. I also saw that it doesn't have UVD while the 2600/2400s do. What in the world is that? Some oversight problem?
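For what it's worth, the theoretical peak bandwidth behind those numbers works out as below; a quick sketch using the clocks quoted above (commonly cited specs, so treat the figures as approximate).

def peak_bandwidth_gb_s(bus_width_bits, effective_mem_mhz):
    # Theoretical peak = bus width in bytes x effective transfer rate.
    return bus_width_bits / 8 * effective_mem_mhz * 1e6 / 1e9

print(peak_bandwidth_gb_s(512, 1650))  # HD 2900 XT: ~105.6 GB/s
print(peak_bandwidth_gb_s(384, 1800))  # 8800 GTX:   ~86.4 GB/s

Roughly 20% more raw bandwidth on paper for the 2900XT, which is exactly why its losses with AA enabled look so strange.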
May 29, 2007 3:01:21 PM

Quote:
Without AA and AF the 2900 kicks a lot of tail, and I mean a lot. And that includes against the GTX and the GTS with no AA or AF as well. This issue is the debacle we all want to see answered, be it drivers or failure. I've read that there is improvement in the 7.4 drivers with no loss of IQ; now whether this will hold true with the 2900 and the new drivers, we have, what, 2 days to wait?


What?!? The GTX still kicks 2900XT tail even without AA/AF. Look at the 512-bit interface and 750/1650 clock speeds vs. 384-bit and 575/1800; you would expect more from the 2900XT. Blaming its failures on "drivers" is still a poor excuse for ATI. I also saw that it doesn't have UVD while the 2600/2400s do. What in the world is that? Some oversight problem?

You also forgot the fact that the GTX's shaders run at about 2X the speed of ATI's; it's like they're decoupled from the core to run at double speed.

*edit* wrongly wrote Nvidia instead of ATI.
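Putting rough numbers on that (commonly cited specs, MADD-only, ignoring the G80's extra MUL and every real-world efficiency question; a sketch, not gospel):

def madd_gflops(num_sps, shader_clock_mhz):
    # One multiply-add (2 flops) per SP per shader clock.
    return num_sps * shader_clock_mhz * 2 / 1000.0

print(madd_gflops(128, 1350))  # 8800 GTX:   ~345.6 GFLOPS
print(madd_gflops(320, 742))   # HD 2900 XT: ~474.9 GFLOPS

So on paper R600 actually has more ALU throughput; but its 320 "SPs" are really 64 five-wide units, so what you get depends on how well the compiler packs instructions, while G80's scalar SPs at over twice the core clock are far easier to keep busy.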
May 29, 2007 3:50:26 PM

Quote:
Just to add to what has already been posted: The R600 is a G80 killer since the HD2900XT is clearly a better-performing card than the 8800GTS, right? The statement was too vague to argue; all the fanboys claimed the R600 would be a GTX killer. Hell, the R600 series might come out (purely speculation) with an HD2900XL, so if that falls behind the 8800GTS and it's priced at, let's say, $300, then it is a failure as well, because the R600 couldn't even kill the GTS, let alone the GTX?

Intel had OEMs, but so did AMD. Another non-fact posted in the OP.

Saying the GTX doesn't run as hot is another non-fact; it's just as hot as the HD2900XT. But again, they are not meant to compete; the HD2900XT competes with the GTS, which it does run hotter than.

Ok, let me get started with what you said first...
SHOW ME where the x2900xt kicks the GTS 640's ass with AA/AF turned on (like you're really going to buy a $300+ vid card and NOT turn the AA or AF on; come off it). This argument reminds me of my statistics class in college: if you completely ignore data that refutes your claim, you can manipulate your argument and interject (not false) but MISLEADING information based loosely on the same data that would DEMOLISH your own argument.
SHOW ME where the x2900xt is priced competitively with what it is supposed to be competing with (the GTS). :?:

The x2900xt is a g80 killer like my cat is a dog. It doesn't KILL anything; it just barely competes (very poorly with the drivers out now, may I add).
A kill to me is like the 8800GTX/GTS series that came out with NO competition, SIGNIFICANT performance increases, and decent drivers at launch (not Vista, mind you, but XP). In fact, it killed so hard that ATI had to stop, take 2 steps back, re-work their product, delay launch by 6 months, and STILL released an under-performing card with buggy, under-performing drivers. 8O

Oh, and to all you fanboys out there: guess what? Driver fixes happen for Nvidia, too, so your driver performance increases MAY be eaten into by Nvidia's releases.
Finally, my GTS640 with my overclock doesn't get over 67C at full load with STOCK cooling. I'd like to find a 2900xt that can do that. Hell, I bet even the 2600 can't do that.

I'm not a fanboy, just a glutton for punishment. I don't understand why everyone on here can't just see that ATI has made us all wait for a product that isn't worth the money, and that it needed about ANOTHER 6 months before release because it still doesn't perform where it should. I've said this many times: I owned ATIs all the way up to my current card, and the only reason I switched was that I was tired of waiting for ATI to get off the pot, and I got a great deal from Newegg at $349 for my GTS640 back at the beginning of the year. I was really hoping that ATI would have the giant-killer that everyone hyped; instead I was disappointed along with all the rest of you, besides the die-hard ATI fanboys who don't understand what I wrote above about statistics. :roll:
May 29, 2007 4:29:52 PM

Quote:
...Finally, my GTS640 with my overclock doesn't get over 67C at full load with STOCK cooling.

That's awesome! :D  Cause mine easily reaches 80C at full load. (BTW, it's really hot in here and I'm melting as I'm typing this! :evil:  )
I 100% agree with your comment! :trophy: