HD2900 XTX Bench'd - Page 3

April 26, 2007 11:25:36 PM

Quote:

Don't get too happy, Nvidia might get hit with a class action suit because of non-working drivers for way too long... :|
It's just a bunch of angry customers hosting a website and rattling the chains. Nothing more, nothing less.
And I'm not happy about what's happening. AMD used to be a great company, making nice, cheap processors. Right now they aren't even much of a company anymore. All the delays, the wicked decisions of their CEO, this charade in Tunisia and so on... it all adds up, and in the end I really have to question whether they are still professionals running a business or some kids playing with the big boys. I hope that AMD at least gets the midrange 2600 cards to market without further delay and at a reasonable price. And I mean hope in the truest sense of the word, since every statement AMD made in the last few months was about as trustworthy as Baghdad Bob. :?
April 26, 2007 11:42:41 PM

Quote:
I think that part of the problem the guys over at DailyTech are having with the XTX card is that it's the older 80nm OEM version of the card. Remember that DAAMIT did a die shrink of all their new HD cards down to 65nm, and supposedly that's what the delay is all about.

The OEM version is not the final working product, and I seriously doubt that DAAMIT would even consider releasing the XTX card if it's only 2% faster than the XT card, and only in some games.

http://www.dailytech.com/ATI+Radeon+HD+2900+XT+Performa...

Here's the Radeon HD 2900XT doing swell. If the real XTX would please step forward...

I'd say give it time...these pre-release benchmarks are often a bit premature. Driver maturity as well as finalized products will tell all. I'm sure it'll be a close one for the real HD 2900XTX and the 8800Ultra cards.


A die shrink would not/will not increase performance. The only hope for ATI fans is that the drivers suck. But considering how long ATI has had to work on them, it's unlikely that's the issue.

Plain and simple: the card sucks. Call it ATI's "FX 5900" of the cycle.

We may never know. Penryn was supposed to be only a shrink, but the preview by Anand shows it got improved a lot...

Quote:
JayDeeJohn, a couple of things. You speak of onboard sound when, AFAIK, there is no sound chip on the ATI cards, only an audio pass-through chip.

You also mention a PPU, when I thought it was the same as the X1xxx series in that the GPU IS the PPU, running physics instead of graphics.

Where are you getting this info?

Also, to everyone: is it just me, or is that "stream processor" thing completely and utterly rubbish? I thought ATI went a completely different way than Nvidia, and that the phrase "stream processor" was used by Nvidia solely for their 8800 series.

If it is rubbish, then these guys over at DT surely look like complete noobs for not having a clue about a card they are supposedly testing, or ATI somehow managed to make a card using the same tech as Nvidia did.

I think they have no clue, frankly.


Maybe you should try to look up some info about the Realtek thingie...
Most websites say it is indeed a SOUND CARD, just not an extremely good one.. just one good enough for HD video sound.
April 26, 2007 11:47:11 PM

Quote:

Here's something for you to consider.

The workstation market sells about 50-100% more units at about 100-200% more revenue per quarter than the entire high-end market ($250+) combined.

Noting that, what do you think of the fact that the supposedly $400 HD2900XT beat the QFX5500?

Right now, as an OEM workstation part, the R600 looks like a monster, with the plain unoptimised XT destroying the $2,000+ Quadro FX5500 by about 50% (27+% in OpenGL 3ds Max, which ATi never did well in).

No FireGL has had that kind of success, even with multiple driver updates and moving/consolidating driver teams from Germany.

And consider that this was the 512MB XT against the 1GB FX5500, not the 1GB, higher-bandwidth XTX, and with non-workstation drivers.

So the question is: what if the $400 XT beats the $4,000 1.5GB version of the Quadro FX5600 (which it looks like it will, going by nV's own tests: http://www.nvidia.com/object/IO_11761.html)? It sounds like ATi is about to take the biggest lunch available out of all this with a single card.

I think people like Cleeve, Crashman and the few others who do some serious 3D tinkering are going to buy the HD2900 because it does everything, even if it may game a little less.

Of course I'm making lemonade out of lemons, but considering you're talking about a GF8900 card, I think I have license for a little Hyperbole. :twisted:

OMG, nV's Quadros are DOOMED !!!

Personally I'll just wait for the official benchies though, eh!?! :wink:


Okay, next time I will remember to set the sarcasm flag to 1 so your post interpreter won't screw up. :wink:

Generally I'm happy to see that AMD does make some money at least somewhere. While I believe you and your argument, you forget the effect the high-end GPUs have on the mid- and low-end consumer market. I can't state numbers or link statistics, but it sounds reasonable to say that there are people who buy ATI or Nvidia mid- or low-range cards based on the simple fact that one of the companies has the fastest card in the high-end segment. And as I said in other posts, the real money is in the mid and low range, and ATI used to be quite good at that. The prices for the 1950 Pro are amazing. Given the horrible 8600, they have a good chance to keep that market with ease - well, unless the magic delay rears its ugly head again, or they screw up the drivers, or they price it totally wrong like the 1650XT, or they... well, delay it.
That's not the point of my exaggerated and sarcastic statement though.

The benchmark seen on DailyTech is exactly what AMD deserved. After multiple delays, lots of rumors and no official info, they started to tie up every journalist that came within 100km of their mystical R600 chip, or threatened them at gunpoint to sign an NDA. I was at CeBIT and asked about their new GPU - I consider myself lucky that I wasn't taken behind hall 30 and shot by the Gestapo.
What's wrong with AMD? They used to be different, outspoken about what they do and where the company is headed. Now they act like the big blue one they used to attack for its behaviour.
April 26, 2007 11:49:24 PM

Quote:

A die shrink would not/will not increase performance.


Why not? Can you explain this?
If it's a leaky shrink then I would agree, but the 65nm node is not a half-node optical shrink like the 80nm was, so a shrink can bring (but doesn't guarantee) better power consumption, less leakage, and better speeds. I just don't understand how you can say emphatically that it will not increase performance, especially if the current 80nm part suffers from leakage problems.
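
To put rough numbers on that: dynamic switching power scales roughly as capacitance x voltage^2 x frequency, so a shrink that allows even a small voltage drop buys either lower power at the same clock or extra clock headroom at the same power. A minimal sketch with invented values (not real 80nm/65nm R600 figures):

    # Rough dynamic-power scaling, P ~ C * V^2 * f.
    # All values below are invented for illustration, not measured R600 numbers.
    def dyn_power(cap_rel, volts, freq_mhz):
        return cap_rel * volts ** 2 * freq_mhz

    p_80nm = dyn_power(cap_rel=1.00, volts=1.20, freq_mhz=750)  # hypothetical 80nm part
    p_65nm = dyn_power(cap_rel=0.80, volts=1.10, freq_mhz=850)  # hypothetical 65nm shrink

    print(f"65nm vs 80nm relative power: {p_65nm / p_80nm:.2f}x")
    # ~0.76x here: about 13% more clock for roughly three quarters of the dynamic power.
    # Possible, but not guaranteed, and leakage (ignored above) can eat the gain.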

Quote:
The only hope for ATI fans is that the drivers suck. But considering how long ATI has had to work on them, it's unlikely that's the issue.


Well, considering nV's history, would you have expected their drivers to suck as badly as they did for as long as they did, considering how long they had their part and how long they had to optimize? I don't doubt that there's still room to optimize any pre-launch drivers from either company. Most companies need the early adopters to sort out their drivers, since their small beta test teams just can't do enough of the work on their own; only once the cards have been in the wild for a while do they start making significant strides in the performance of the cards. Can you name a single card that didn't get at least one huge performance boost since its launch? I remember every series getting at least one, and most often with ATi the early driver improvements affect internal memory management. I don't doubt that we're a long way from even 90% performance. I bet nVidia itself is only now nearing that target, if that.

Quote:
Plain and simple: the card sucks. Call it ATI's "FX 5900" of the cycle.


I wouldn't say that until we see nV's DX10 performance. So far the geometry power of the G80 looks really weak (supported by those DT SPECviewperf results). The G80 looks to be a great DX9 card, like the FXs were solid DX8 cards, but it's not looking good for its DX10 performance, just like the FX's DX9 performance. Most people who bring up the FX reference in regard to the HD2900 should remember what the problem was with the FX series: bad drivers and bad future performance. That's more similar to the G80, it seems, than to the HD2900 IMO.

To me the HD2900, if anything, looks like the X1800: weaker than its competitor at launch, but with more features and a little more longevity.

Of course as usual, only time will tell what the whole story is.
April 27, 2007 12:00:18 AM

Quote:
I wouldn't say that until we see nV's DX10 performance.


I can agree with that, even though DX10 doesn't matter right now. Before DX10 games make up more than a minority of the available titles, a new GPU generation will probably be available. While we will see a few DX10 titles, 2008 will be the year of DX10, not 2007. Just look at how long games took to switch from DX7 to DX8 or from DX8 to DX9. A lot of games still use DX8 render paths or at least support them. And I'm not talking about simply using the new version but about using its added features. What I'm trying to say is that the importance of DX10 is simply overrated.

Quote:

So far the geometry power of the G80 looks really weak (supported by those DT SPECviewperf results). The G80 looks to be a great DX9 card, like the FXs were solid DX8 cards, but it's not looking good for its DX10 performance, just like the FX's DX9 performance.

I read that too in some interview, I think. I can't remember where, but I tend to believe it.

Quote:

To me the HD2900, if anything, looks like the X1800: weaker than its competitor at launch, but with more features and a little more longevity.

That's what I thought too. The X1800 wasn't bad but was overshadowed by the competition at first and then by the X1900 follow-up model.
April 27, 2007 12:12:38 AM

Quote:
That's what I thought too. The X1800 wasn't bad but was overshadowed by the competition at first and then by the X1900 follow-up model.

Many people were still pretty happy with their DirectX 9 performance on Radeon 9700s, and not everyone changed instantly to the 9800s!
April 27, 2007 12:14:14 AM

Quote:

While I believe you and your argument, you forget the effect the high-end GPUs have on the mid- and low-end consumer market. I can't state numbers or link statistics, but it sounds reasonable to say that there are people who buy ATI or Nvidia mid- or low-range cards based on the simple fact that one of the companies has the fastest card in the high-end segment.


True, I'd call that the BestBuy/CompUSA effect, and IMO that's what's currently keeping GF8600 prices high for the next week or so.

Quote:
And as I said in other posts, the real money is in the mid and low range, and ATI used to be quite good at that. The prices for the 1950 Pro are amazing. Given the horrible 8600, they have a good chance to keep that market with ease - well, unless the magic delay rears its ugly head again, or they screw up the drivers, or they price it totally wrong like the 1650XT, or they... well, delay it.


I agree, although a delay would be of minimal impact if the performance is worthy. Even a June delay puts the GF8600s at a 5-6 week advantage, which isn't huge, and really, if the performance is 10-20% better than the GTS at a lower price, then the lower cost of the 65nm part should make it very profitable (one of the nice things about the X1950Pro and X1950GT is that they are dedicated, small-transistor-count 80nm parts, not crippled higher-transistor-count 90nm parts). So the mid-range battle hasn't even started yet, IMO; the GF8600 doesn't even start becoming compelling until the FSX, Supreme Commander and COH patches come out that add DX10. That's when the mid-range market becomes vital, since the cheap GF7900GS and X1950s aren't going to be impressive enough and the GF8600 and HD2600 DX10 checkboxes finally have some worth. If AMD delays more than a week or two beyond that, then they give up a huge buying period IMO; that's when the delay matters. Until then, IMO, it's just inconvenient (and hey, I'm in that boat, only worse: there are NO mobility parts from either of them yet).

Quote:
The benchmark seen on DailyTech is exactly what AMD deserved.


Why would they deserve a benchmark that wasn't accurate? That would almost be like fudging the benchmark because they did something we like. The consumer is hurt either way. I understand your frustration (see above); however, I don't think anyone deserves anything other than accurate, fair and unbiased reporting. If that's what DT attempted to do, then fine, but considering the words you read in this thread questioning the validity of the benches, and considering their omission of many of the tests from the previous XT benchies, I question the methodology of something that is admittedly a 'blog' and not a review, but which holds as much responsibility if promoted on the site's front page. I still don't get the 'deserve' part if it's on purpose. If you're saying that ATi deserves to have bad info leak since they tie up all the good info, then maybe there's a point there, but do we deserve that bad info to begin with?

Quote:
What's wrong with AMD? They used to be different, outspoken about what they do and where the company is headed.


If you read the InQ article about their experiences in Tunisia, they report the exact opposite: that the once-lovable ATi has been co-opted by AMD;
http://www.theinquirer.net/default.aspx?article=39176
So I guess it all depends on your past relationship with each as to whether this new company, worse than either was before (if you're both right), is the fault of one or the other, or simply an organic mash-up of both cultures that produced a paranoid one.
April 27, 2007 12:18:17 AM

Quote:

Many people were still pretty happy with their DirectX 9 performance on Radeon 9700s, and not everyone changed instantly to the 9800s!


This is quite subjective, since I base it on my personal experience and what I've seen here and there, but to me it looks like Nvidia develops its cards with the current and upcoming game titles in mind and adds future-proofing features only as a gimmick to be printed on the box to improve sales. ATI used to do quite the opposite and developed its cards with future titles in mind. I wouldn't be surprised if it's the same with the current generation of cards. Heck, I wouldn't even be surprised if the X2800/2900 still performs quite well in DX11 titles two or three years from now.
April 27, 2007 12:48:54 AM

Quote:
True, I'd call that the BestBuy/CompUSA effect, and IMO that's what's currently keeping GF8600 prices high for the next week or so.

I've already seen people order them. The 8600 GTs cost about 175€ while the far better 1950XT is available for 180€. I can't see the logic behind a buy like that unless future-proofing or brand loyalty comes into play - well, or, as you called it, the BestBuy effect.

Quote:

I agree, although a delay would be of minimal impact if the performance is worthy. Even a June delay puts the GF8600s at a 5-6 week advantage, which isn't huge, and really, if the performance is 10-20% better than the GTS at a lower price, then the lower cost of the 65nm part should make it very profitable (one of the nice things about the X1950Pro and X1950GT is that they are dedicated, small-transistor-count 80nm parts, not crippled higher-transistor-count 90nm parts). So the mid-range battle hasn't even started yet, IMO; the GF8600 doesn't even start becoming compelling until the FSX, Supreme Commander and COH patches come out that add DX10. That's when the mid-range market becomes vital, since the cheap GF7900GS and X1950s aren't going to be impressive enough and the GF8600 and HD2600 DX10 checkboxes finally have some worth. If AMD delays more than a week or two beyond that, then they give up a huge buying period IMO; that's when the delay matters. Until then, IMO, it's just inconvenient (and hey, I'm in that boat, only worse: there are NO mobility parts from either of them yet).

Indeed. I hope AMD manages to get the mid-range out without delay. If it's really manufactured at 65nm they finally have an advantage. I don't really see how they can screw this up given the bad price/performance of the 8600 series. The only thing that could happen is that their new X2600 has to compete with their own cheap 1950XT/Pro.
I'm not too sure that a small delay isn't a problem though. AMD has had way too many delays. It's tarnishing their image. It's no excuse that AMD and even ATI have both always been late. There are companies that can deliver on time, and in the OEM market that is something customers appreciate.

Quote:

Why would they deserve a benchmark that wasn't accurate? That would almost be like fudging the benchmark because they did something we like. The consumer is hurt either way. I understand your frustration (see above); however, I don't think anyone deserves anything other than accurate, fair and unbiased reporting. If that's what DT attempted to do, then fine, but considering the words you read in this thread questioning the validity of the benches, and considering their omission of many of the tests from the previous XT benchies, I question the methodology of something that is admittedly a 'blog' and not a review, but which holds as much responsibility if promoted on the site's front page. I still don't get the 'deserve' part if it's on purpose. If you're saying that ATi deserves to have bad info leak since they tie up all the good info, then maybe there's a point there, but do we deserve that bad info to begin with?

I don't consider the DT benchmark to be exact - heck, I don't even consider it a guideline for what to expect. They used the official OEM driver, a driver that is probably made with maximum stability in mind and is the first for a brand-new product. In a few weeks new drivers will improve performance, and thorough testing and benchmarking will reveal the abilities the new chip has or doesn't have.
What I mean by "they deserve this" is that it could've been prevented easily. If AMD had released a benchmark themselves it would've been way better. People would've screamed foul play and everything, but all those benchmarking sites would've had a rough guideline for what to expect. Now everyone is benching and some are getting bad results. Since they have no reference to look at, they can only repeat their benchmarks and maybe try to tweak here and there. If AMD had released a benchmark score showing the 8800GTS getting 78 frames in Oblivion and the 2900 getting 120, DT would've questioned its own benchmark. Now, thanks to AMD's rather restrictive, or as you call it paranoid, information policy, they are getting what they deserve.
We, as customers or, in our special case, enthusiasts, do deserve better, sure, but I won't cut AMD any slack for its own mistakes, and I really believe that DT got those numbers. Maybe something is wrong, maybe something isn't. I won't pay for AMD's mistake, nor will any other enthusiast with two eyes and a working brain. I do hope it will damage their sales enough to make them reconsider and change their way of handling information.
April 27, 2007 1:28:49 AM

OK, I understand what you're saying. The only problem is that you know neither AMD nor ATi would ever publish bad benchies, so any that they do release, just like the official nV GF7800 and ATi X1800 ones, would be met with rolling eyes and immediately discounted. Should DT's numbers come out, no one would question DT because of ATi/AMD's benchies; they would do the opposite and question ATi/AMD's benchies because of DT's. It's a lose-lose situation for them: they don't get respectability by launching benchies for a part that they say is ready but that is nowhere to be found, and then you get the FUD charges (heck, the reason they launched the old numbers was so people would stop buying the other guy's parts just before the launch).

I've got some of those old 'leaked benchies' somewhere, you may remember them, but you know we all rolled our eyes at them.

To me the annoying part is that DT's benchmarks aren't even internally consistent, and once again Anand only shows part of the story, when any mediocre enthusiast is left with basic questions that expose something bigger. Where are the workstation benchmarks this time? Why no 3DMarks to compare the scaling from XT to XTX? Why use an overclocked GTX? The last one alone is just a head shaker.

While this isn't as bad as those early leaks that only give 3DMarks (as if a final score matters half as much as the make-up of the individual results within it), I still find it only slightly more useful.

I agree with you that AMD brought this on themselves somewhat, but I won't condone the mediocre methods of Anand's crew, which notoriously omits tests within a review that would show a bigger picture that doesn't fit their conclusions. If this were anyone else I might even cut them more slack, but this is not new for DT/Anand, and I thought they knew better by now.

If you had the card and you were benching, think about the basic tests you'd run in a 24-hour period between getting it and posting your results. Would that limited selection really be all you posted? I know I wouldn't. I'd have at least one low-res to ultra-high-res bench for both cards to show scaling (something consistent and stressful, even like rthdribl), ShaderMark/RightMark, then popular games, and then, since they already had them in the other benchmark, SPECviewperf and 3DMark. I'd also see what happens when I run nV's DX10 demos, to see if they're agnostic code or if ATi's drivers can't run them.

Anywhoo, just compared to what I'd do with them, these two benchies by DT are pretty weak on the information side of things and seem more concerned with getting the info out fast than with getting out quality info.
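
For what it's worth, the scaling check described above doesn't need anything fancy. A throwaway sketch of the idea, with completely made-up FPS numbers (not DT's results or real HD2900/8800 figures):

    # Illustrative only: invented FPS placeholders, not actual benchmark data.
    results = {
        "card_a": {"1024x768": 140.0, "1600x1200": 95.0, "2048x1536": 60.0},
        "card_b": {"1024x768": 150.0, "1600x1200": 90.0, "2048x1536": 48.0},
    }

    baseline = "1024x768"
    for card, fps in results.items():
        # Fraction of low-res performance retained as resolution rises.
        retained = {res: round(fps[res] / fps[baseline], 2) for res in fps}
        print(card, retained)

    # If one card falls off a cliff at high resolution while the other doesn't,
    # that's exactly the internal consistency a quick low-to-high-res pass exposes.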
April 27, 2007 1:35:43 AM

I'm sure someone will start a new thread under the wrong section for this but it doesn't add enough new info to not just get added here.

A little more from DailyTech re. their R600 benches.

DailyTech Digest: Making Sense of "R600"
April 27, 2007 1:43:08 AM

It almost seems like that's a footnote to say they're done testing. :?
April 27, 2007 1:55:29 AM

It is probably as much an attempt to answer the questions they've been bombarded with. Their teaser benches probably generated some "interesting" feedback.
April 27, 2007 2:03:56 AM

Quote:
OK, I understand what you're saying. The only problem is that you know neither AMD nor ATi would ever publish bad benchies, so any that they do release, just like the official nV GF7800 and ATi X1800 ones, would be met with rolling eyes and immediately discounted. Should DT's numbers come out, no one would question DT because of ATi/AMD's benchies; they would do the opposite and question ATi/AMD's benchies because of DT's. It's a lose-lose situation for them: they don't get respectability by launching benchies for a part that they say is ready but that is nowhere to be found, and then you get the FUD charges (heck, the reason they launched the old numbers was so people would stop buying the other guy's parts just before the launch).

I know and fully understand that. A benchmark by AMD would've been worthless for the ordinary audience. As you said, they would never make their product look bad. But when you make a benchmark and it differs greatly from what the manufacturer released, it should make you think. Sure, one can think "they tweaked it", but an objective person who takes even a slightly scientific approach to the problem will have to question not only the official benchmarks but the self-made ones too. DT failed at that - but that wasn't a surprise.

Quote:

To me the annoying part is that DT's benchmarks aren't even internally consistent, and once again Anand only shows part of the story, when any mediocre enthusiast is left with basic questions that expose something bigger. Where are the workstation benchmarks this time? Why no 3DMarks to compare the scaling from XT to XTX? Why use an overclocked GTX? The last one alone is just a head shaker.

I don't consider the workstation benchmarks really important, but they shouldn't be missing from a thorough and objective review of the card. I can handle a few shortcomings here and there, but using the GTS OC as a reference wasn't just bad, and it wasn't a mistake or a glitch - it makes the whole review pointless.
They have to use a reference to show the strengths and weaknesses of a new card, and the options for picking a reference are quite limited. Taking a card with a comparable price isn't a good choice, yet sadly it's a common mistake. Prices vary, and what would be a good reference in the USA would be a catastrophe in Portugal or Japan. That's especially sad as DT, as an internet site, caters to an international audience.
I could've tolerated it (even though just barely) if they had taken the Nvidia reference GTS or a card built after the nV reference model. Yet they take some XXX Pro Plus OC special edition that isn't even available in half the western world. To claim that card A is faster or slower than a specially tweaked, hand-manufactured, water-cooled special edition (blessed by the Dalai Lama) that's only available in some backdoor shop in Madagascar is a worthless statement. A good and scientific approach would've been to use the fastest standard card currently available AND the most common mainstream card sold at this time.
After I read about the OC card I just glanced at the benchmark results. Maybe they are right, maybe they are not - I could just as well ask someone to throw chicken bones and tell me.

Quote:

While this isn't as bad as those early leaks that only give 3DMarks (as if a final score matters half as much as the make-up of the individual results within it), I still find it only slightly more useful.

I've heard there is a mathematical function somewhere on the net that can be used to determine girth and length from a user's 3DMark score alone... 8)

Quote:

I agree with you that AMD brought this on themselves somewhat, but I won't condone the mediocre methods of Anand's crew, which notoriously omits tests within a review that would show a bigger picture that doesn't fit their conclusions. If this were anyone else I might even cut them more slack, but this is not new for DT/Anand, and I thought they knew better by now.

That's why most people read more than a single review. Actually it's quite sad that there is not a single site where a reader can be sure that testing is thorough and objective. I have to admit that they didn't have much time to go into great detail; yet that's the field they work in, and their results don't look very professional.

Quote:

If you had the card and you were benching, think about the basic tests you'd run in a 24-hour period between getting it and posting your results. Would that limited selection really be all you posted? I know I wouldn't. I'd have at least one low-res to ultra-high-res bench for both cards to show scaling (something consistent and stressful, even like rthdribl), ShaderMark/RightMark, then popular games, and then, since they already had them in the other benchmark, SPECviewperf and 3DMark. I'd also see what happens when I run nV's DX10 demos, to see if they're agnostic code or if ATi's drivers can't run them.

Anywhoo, just compared to what I'd do with them, these two benchies by DT are pretty weak on the information side of things and seem more concerned with getting the info out fast than with getting out quality info.


Indeed. I've got the feeling they are presenting a conclusion rather than objective data. Maybe it would've been a better preliminary article if they had thrown in some of the rough data they collected during benching and not only the end results.
Anonymous
April 27, 2007 2:24:55 AM

Kiss ass. That's why I always buy video cards from Nvidia, CPUs from Intel, and sound cards from Creative.
April 27, 2007 2:32:56 AM

Quote:
Kiss ass. That's why I always buy video cards from Nvidia, CPUs from Intel, and sound cards from Creative.


I hope you're seriously joking?
Creative is a total joke.. they don't even support EAX and their programs don't work on Vista for the older cards; in short.. all Audigy cards are NOW UNSUPPORTED.. >_<

Quote:
I'm sure someone will start a new thread under the wrong section for this but it doesn't add enough new info to not just get added here.

A little more from DailyTech re. their R600 benches.

DailyTech Digest: Making Sense of "R600"


Looking at these images...
it makes me almost say "Score.. they're using OEM VERSIONS!!"
Not retail cards.. so pfffffffff
April 27, 2007 2:35:14 AM

Quote:
Kiss ass. That's why I always buy video cards from Nvidia, CPUs from Intel, and sound cards from Creative.


I hope you're seriously joking?
Creative is a total joke.. they don't even support EAX and their programs don't work on Vista for the older cards; in short.. all Audigy cards are NOW UNSUPPORTED.. >_<

Which is why I have an X-Fi :wink:
April 27, 2007 2:44:27 AM

Quote:
Kiss ass. That's why I always buy video cards from Nvidia, CPUs from Intel, and sound cards from Creative.
I bet your GeForce FX screamed with your NetBurst for gaming. :lol:
April 27, 2007 2:58:08 AM

Quote:
I bet your GeForce FX screamed with your NetBurst for gaming. :lol:


LOL!
April 27, 2007 3:17:07 AM

2900XT + GDDR4 + increased clocks = 2900XTX

Is it not? XD

Why would the addition of more RAM, higher RAM clock speeds and a higher core clock lead to lower performance? There is something we're missing. My money is on the drivers. When the drivers have been optimized to the point of using those 320 stream processors efficiently, and games come along that require more graphics memory, the HD2900XTX will shine and take NVIDIA for a walk down the green mile :twisted:
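
On paper the spec bump should only help, at least on the memory side. A back-of-the-envelope comparison, assuming the widely reported 512-bit bus with 1.65GHz effective GDDR3 on the XT and the rumoured 2GHz GDDR4 on the XTX (treat the exact clocks as assumptions):

    # Theoretical memory bandwidth in GB/s: bus width (bits) / 8 * effective clock (GHz).
    # Clock figures are reported/rumoured specs, not confirmed numbers.
    def bandwidth_gbs(bus_bits, eff_clock_ghz):
        return bus_bits / 8 * eff_clock_ghz

    xt_bw = bandwidth_gbs(512, 1.65)   # HD 2900 XT, GDDR3 (reported)
    xtx_bw = bandwidth_gbs(512, 2.00)  # HD 2900 XTX, GDDR4 (rumoured)

    print(f"XT : {xt_bw:.1f} GB/s")    # ~105.6 GB/s
    print(f"XTX: {xtx_bw:.1f} GB/s")   # ~128.0 GB/s
    # More raw bandwidth plus a higher core clock shouldn't make a card slower,
    # which is why immature drivers or a pre-production board are the usual suspects.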
April 27, 2007 3:22:18 AM

Cue the 8900Ultra or whatever.

I know this has been said over and over, but whatever ATI has coming, it has to be better than the G80 and its refresh, which will inevitably be released right after the R600 series comes out.

Quote:
While I am required to follow the NDA, the stuff up on DailyTech today is almost worthless. Yes, Anandtech was present in Tunisia (signing Non-disclosure agreements like the Inquirer); why they are posting this stuff is beyond me, because their numbers are off. They must be using only the XP drivers and OS, because the numbers in CF vs the GTX are very much different. So until I can officially comment on the architecture and the performance.. hold all of this as useless until the rest of the world writes about it.


Take all the current benchmarks with a grain of salt.
April 27, 2007 4:01:43 AM

Quote:
2900XT + GDDR4 + increased clocks = 2900XTX

Is it not? XD

Why would the addition of more RAM, higher RAM clock speeds and a higher core clock lead to lower performance? There is something we're missing. My money is on the drivers. When the drivers have been optimized to the point of using those 320 stream processors efficiently, and games come along that require more graphics memory, the HD2900XTX will shine and take NVIDIA for a walk down the green mile :twisted:


The benchmarks DailyTech performed yesterday utilized release candidate drivers. Today's tests used retail drivers ATI released to its board partners.

The less than stellar performance benchmarks are no surprise to board partners and AMD insiders. Two independent ATI board builders told DailyTech that Radeon HD 2900 XTX will not be a part of their initial portfolios. Given the additional memory cost and bulky card footprint, it seems unlikely the XTX will ever see the light of day.


The above from DailyTech's latest article.

Hard stuff from a mainstream site's "news" side, and hard to get away from if it is proved wrong.
April 27, 2007 4:04:50 AM

Wow, there's a lot of rationalizing going on here. I, for one, am sticking with the assessment I've had since the March delay and the absurd Bush-level "strategery" excuse they gave at the time. The R600 is an NV30-level disaster: crazy late, power-guzzling, and underperforming.

The hype was always better than the reality, so they pushed out the hype as long as possible. There was and is no good reason for the NDAs or the total lack of clear-cut statements like "The R600 will own the GTX". They had nothing to fear about cannibalizing their own products since they have nothing atm in the $300+ range anyway. The only reason for the NDA at CeBIT was that they had very little good to show. The only reason for the NDA at Tunis was that they had very little good to show. That's it, just a flop, and too bad since AMD needs the help right now.
April 27, 2007 6:13:21 AM

How long are the NDAs supposed to be kept anyhow? It can't be more than a week.
April 27, 2007 6:22:01 AM

crap...
April 27, 2007 6:26:57 AM

Quote:
While I am required to follow the NDA, the stuff up on DailyTech today is almost worthless. Yes, Anandtech was present in Tunisia (signing Non-disclosure agreements like the Inquirer); why they are posting this stuff is beyond me, because their numbers are off. They must be using only the XP drivers and OS, because the numbers in CF vs the GTX are very much different. So until I can officially comment on the architecture and the performance.. hold all of this as useless until the rest of the world writes about it.


Well, this, as you can see, is my first post, but I have something to say.

Firstly, we should pay attention to what Burn has to say. If he has signed an NDA and is saying that the DT benches are off, then they must be.

Secondly, Nvidia may know how good this card is, or they wouldn't be making the 8800 Ultra. This looks like some kind of strategy from AMD, some way to surprise us and make us go crazy about this card.

We must not forget that all this is increasing the publicity around this card. And we must never forget the old adage: "talk bad, but talk about me".
April 27, 2007 6:39:24 AM

I have been told that Nvidia and AMD/ATI are having problems with digital rights on Vista, I think, or maybe it was Vista and XP. But seriously, what do you guys speculate the problem could be with the XTX? Drivers? GDDR4? DX9? ATI have always had solid products. I really wanna see ATI kick the absolute $hit out of Nvidia.
April 27, 2007 6:47:45 AM

The problem is, we will have to wait till May 2nd to get the real picture. As to the DRM... problems? For DRM? Nawwwww. Seriously, that's the first I've heard of it, but from what I've read, the newest drivers (8.37 I believe) will address this.
April 27, 2007 11:35:53 AM

Quote:
I bet your GeForce FX screamed with your NetBurst for gaming. :lol:

LOL, good stuff.
April 27, 2007 11:49:53 AM

Quote:
It almost seems like that's a footnote to say they're done testing.

Seems to me to be slightly slanted to make up for and defend their testing, with a little bit of (self-)proclaiming themselves to be hacks on top. I think they know the card will test at least somewhat better at launch than they showed, and thus attempted to save face with this blip. Maybe the NDA guys at Anand clued them in a bit. Oh well, next Tuesday we should know a lot more.

PS. Only showing benchies of 100+ fps for the 8800GTX in Oblivion, and such different results in the first test... I'll say the "hacks" nickname fits. I would have been embarrassed to publish those results, especially the day after your bud posted such different (and more meaningful) results in the same game (HD 2900XT) :wink:
April 27, 2007 12:10:59 PM

I pointed out in an earlier post, for everyone who believes these benches to be absolute, that from one day to the next it got better and better - and maybe if they kept going, the 2900 would beat the GTX, heheh.
Anonymous
April 27, 2007 12:25:09 PM

It was screaming less than your onboard video card with an AMD Sempron.
April 27, 2007 12:50:40 PM

Quote:
It was screaming less than your onboard video card with an AMD Sempron.
Are you calling my 8800GTX onboard video? Or do you mean the 9800XT I was using (which would have completely wiped the floor with any GeForce FX card out there) was onboard video? :lol:
April 27, 2007 12:54:07 PM

I can't believe I waited 5 months for these horrible numbers. The horror! The horror!
Anonymous
April 27, 2007 1:10:24 PM

I also have an 8800 GTS. But let's get to serious discussion. After also reading about the 2900XT, which will go up against the 8800GTS, I don't think the 2900XTX will be able to beat the 8800 GTX, because it has the same number of stream processors as the XT, just a higher clock, whereas the 8800 GTX has more processors than the GTS plus a higher clock. Even if it does, it won't be able to beat the 8800 Ultra.
April 27, 2007 1:23:14 PM

In some games the HD2900XT was even with the 8800GTS 640MB, but it easily won a couple of them, like Oblivion. I wouldn't be surprised if the HD 2900XT could match a reference 8800GTX in the important, demanding foliage areas of Oblivion. That would matter to me if the prices stay far apart. (I'd love an 8800GTX but won't pay $550+ for one.) Other games the 8800GTX will probably take easily. But we have to wait and see how valid any of DT's benchies are.
Anonymous
April 27, 2007 1:28:42 PM

How does the 8800GTS 320 do in Oblivion at 1440x900?
April 27, 2007 2:32:02 PM

Well, the most telling point in the article is the last paragraph: ATI's board partners are not surprised at all. My theory is that AMD has known this since the release of the G80 and has been trying to correct the problem (hence the delays). Knowing that it couldn't be done (at least for now), they changed direction and are now going after the HTPC consumer. This sucks for the ATI fanboy, but all in all it is probably the most tenable strategy given the situation they are in.
April 27, 2007 3:19:07 PM

My post was ignored. I would like to know: where can I find out what the card has that the 8800 does not? I read somewhere it can do 24x anti-aliasing, which I really, really, really want, even if it isn't as powerful, since it will make all of the games I already play look so much better. I was blown away by what 8xAA does for 640x480-locked 3D apps. Also, I've been using a GeForce 7800GT for quite some time now, and I'm disappointed by what I've missed out on by going with Nvidia. The graphics quality isn't as nice, and I prefer the features that ATI includes in their drivers.

Maybe I'm being fanboyistic, but I'd really like to support ATI this round, for all these reasons, and I'd like to support the competition.

Plus, I'm hopeful about what ATI can do in DX10, and with all the NDAs going around, I'm sure they've got something that they don't want Nvidia to emulate before they get a chance to establish it. Then again, they might just not want consumers to know how bad the card is, but I doubt that.

Also, what's the deal with the great performance in workstation computing? That was unexpected, but maybe I should have expected it, since they wanted to use the same architecture for workstation cards like the GPGPU versions.
April 27, 2007 4:59:49 PM

There is only one thing we can do as consumers.

Wait until both cards are on the table.

Then buy the best for our money.

Don't look back.


Never ever ever ever ever buy something based on how you would WANT it to perform.

We as consumers must drive the point home with our money, AMD must make a better CPU and ATI must make a superior GPU.


Survival of the fittest, only the BEST must be allowed to survive.
April 27, 2007 11:26:31 PM

Quote:
How does the 8800GTS 320 do in Oblivion at 1440x900?

Probably very, very well. But don't expect that to mean you can crank every in-game setting to max, run texture mods, and run high FSAA & AF all at the same time. You will still have to tweak some things a bit.

I can bring my 320MB GTS to a crawl in Oblivion at 1680x1050. I do find it very playable at that res with some mods and with near-max settings (except self shadows, grass shadows and shadow filtering) and 2xAA/16xAF. For 4xAA at those settings the GTS would need to be OC'ed, as at stock it will drop into the teens in spots.
May 1, 2007 7:04:27 AM

Quote:
(signing Non-disclosure agreements like the Inquirer)

Those guys at the Inquirer seem to be quite proud of NOT signing NDA's :?:
May 1, 2007 1:15:31 PM

So far it looks like this:

1) The entire launch of the R600 series all at once is now blown to pieces.

2) The 2900 XT (finally) releases May 14th instead of Feb/Mar/April.

3) The 2900 XTX is possibly weaker than the 8800 GTX (OCed) and delayed till Q3 '07. Rumor is the GDDR4 RAM is slower than the GDDR3 on the XT.

4) The HD 2600 series is delayed till June because of 65nm ramping issues.

5) The HD 2400 series is delayed till late June because of a bug in their 65nm process, with some 3D applications not working at all.
May 1, 2007 7:15:26 PM

Quote:
Sources?

I hate to say it, but Fudzilla and DailyTech. Then again, I thought Fudzilla was leaning towards AMD with its awesome Barcelona OCing screenshots.
May 1, 2007 9:45:17 PM

Quote:
(signing Non-disclosure agreements like the Inquirer)

Those guys at the Inquirer seem to be quite proud of NOT signing NDA's :?:

Also note that they've had very little R600 news lately.
May 7, 2007 9:46:32 AM

Quote:
How long are the NDAs supposed to be kept anyhow? It can't be more than a week.

Yeah, I wanted to know the same thing. When is the NDA going to be lifted or is that information also under wraps till the NDA is lifted?
May 7, 2007 12:32:23 PM

May the 14th
May 7, 2007 5:19:33 PM

Whew, that's a relief. Just a week to go and then I can upgrade.
May 7, 2007 5:48:59 PM

The XTX may not even appear:

"ATI had originally planned to take on the 8800 GTX with a flagship board known as the Radeon HD 2900 XTX. Performance tweaks for the XTX were to include a core clockspeed boost to around 850MHz and a massive 1GB of GDDR4 memory running at over 2GHz. However, even with these enhancements, Nvidia's increasingly impressive GeForce 8800 GTX remained out of reach.

Without the ability to claim performance bragging rights and lumbered with all that pricey video memory, the XTX is very likely to be stillborn. It's thought the partner companies that produce and market retail boards based on ATI's video chipsets turned their noses up en masse at the XTX.

All in all, it's extremely bad news for both ATI and parent company AMD. Thanks to the lengthy six-month delay of the R600 GPU, Nvidia has had the high end all to itself since the GeForce 8800 launched last November. And with AMD simultaneously taking a beating from Intel's Core 2 processors, times are extremely tough for the beleaguered AMD-ATI empire."

Clicky
May 7, 2007 6:30:01 PM

Here is a video of Ati testing it. :p 
Ati test :p 
!