HD2900 XTX Bench'd

April 26, 2007 6:02:01 AM

Drivers, I know, not DX10, yes yes, but still funny nonetheless.

Worth the wait?!?!?!?! HA :lol: 

Dailytech's bench of HD2900 XTX


April 26, 2007 6:15:11 AM

So no xtx eh? DAAMIT!
April 26, 2007 6:18:17 AM

Looks like ATI will have to go the rumored 650 route to attempt to take the lead.
April 26, 2007 6:29:47 AM

Uh Oh, this isn't looking good.

What's the 650 route you're referring to? The Refresh?
April 26, 2007 6:46:03 AM

This is bad news. I've been waiting for this card for ages, and this is how it performs. Looks like I'm going to postpone the assembly of a new computer and wait for something new to come out with decent performance. I was just anxious to build a new computer without really needing one.
Now I'll just wait until Crysis arrives and get the best bang for the buck then. Buying an 8800 GTX when it is half a year old and still costs almost the same as it did when it came out is not an option. Perhaps I'll grab a Wolfdale or a Yorkfield by then too.
April 26, 2007 6:53:19 AM

nVIDIOWNED
April 26, 2007 7:13:29 AM

That's just sad.
April 26, 2007 7:15:41 AM

What's really disappointing is the 1920x1200 scores. And since that is the resolution of my screen, I don't think R600 is really an option.

And why on earth are the XTX scores lower than the XT ones in about half the tests? Very, very disappointing.

Lyngvi.
April 26, 2007 7:44:12 AM

I think that part of the problem the guys over at DailyTech are having with the XTX card is that it's the older 80nm OEM version of the card. Remember that DAAMIT did a die shrink of all their new HD cards down to 65nm, and supposedly that's what the delay is all about.

The OEM version is not the final working product, and I seriously doubt that DAAMIT would even consider releasing the XTX card if it's only 2% faster than the XT, and only in some games.

http://www.dailytech.com/ATI+Radeon+HD+2900+XT+Performa...

Here's the Radeon HD 2900XT doing swell. If the real XTX would please step forward...

I'd say give it time...these pre-release benchmarks are often a bit premature. Driver maturity as well as finalized products will tell all. I'm sure it'll be a close one for the real HD 2900XTX and the 8800Ultra cards.
April 26, 2007 10:26:41 AM

Based on those results, the XTX looks like it sux.

I question their results in some ways, especially Oblivion. What map area and what settings are you testing to average 100 fps at 19x12? Not the areas that matter (max foliage detail, full eye candy).

Anyway, I'll still admit their XTX looks pathetic compared to the XT, never mind the OC'ed 8800GTX.

And Fudzilla claims the same thing:
http://www.fudzilla.com/index.php?option=com_content&ta...
April 26, 2007 10:54:16 AM

I think I'll wait for the THG benchmarks myself. I really doubt that ATI would release a card that was only just better than their previous 1950XTX card. Who is Dailytech anyway?
April 26, 2007 11:05:51 AM

Quote:
I think I'll wait for the THG benchmarks myself. I really doubt that ATI would release a card that was only just better than their previous 1950XTX card. Who is Dailytech anyway?

I agree with you, we need to wait and see what is launched and how they then perform. But it isn't looking good so far. The XT could still be a hot card for $399, but not taking the performance crown back is a big negative for ATI.

How are you comparing it to the X1950XTX? I haven't seen them compared, and you really can't compare results from these tests directly with numbers from any other review.

I don't know much about Dailytech or even how they got their cards. They are supposedly more trustworthy than the Inq or Fudzilla, yet 100 fps in Oblivion at 19x12 makes me question their benchmarking abilities. I hope they are ATI's tool for spreading low-performance FUD before a launch that puts us in awe. :roll:
April 26, 2007 11:17:38 AM

:(  Nvidia is laughing all the way to the bank, ATI really screwed up this time.
April 26, 2007 11:19:32 AM

Quote:
I think I'll wait for the THG benchmarks myself. I really doubt that ATI would release a card that was only just better than their previous 1950XTX card.


Nvidia has had the G80 out for months and is raking in record amounts of money while AMD keeps shoveling dollars into the fire, dancing around it singing "delay, delay, delay".
I'm not sure they had a choice about when to release it. Maybe they should've waited until Nvidia released the 8900 - nobody would've believed the 8900 GTX would have twice the frames of a 2900 XTX in Oblivion... :lol: 
April 26, 2007 11:20:04 AM

So how many versions of R600 is ATI going to launch first, and what are they?
April 26, 2007 11:30:17 AM

I'm really shocked, and have mixed feelings.

On one hand I'm elated that my 5 month old card is still the best I can buy. When you buy something like that you expect to hear about the next best thing a month later.

On the other hand I want these two companies to fight it out. I want to know that the next time I buy a card the performance will amaze me. I truly wanted the 2900xtx to kick the *%^& out of the 8800gtx.

I'm still reserving final judgement for both cards until we see performance in a dx10 environment.

ATI still has a great opportunity to be competitive at the low to mid range, where the real market is. That is their current strength, and it is also where Nvidia's new cards show weakness. ATI may have the better solution for OEMs and still be the market leader in discrete graphics while conceding the high end.
April 26, 2007 11:47:34 AM

The war is over. Period.

When was the last time a video card wasn't bested by a competitor's card launched afterwards (especially so much later)?
April 26, 2007 11:48:26 AM

As I mentioned above, the Oblivion results are worthless. It must be no FSAA, no AF, and in a non-GPU-demanding area of the game to get 100 fps at 19x12.

Anyway, I just found this pic on another forum. Note they said the HD 2900XT got 47.9 fps in Oblivion at 12x10 when it beat the GTS 640MB. Now they've changed the benchmark, as the same card scores 101.4 fps at 12x10 in this test. Same card with more than double the framerate. :roll:

I am not saying all their results are off, or even that the 8800GTX won't win Oblivion. All I am pointing out is the sheer weakness of their Oblivion benchmarking. Probably two different reviewers, but still, the same site... get it right, guys. :roll:
April 26, 2007 12:14:44 PM

So why are the benchmark results so different on Fudzilla and Dailytech? Where's the truth?

Edit: sorry, not Fudzilla but... maybe the Inquirer or so. I don't remember where I read about R600 benches, but it was clearly faster than G80.
April 26, 2007 12:25:54 PM

Quote:
Who is Dailytech anyway?

The answer to that depends on who you ask. They are affiliated with Anandtech. Legally speaking, the two are separate entities, which means that "Dailytech" can publish benchmarks that violate the NDA signed by "Anandtech" without getting sued. In reality they are exactly the same company.

Quote:
I think that part of the problem the guys over at DailyTech are having with the XTX card is that it's the older 80nm OEM version of the card. Remember that DAAMIT did a die shrink of all their new HD cards down to 65nm, and supposedly that's what the delay is all about.

This rumour has been repeatedly exposed as false. There probably will eventually be a 65nm version of R600 but it will absolutely, positively not be launching in May/June. The XTX card (retail or OEM) is 80nm, just like the XT.
April 26, 2007 12:31:55 PM

The benchies may be true, but with all other websites having to sign non-disclosure agreements, why on earth would AMD/ATI decide to allow Fudzilla and Dailytech to post their findings? Maybe no one else thinks like I do, but until HEXUS, Beyond3D, or Tom's posts something conclusive with a comparison between a Sapphire or Connect3D version of these cards, I won't be taking any benchies for granted.

By the way, I am not an ATI fanboy; I've never had an ATI card before and have always used Nvidia (my current card is the BFG 320MB 8800 GTS), but I did like the competition they provided to keep Nvidia on their toes.
April 26, 2007 12:51:21 PM

Quote:
The benchies may be true, but with all other websites having to sign non-disclosure agreements, why on earth would AMD/ATI decide to allow Fudzilla and Dailytech to post their findings? Maybe no one else thinks like I do, but until HEXUS, Beyond3D, or Tom's posts something conclusive with a comparison between a Sapphire or Connect3D version of these cards, I won't be taking any benchies for granted.


See nicolasb's post above.
April 26, 2007 12:58:59 PM

Doh...! Where did that post come from? I was replying to Dagoth... nice way of making a fool of meself. :) 
April 26, 2007 1:08:16 PM

Quote:
The benchies may be true, but with all other websites having to sign non-disclosure agreements, why on earth would AMD/ATI decide to allow Fudzilla and Dailytech to post their findings? Maybe no one else thinks like I do, but until HEXUS, Beyond3D, or Tom's posts something conclusive with a comparison between a Sapphire or Connect3D version of these cards, I won't be taking any benchies for granted.

By the way, I am not an ATI fanboy; I've never had an ATI card before and have always used Nvidia (my current card is the BFG 320MB 8800 GTS), but I did like the competition they provided to keep Nvidia on their toes.


Hmm... well, I thought that TG Daily is somehow close to Tom's Hardware Guide, isn't it? Sounds to me like "Tom's Hardware Guide daily"... so these results should be trustworthy, right?
April 26, 2007 1:09:47 PM

If you take time to read the extensive thread under the article, the author admits the settings were no AA, no AF, etc. in Oblivion.

Does this make the results useless? No.

If the results are true, the hardware was alike other than the GPUs, so the results would still be a fair comparison whether every eye-candy box was ticked or none were.

Now if it showed CPU limitations I'd call foul. But like I said, assuming these are the true results they got, they are still valid for showing the differential between the two cards in question.

I honestly hope they can do something to get the performance closer, just for competition's sake if nothing else.

I'd like to think it's a driver issue, as many are speculating; however, I seriously doubt the drivers are bad enough to cause this much of a difference, since they were release-candidate drivers. Shrug. We'll see once more sites can release their info.
April 26, 2007 1:13:01 PM

Quote:
I think that part of the problem the guys over at DailyTech are having with the XTX card is that it's the older 80nm OEM version of the card. Remember that DAAMIT did a die shrink of all their new HD cards down to 65nm, and supposedly that's what the delay is all about.

The OEM version is not the final working product, and I seriously doubt that DAAMIT would even consider releasing the XTX card if it's only 2% faster than the XT, and only in some games.

http://www.dailytech.com/ATI+Radeon+HD+2900+XT+Performa...

Here's the Radeon HD 2900XT doing swell. If the real XTX would please step forward...

I'd say give it time...these pre-release benchmarks are often a bit premature. Driver maturity as well as finalized products will tell all. I'm sure it'll be a close one for the real HD 2900XTX and the 8800Ultra cards.


Audiosupernova: the OEM cards, as they're named, ARE the final versions for big manufacturers such as Dell and the like :|

In other terms, they OC'ed the XT version pretty much (it went to GTX levels easily, according to them).

Quote:
I think I'll wait for the THG benchmarks myself. I really doubt that ATI would release a card that was only just better than their previous 1950XTX card.


Nvidia has had the G80 out for months and is raking in record amounts of money while AMD keeps shoveling dollars into the fire, dancing around it singing "delay, delay, delay".
I'm not sure they had a choice about when to release it. Maybe they should've waited until Nvidia released the 8900 - nobody would've believed the 8900 GTX would have twice the frames of a 2900 XTX in Oblivion... :lol: 

Don't get too happy; Nvidia might get hit with a class-action suit because of non-working drivers for way too long... :|
April 26, 2007 1:15:52 PM

Quote:
So how many versions of R600 is ATI going to launch first, and what are they?


All I know is that AMD is launching 10 cards in the 1/2 of May.

As for the XTX...wow what a weak performer. Why bother making it if it sucks this badly?
April 26, 2007 1:16:31 PM

I never really expected the R600 to do well at high resolutions anyway. At least not in DX9 situations, because it only has 16 ROPs.
April 26, 2007 1:26:21 PM

*Looks at his CF board in dismay.....*
April 26, 2007 1:39:33 PM

I hope this isn't true. :( 
April 26, 2007 1:47:51 PM

This is sad. I really hope this drives the GTX prices down so I can get one.
April 26, 2007 1:52:26 PM

Quote:
This is sad. I really hope this drives the GTX prices down so I can get one.


Yeah, me too... however, according to the benches, if the R600 performs like an 8800GTS 320 or so, nVidia will have no reason to cut the price of the GTX.
April 26, 2007 2:12:28 PM

I know I may sound like a fanboy, but I'm looking really closely at AMD. ATi are now in the exact same position that AMD is in against Intel.

- Inferior product on current store shelves
- Facing Goliath sized competition
- The competition has good backup products just in case (Penryn and 8800 Ultra)
- Problems with mass production
- Delays
- Forced to put products at fire sale prices
- Delays
- Uncertain future regarding products and the business itself
- Delays
- NO NEWS WHATSOEVER
- Delays

I mean, I'm just poking fun here, so don't take it seriously, but AMD and ATi (when they were separate) have always had trouble with punctuality. What the hell would happen when you put the two together?
Old AMD + ATi = Another Microprocessor Delay :twisted: :twisted:

K, I'll stop the jokes... Seriously though, this is a huge bummer.

Let's just hope that DT got a dud version or something, and that the R600 XTX will prove itself in a review on guru3D/Anandtech/THG/Xbit Labs/etc...

I was really looking forward to the HD 2900 XTX. :[
April 26, 2007 2:21:44 PM

Dang man, at least the HD 2900XT delivers; we know it beats the 8800GTS 640, and at similar pricing I think ATI wins for the higher end. But crap, what the heck happened to the XTX? I guess no highest end for ATI.
April 26, 2007 2:27:58 PM

If the R600 is crappy then ATI could cut the price down to make up for it.
April 26, 2007 2:33:23 PM

Quote:
If you take time to read the extensive thread under the article, the author admits the settings were no AA, no AF, etc. in Oblivion.

Does this make the results useless? No.

If the results are true, the hardware was alike other than the GPUs, so the results would still be a fair comparison whether every eye-candy box was ticked or none were.

I completely disagree. We all know what matters in Oblivion is the demanding areas of the game. Shoot, a 7900GT does well in some areas of the game but tanks in others. Do you care if you get 60 fps in non-demanding areas if you get 11 fps in demanding areas? Which area determines the settings you play at? The demanding area, of course.

And an 8800GTX is not a 100+ fps card at 1920x1200. Plain and simple, the second reviewer didn't take the time to even bench an area of Oblivion that stresses the GPUs. IMO, it's still a worthless benchmark. I laugh at any site that posts one single Oblivion benchmark that scores 100+ fps at all resolutions. :lol:  :lol:  :lol:  :lol:  :lol:  They don't know Oblivion. :roll:

No informed 8800GTX or HD2900XT owner would ever play Oblivion without FSAA or AF. What a waste. Second, no card today offers 100+ fps outside in the demanding areas of Oblivion at 1920x1200.

So to sum it up: Oblivion + top-of-the-line cards + only one benchmark + no minimum framerates, only averages + 100 fps or more at all resolutions + no FSAA or AF = a worthless Oblivion benchmark. That should be Obvious to anyone here.



If you still don't get it: what if this 1280x1024 chart were labeled an Oblivion benchmark? The 7900GT and X1900XT both get 91 fps! Wow, they are equal and can crush Oblivion. :lol: 
http://www.firingsquad.com/hardware/oblivion_high-end_p...

Oh wait, what if you go out where it counts, in the dense foliage... http://www.firingsquad.com/hardware/oblivion_high-end_p... The 7900GT now averages 17 fps vs. 30 for the X1900XT. Neither card ownz Oblivion, but the X1900XT is Obviously better. Now look below at the minimum fps chart: the 7900GT drops to 14 fps while the X1900XT doesn't go below 27 fps.

So let me ask you: are they equal cards, both capable of 91 fps in Oblivion? Or does it matter more that outside, the X1900XT doesn't drop below 27 fps while the 7900GT drops to 14 and averages 17 fps? Which chart is the worthless benchmark to the Oblivion player shopping for a new card? Got it now? :wink:

WORTHLESS BENCHMARK! Case closed! :tongue:
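
To put some rough numbers on why an average alone hides the problem, here's a minimal sketch; the frame times in it are made-up, hypothetical values, not from any real benchmark run:

# Minimal sketch: why an average framerate can hide unplayable minimums.
# The frame times below are hypothetical, not taken from any real benchmark.
frame_times_ms = [8] * 90 + [70] * 10   # 90 easy frames, 10 heavy "dense foliage" frames

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # average over the whole run
min_fps = min(fps_per_frame)

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
# Prints roughly "average: 70.4 fps, minimum: 14.3 fps" -- a single average
# would call this card fast even though the demanding stretch crawls at ~14 fps.
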
April 26, 2007 2:51:13 PM

Was just about to say the same thing.

It seems the 8800GTX evolved in the meantime: while it was giving 58 fps at 1024x768 in that review, it gives 98 fps at 1920x1200 in Oblivion here :D :D :D. It was a big laugh.

This benchmark is really crap; the writer made it up. By the way, what the ... is Dailytech? Never heard of them...

I guess Excel is the only thing you need to benchmark cards these days...
April 26, 2007 2:53:21 PM

Yeah, so the benchmark was shit; regardless, there has been news, as someone else reported, of the poor performance of the XTX.

Unless you all missed it, look at the core clock speed of the XT vs. the XTX: wow, a whopping increase of 5 MHz.

And while the increase in memory speed is very large, they're using GDDR4, which has somewhat higher latency. So while it's a good increase in speed, the performance gain is not as large.

Now look at the difference in core clock between the 8800GTS and the 8800GTX Dailytech used: 150 MHz!

So while Dailytech's version doesn't look good at all, maybe it was not clocked right, and maybe there was some other unknown factor like a driver issue.

Right now it's easy to see, without even looking at the performance charts, that the XTX Dailytech used is an XT with a 5 MHz increase on the core and GDDR4 memory... nothing special to rave about at all. So again, without even looking at the performance charts, one could assume that the XTX would perform very similarly to an XT.

So now relate the XT's performance to the GTS (in the more legit-looking review from yesterday or so), and then the GTS to the GTX, and you will see that it looks obvious the GTX would outperform the XTX.

Hopefully, come product release time, the core clock is actually increased and the possibly buggy drivers or memory issues are fixed.
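
For rough scale, a back-of-the-envelope sketch; the base clocks are assumptions (~740 MHz XT core per the leaked specs, 500 MHz for a stock 8800GTS), the deltas are the ones quoted above, and real performance never scales perfectly with clock:

# Back-of-the-envelope: how big each core-clock bump really is.
# Base clocks are assumptions (~740 MHz XT core from the leaked specs,
# 500 MHz stock 8800 GTS); the deltas are the ones quoted in the post above.
def uplift_percent(base_mhz, delta_mhz):
    """Clock increase expressed as a percentage of the base clock."""
    return 100.0 * delta_mhz / base_mhz

print(f"XT -> XTX core bump:   {uplift_percent(740, 5):.1f}%")    # ~0.7%
print(f"GTS -> OC'ed GTX bump: {uplift_percent(500, 150):.1f}%")  # ~30.0%
# Even with perfectly linear scaling (which GPUs never achieve), a 0.7% core
# bump plus faster-but-higher-latency GDDR4 would leave the XTX looking almost
# identical to the XT, which is exactly what the charts show.
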
April 26, 2007 3:11:23 PM

My biggest question is why they needed to use an overclocked GTX to compare to the stock XTX. :?:
Were they trying to reduce the memory bottleneck on the GTX, which has shown a lot of benefit from faster memory?

Once again, I'll wait for final benchmark results from a company that doesn't have a history of shaping reviews in a particular fashion.

Still very weird, and it also makes me ask: these things are obviously not optimized, nor obviously at their bandwidth saturation point (I wonder where that would be with 512-bit memory), especially if AMD improved on their already great memory management.

Still interesting, and a little disappointing, but not really that surprising considering the small core MHz difference, as has already been mentioned, and at those levels it's likely not memory-throughput intensive.
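
For a rough sense of where that saturation point sits on paper, a minimal sketch; the memory clocks are approximate, rumored figures rather than confirmed specs:

# Theoretical peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# The clocks used here are approximate/rumored figures, not confirmed specs.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth in GB/s for a given bus width and effective (DDR) memory clock."""
    return (bus_width_bits / 8) * effective_clock_mhz / 1000.0

print(f"512-bit @ ~2000 MHz GDDR4 (rumored XTX): {bandwidth_gb_s(512, 2000):.0f} GB/s")
print(f"512-bit @ ~1650 MHz GDDR3 (HD 2900 XT):  {bandwidth_gb_s(512, 1650):.0f} GB/s")
print(f"384-bit @ ~1800 MHz GDDR3 (8800 GTX):    {bandwidth_gb_s(384, 1800):.0f} GB/s")
# Roughly 128 vs 106 vs 86 GB/s on paper -- plenty of headroom, which is why the
# near-identical core clock, not bandwidth, looks like the limiter here.
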
April 26, 2007 3:16:57 PM

So this review gets DAS BOOOOT?!?!
April 26, 2007 3:19:13 PM

Quote:
I completely disagree. We all know what matters in Oblivion is the demanding areas of the game.


I disagree; obviously Oblivion's beauty is found in the dark interior shots, especially cave walls. Beautiful. That's where my MRX700 can give me triple-digit fps @ native 14x9. Now that's what I'm talking about: smooth framerates on a smooth, featureless wall.

[in my best MOE voice] He is so stupid. And now, back to the wall...
[/MOE]
April 26, 2007 3:27:09 PM

Ya rite, the XT outperforms the XTX? This is the biggest load of crap I've ever seen... and those making buying decisions based on this are even more foolish... I smell a lawsuit.
April 26, 2007 3:27:34 PM

Quote:
So this review gets DAS BOOOOT?!?!


Well, it's still info, but whether it's a reflection of what we'll see when it finally hits the shelves with updated drivers is questionable, IMO.

I never like relying on any single review, even the great in-depth ones like Digit-Life's and Xbit Labs'; it's always good to see people work outside the norm (1600x1200 as max res? Wouldn't that show no benefit to the GTS-640 and little to the GTX compared to an overclocked GTS-320?)

It's interesting, but I also question why the workstation apps were dropped (something that is HUGELY bandwidth-limited) and the 3DMarks were dropped (which were the only thing they benched the OC'ed XT on). Like so many Anand reviews, the things that are dropped from test to test usually have a general pattern, and the one I see here is that those are the only areas where the XTX might shine, which would go contrary to the page-hit-generating 'DOOMED' title. Finish that part of the benchmark and I would at least get a better understanding of what is going on, because right now all we see is that there's little difference, but we don't know why. Running more tests on the OC'ed XT would show whether it's core-bound or simply not being stressed.

Also, something for people to consider is: "What's the minimum/max FPS for these cards?" If the lows are smoother for one solution, it would show that, whatever the average may be, that particular card is doing a better job of cutting through the chop.

Just look at the review itself and ask yourself if you would ever buy a card based on just that. To me there's only a smattering of information there.
April 26, 2007 3:29:52 PM

I just wanted to say 'DAS BOOOOOOT!!'.
April 26, 2007 3:37:31 PM

The question I have here is why Dailytech is so privileged. The traffic they have received over the past two days is worth some serious money. Do a search on R600, 2900, etc. and almost every hit you get links back to Dailytech.

If ATI's card should have shown better, then it is time to lift some NDAs.
They just spent a fortune pampering the press while the average enthusiasts are chewing on this one bone we've been thrown.

Enough already. You've been called; show us your hand.
April 26, 2007 3:58:18 PM

These are most certainly false benchmarks... or this is most certainly DAAMIT's best morons at the helm.

1) Did they not compare performance between their own cards and Nvidia's? It was out early enough for their die shrink...
2) Why bother putting 320 stream processors on the XT and XTX versions if the only difference between the two is clock speeds (and memory)? This is another 6800GT vs. Ultra setup... just buy the XT and o/c for a free XTX. (EDIT: And why bother with 320 stream processors if they're not going to be ramped up to the speeds they're supposed to be at, and if they're not even going to beat Nvidia's 128???)
3) DT can't have a card if they're even making the reporters in Tunisia sign an NDA just to look at it.
4) This is just a huge pile of horse dump. If DAAMIT messed this up, they deserve to leave the business. You can't tell me they were stupid enough not to hold an 8800GTX in their hand and ask themselves, "Do I want to make money?"

(EDIT: Even assuming the shaders run at half speed (I heard Nvidia's are double-pumped), why aren't these cards at least equal???)
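
For the raw numbers behind that last question, a quick sketch; the R600 core clock is the leaked ~740 MHz figure, I'm assuming one scalar op per stream processor per clock, and the 5-wide grouping comes from the Vec5 rumor, so treat it as back-of-the-envelope only:

# Naive shader-throughput comparison behind the "why aren't these at least
# equal" question. Clocks are assumptions: ~740 MHz R600 core (leaked spec,
# shaders at core clock) vs. the 8800 GTX's 1350 MHz shader clock.
def shader_gops(num_sps, clock_mhz):
    """Billions of scalar shader ops per second, assuming 1 op per SP per clock."""
    return num_sps * clock_mhz / 1000.0

r600 = shader_gops(320, 740)   # ~237 Gops/s
g80  = shader_gops(128, 1350)  # ~173 Gops/s
print(f"R600: {r600:.0f} Gops/s, G80: {g80:.0f} Gops/s, ratio {r600 / g80:.2f}x")
# On this naive count R600 has ~1.4x the raw shader rate, but if its 320
# "stream processors" really are grouped 5-wide (the Vec5 rumor), real
# utilization depends heavily on the shader code -- one reason paper specs
# and these benches can disagree.
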
April 26, 2007 3:59:26 PM

I also don't like the fact that the benchmark system was based on an NVIDIA 680i motherboard. I know, I know, there isn't supposed to be any advantage just because an NVIDIA video card is running on an NVIDIA-chipset motherboard (without SLI, at least). I'm still suspicious deep in my gut, however. The 680i chipset engineers knew what the G80 engineers were working on, and vice versa. It's hard for me to believe some optimizations didn't get incorporated as motherboard chipset and GPU information got exchanged inside NVIDIA. Even if there is no true advantage for the GTX on an NVIDIA motherboard, it just looks bad. I wonder why they couldn't have done their benchmarks on a more "neutral" motherboard, like one with the Intel P965 chipset.

I also wonder if the GDDR4 RAM is causing problems. One of the rumors I read earlier about the delays is that DAAMIT was having difficulty getting the GDDR4 memory to work properly on the cards. That is at least one of the major differences between the XT and XTX.

Otherwise, the benchmarks just don't make sense to me. I know the specs are just on paper, but those specs show the XTX should be able to outmuscle the GTX, or at least hold its own against it. Based on the XT results, it doesn't look like there are any problems with the R600 GPU itself. Oh well - I guess I'm in the same boat with a lot of other folks. Don't write off the XTX just yet. Wait for some more benchmarks. I wasn't planning on buying a video card until June 1 anyway.

Rob
April 26, 2007 3:59:35 PM

I cried really hard and nearly attempted suicide when I found out...
April 26, 2007 4:06:00 PM

Quote:

Enough already. You've been called; show us your hand.


I don't understand your logic behind AMD having to show their hand or lift NDAs because of an unfavourable or questionable review.

Hey, I'd like nothing more than an NDA lift, but I don't see this as creating a need to reply.

Right now it's all pre-release, so the only effect it would have is on those who are on the fence and need to buy right now; most people aren't going to be swayed by one early benchmark run and little in-depth information. Heck, we know more about the R600's design from the Inq's Vec5 blurb than we learned from either HD2900 review from Dailytech.

I hope this does put pressure on for other leaks, but I don't see AMD releasing people from their NDAs until they're ready. Of course, if a lot of leaks get out, there will be less reason for AMD to keep the NDAs.
April 26, 2007 4:16:29 PM

Actually, you need to quote my whole statement to keep it in context, specifically this part.
Quote:
If ATI's card should have shown better, then it is time to lift some NDAs.


I'm reading these threads and I hear a whole lot of: drivers, bad test methodology, pre-production board, etc.

I am saying that if this, or any part of this, has caused the results to slant out of ATI's favour, then logically they should refute it. Hence the poker analogy. Are they folding, or showing us a strong hand?!