Why does the ATI 2600XT suck???

ferencster

Distinguished
Jul 23, 2007
97
0
18,630
I have been waiting for this card so much, and now I am so disappointed.... :pfff:
On paper the 2600XT (120 shaders, 800 MHz core, 2200 MHz GDDR4) looks much better than the 8600 GTS (32 shaders, 675 MHz core, 2000 MHz GDDR3), but in the real world even an 8600 GT can beat it :(
Any ideas why this is happening? (Same thing with the 2900XT and 8800GTX.)
Is this a driver issue? Any experience with the 2600XT GDDR4???
 
Actually, the HD2600XT with GDDR4 beats the plain GT more often than not, but it still falls short of the GTS most of the time.

The biggest problem is the number of ROPs; it's hamstrung there.

Look at the performance in more DX10-centric apps, where the shader core is stressed more: the XT leaves the GTS in the dust, sometimes even coming up on the GF8800GTS.

It sucks, but the best options for the price usually turn out to be the last generation's top DX9 cards.

Darren from THG wrote an article about it; it should still be prominent in the graphics section of Tom's front page.
 
Just wanted to get my thinking straight on this. It seems to me that the 8800s are basically very good DX9 cards that can do a bit of DX10, while the new HD cards are made to run DX10 and so have performance problems in DX9.
Also, I'm not the most tech-orientated, but I was wondering about what MS did with memory virtualisation. I heard they said it was a must for DX10, and ATI managed it but Nvidia couldn't, so MS dropped it. Now, how badly did this hurt ATI, or has it given them a head start for next gen? Is/will memory virtualisation be a must?
Thanks Mactronix
 
Well, that's about the situation. I don't think the virtualization hurts or helps either side very much, because there isn't great demand for it (and I don't think they dropped it; I think they made it 'optional', the way a few things have been in past DX versions [like vertex texture fetch / R2VB, etc.]).

What hurts them (AMD) the most is their decision to make their AA rely on the DX10 standard and not include the legacy hardware AA resolve support, which is like passing up on a turbo boost. It doesn't matter much when you think your DX10/V10 solution is fine, but if the competition comes to market with their V8 with a turbo, then many times you can be caught at a disadvantage.

As we've seen, the XT can keep pace with the GTS (which would be nV's V6 turbo) even in current situations, but when you move to shader-based AA, where both cards carry the same load and neither has the benefit of dedicated AA hardware, then you see the XT being very competitive with the GTX.

With the HD2600 series, though, it's a little different from the higher end: while it has good shader numbers and a better texture unit ratio than the HD2900, it's still greatly held back by the lack of ROPs, with only a single ROP cluster (for 4 pixels), while the GF8600 can output 8 pixels.
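
To put rough numbers on the ROP point, here's a quick back-of-envelope sketch (my own, assuming the simplest case of 1 pixel per ROP per clock and the core clocks quoted above; real throughput also depends on blending, Z-only passes, bandwidth, etc.):

def fill_rate_gpixels(rops, core_mhz):
    # Peak pixel fill rate, assuming 1 pixel written per ROP per clock.
    return rops * core_mhz / 1000.0  # Gpixels/s

print("HD2600XT :", fill_rate_gpixels(4, 800))   # 4 ROPs @ 800 MHz -> 3.2 Gpixels/s
print("GF8600GTS:", fill_rate_gpixels(8, 675))   # 8 ROPs @ 675 MHz -> 5.4 Gpixels/s

So even with the higher core clock, the XT's peak fill rate sits well under the GTS's.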
 
Thanks for the answer, ape. Just one more thing if you don't mind; I've cut this from your post:
With the HD2600 series, though, it's a little different from the higher end: while it has good shader numbers and a better texture unit ratio than the HD2900, it's still greatly held back by the lack of ROPs, with only a single ROP cluster (for 4 pixels), while the GF8600 can output 8 pixels.
Didn't they know this?
Did they do it on purpose, or did someone get the math badly wrong and is now looking for another job?
I do realise the architecture is completely different to what anyone was used to before, but you've got to think that at least the guys who designed and built it would know what it was capable of.
As you say, if you build an engine that turns out c***, you fit a turbo, right?
So should we expect more shaders/ROPs on the refresh cards, or will that have to wait for a new model?
Thanks Mactronix
 


They likely didn't know what the GF8600 would have that was different (these things are designed 1-2 years in advance), but it's an A/B/C choice: you try to balance what you think you need. If you think you have stressful apps that need lots of shader power, but the target market will likely be on 1024x768, 1280x800, 1280x1024 and 1440x900 resolutions, then you don't worry about having a ton of ROPs for high resolutions; you worry about the shader and texture workloads. That's likely the thinking.

However, I'm surprised they didn't learn from the X1600, since it only had 4 ROPs versus the GF7600's 8, so when they went with only 1 ROP cluster of 4 in the HD2600 I was shocked. I think we've come to expect 1600x1200 / 1680x1050 resolutions out of this range, but I don't think they've quite got that yet, despite having had to make the X1650XT 8 ROPs last round.

But it's the same 1/2 shortfall they have on the HD2900 too, and it was the one weak point of the X1900 series as well, so it seems there's group-think about ROPs there. They're looking very long term, to a point when the shaders can't fill the ROPs, while at launch they show poorly because the competition can easily fill double the ROP count and thus get wickedly high top fps numbers that help an average. Looking at many scenarios in benchmarks like those on Xbit Labs and FiringSquad, you see the HD series gaming as if in a tunnel: no major dips, but also no major highs.
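
To put the resolution argument in rough numbers (my own sketch, same 1-pixel-per-ROP-per-clock assumption as above; it's only a theoretical ceiling, since overdraw, shading and bandwidth cut into it heavily):

def max_fps_ceiling(rops, core_mhz, width, height):
    # Theoretical upper bound: peak fill rate divided by pixels per frame,
    # pretending every pixel is written exactly once.
    fill_rate = rops * core_mhz * 1e6  # pixels per second
    return fill_rate / (width * height)

for w, h in [(1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}:",
          "HD2600XT", round(max_fps_ceiling(4, 800, w, h)),
          "| GF8600GTS", round(max_fps_ceiling(8, 675, w, h)))

The absolute numbers are meaningless (real games write each pixel many times and are usually shader- or bandwidth-bound), but the ratio is the point: the GF8600 has roughly 1.7x the raw pixel throughput at every resolution, which is where those wicked high top fps numbers come from.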

I do realise the architecture is completely different to what anyone was used to before, but you've got to think that at least the guys who designed and built it would know what it was capable of.

Yeah, but I still think they had in mind a more DX10-heavy present than what we currently have.
Remember, Halo 2 PC, FSX, Crysis and UT3 were all initially touted as Vista launch titles, due this time last year, with the R600 and G80 supposed to come out just afterwards. The reality the HD series lives in is much different.

As you say, if you build an engine that turns out c***, you fit a turbo, right?
So should we expect more shaders/ROPs on the refresh cards, or will that have to wait for a new model?

Well, it's not about fitting an aftermarket turbo; it's like choosing a 4-cylinder turbo or a V6, etc. from the start. Afterwards you add things like nitrous or headers.

I think the refresh will attack all areas for both: ATi will likely boost the shader count and ROPs, and lastly maybe boost texture units (a diminishing-return point after this), while nVidia will likely boost their shader count and maybe TUs, and probably leave the ROPs alone. And then both will boost their memory bus width/bandwidth. That would be my guess on their approach.
 

enewmen

Distinguished
Mar 6, 2005
2,247
3
19,815
TGGA:

It seems lots of people are asking the same thing. Why doesn't the 2900XT blow away the 8800GTX? The specs look dynamite, but the benchmarks lag. In the past, the next-gen cards were always faster on the previous generation's applications. This might be a good THG article!
 

ryokinshin

Distinguished
Mar 19, 2006
605
0
18,980
Mid-range cards are supposed to beat last gen's high-end cards, or match their performance at a lower price; perfect recent examples are the 7600 GS/GT and X1650 XT/Pro. That didn't happen this round, which is really sad, but sure enough Nvidia and ATI are crapping themselves to make revisions.

The 2900XT does beat the 8800GTS 640MB, though.
 

gpippas

Distinguished
May 11, 2007
463
0
18,790
My guess would also be that they will address the bandwidth limitations of these cards by increasing the memory bus width. A 128-bit memory bus is very limiting to the bandwidth, making it harder for the GPU to handle higher resolutions and textures. A 256-bit memory bus should be sufficient on these mid-range cards.
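
To put rough numbers on that (my own quick calculation, using the 2200 MHz effective GDDR4 figure from the first post; this is peak bandwidth only and ignores real-world efficiency):

def bandwidth_gb_s(bus_bits, effective_mhz):
    # Peak memory bandwidth: bus width in bytes times effective data rate.
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9  # GB/s

print("128-bit @ 2200 MHz effective:", bandwidth_gb_s(128, 2200))  # ~35.2 GB/s
print("256-bit @ 2200 MHz effective:", bandwidth_gb_s(256, 2200))  # ~70.4 GB/s

So simply doubling the bus width doubles the peak bandwidth at the same memory clock, which is why a 256-bit bus matters so much at higher resolutions.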

TheGreatGrapeApe can correct me if I'm wrong, because he knows more about it than I do.
 

Jakc

Distinguished
Apr 16, 2007
208
0
18,680
I heard the new, already-available Catalyst 7.8 beta driver increases 2600XT performance by up to 15%.
Also, the performance of AAA (adaptive anti-aliasing) has increased even more.