mactronix :
Didn't they know this?
Did they do it on purpose? Or did someone get the math badly wrong and is now looking for another job?
They likely didn't know what the GF8600 would have different (these things were designed about 1-2 years ago), but it's an A/B/C choice: you try to balance what you think you need. If you think you have stressful apps that need lots of shader power but the target market will likely be on 1024x768, 1280x800, 1280x1024, or 1440x900 resolutions, then you don't worry about having a ton of ROPs for high resolutions, you worry about the shader and texture workloads. That's likely the thinking.

However, I'm surprised they didn't learn from the X1600, since it only had 4 ROPs versus the GF7600's 8 ROPs, so when they only had one ROP cluster of 4 in the HD2600 I was shocked. I think we've come to expect 1600x1200 / 1680x1050 resolutions out of this range, but I don't think they quite got that yet, despite having had to give the X1650XT 8 ROPs last round. But it's the same 1/2 shortfall they have on the HD2900 too, and the one weak point of the X1900 series as well, so it seems there's a groupthink about ROPs there: they're looking very long term to a point when the shaders can't fill the ROPs, while at launch they show poorly because the competition can easily fill double the ROP count and thus get wickedly high top FPS numbers, which help the average. Looking at many scenarios in benchmarks like those on Xbit Labs and FiringSquad, you see the HD series gaming like it's in a tunnel: no major dips, but also no major highs.
I do realise that the architecture is completely different to what anyone was used to before, but you gotta think that at least the guys who designed and built it would know what it was capable of.
Yeah, but I still think they were envisioning a more DX10-heavy present than what we currently have.
Remember, Halo 2 PC, FSX, Crysis, and UT3 were all initially touted as Vista launch titles, due this time last year, with the R600 and G80 supposed to come out just afterwards. The reality the HD series lives in is much different.
As you say, if you build an engine that turns out c***, you fit a turbo, right?
So should we expect more shaders/ROPs on the refresh cards, or will that have to wait for a new model?
Well, it's not about fitting an aftermarket turbo; it's like choosing a 4-cyl turbo or a V6, etc. from the start, and afterwards you add things like nitrous or headers.
I think the refresh will attack all areas for both: ATi will likely boost the shader count and ROPs, and lastly maybe boost texture units (a point of diminishing returns after that), while nVidia will likely boost their shader count and maybe TUs, and probably leave the ROPs alone. Then both will boost their memory bus width/bandwidth. That would be my guess on their approach.