This is just out of curiosity about the already released 8800 GT and about what's to come from ATI with the 3800s. Why are both companies going back to a 256-bit bus when the current-generation cards are 320-, 384-, and 512-bit respectively? Does that have anything to do with DX10 architecture, problems with the cards' rendering, or just no reason at all? I'd figure that as time goes on and graphics get better, the market would want them to look better and better, but (and this may be due to some lack of knowledge on my part) wouldn't going back to 256-bit be a step backwards? Just kind of wondering, and if I'm showing any lack of knowledge, some finer points to clear things up would be appreciated.
Well, I'm not going to claim to be even halfway knowledgeable in this subject matter, but I will offer my opinion.
I think the reason the new cards are coming out as 256-bit is that there really needed to be a release of a good mid-range card. If you think about it, the whole mid-range segment has been pretty much bare for about a year now.
On that note, if the bus widths were 320-bit, 512-bit, etc., then we would see an influx of very well performing cards (the 256-bit 8800 GT already performs very well)...thus just making another high-end, expensive card.
Also, it may have something to do with a die shrink. It may not. I don't really know what that is, lol.
Current textures aren't really saturating a 256-bit bus:
the 8800 GT (256-bit) has nearly the performance of the 8800 GTX (384-bit),
and the 8800 GT (256-bit) beats the 8800 GTS (320-bit)
Well, the problem with that comparison is that the G80 has poor texture/memory management, so even a refresh of the 320-bit design would've offered improvements in that area. The number of texture addressing units has also increased, so you've improved the performance of the TUs as well as removing a defect. So it's not a good comparison.
and beats the ATI 2900 XT (512-bit) handily.
Totally different architecture, with much lower texture power but much more efficient memory management, so once again not a good comparison. The HD 3800 may offer a better look at the HD 2900's 512-bit memory interface, since the bus width is one of the few differences between them.
So for current games and resolutions, 256-bit seems to be enough, and the 320-, 384-, and 512-bit buses seem more like a marketing ploy.
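To put some rough numbers behind that, peak memory bandwidth is just bus width (in bytes) times the effective data rate. The specs below are approximate launch figures from memory, so treat them as illustrative rather than authoritative:

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective data rate).
# Card specs are approximate launch figures, used here only for illustration.

def mem_bandwidth_gbps(bus_bits, effective_rate_gtps):
    """Peak theoretical bandwidth in GB/s."""
    return (bus_bits / 8) * effective_rate_gtps

cards = {
    "8800 GT    (256-bit, ~1.80 GT/s GDDR3)": (256, 1.80),
    "8800 GTS   (320-bit, ~1.60 GT/s GDDR3)": (320, 1.60),
    "8800 GTX   (384-bit, ~1.80 GT/s GDDR3)": (384, 1.80),
    "HD 2900 XT (512-bit, ~1.65 GT/s GDDR3)": (512, 1.65),
}

for name, (bits, rate) in cards.items():
    print(f"{name}: {mem_bandwidth_gbps(bits, rate):.1f} GB/s")
```

Even with the narrower bus, the 8800 GT lands within shouting distance of the 320-bit GTS on raw bandwidth, which is part of why the wider buses look like overkill for this segment.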
Just like going to PCI-E 2.0 or SATA-2 before the bandwidth of the prior generation is even close to being met.
However, there are situations where both are saturated, but they are specific and rare, and in those cases we're talking about cards aimed at that specific and rare user.
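For comparison, here's the headroom on those interfaces. Both PCIe 1.x/2.0 and SATA use 8b/10b line encoding, so usable bandwidth is 80% of the raw line rate (a quick sketch; per-direction figures, ignoring protocol overhead above the link layer):

```python
# Usable per-direction bandwidth for PCIe with 8b/10b encoding:
# lanes x line rate (GT/s) x 8/10 encoding efficiency, divided by 8 bits/byte.

def pcie_gbps(lanes, gt_per_s):
    return lanes * gt_per_s * (8 / 10) / 8  # GB/s per direction

print(f"PCIe 1.1 x16: {pcie_gbps(16, 2.5):.0f} GB/s")
print(f"PCIe 2.0 x16: {pcie_gbps(16, 5.0):.0f} GB/s")

# SATA line rates, same 8b/10b encoding
print(f"SATA 1.5 Gb/s: {1.5 * 0.8 / 8 * 1000:.0f} MB/s")
print(f"SATA 3.0 Gb/s: {3.0 * 0.8 / 8 * 1000:.0f} MB/s")
```

A single graphics card rarely pushed 4 GB/s over the slot, and mechanical hard drives of the day came nowhere near 150 MB/s sustained, which is the point about the new interfaces arriving before the old ones were saturated.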
The main thing is that a 320+ bit interface is costly, either in wire-trace/PCB complexity or in transistor count for the ring-bus-style arrangement that avoids the wire-trace issue. Neither situation warrants the cost for a 'mid-range' card, especially when the benefits aren't even being fully exploited yet by this segment of users.