
New gen Nvidia and ATI card question

Last response: in Graphics & Displays
November 6, 2007 10:02:19 PM

This is just out of curiosity about the already-released 8800 GT and about what's to come from ATI with the 3800s. Why are both companies going back to a 256-bit bus when the current-generation cards are 320-, 384-, and 512-bit respectively? Does that have anything to do with DX10 architecture, problems with the cards' rendering, or no reason at all? I would figure that as time goes on and graphics get better, the market would want them to look better and better, so (and this may be due to some lack of knowledge) wouldn't going back to 256-bit be a step backwards? Just kind of wondering; if I am showing any lack of knowledge, please offer some finer points to clear things up.
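For context on what the bus width actually determines: peak memory bandwidth is the bus width in bytes multiplied by the effective memory clock. A minimal sketch in Python, using approximate launch clocks for two of the cards mentioned (the clock figures are assumptions for illustration, not authoritative specs):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective memory clock.
# Clock figures below are approximate launch specs, quoted for illustration.

def bandwidth_gbps(bus_bits: int, effective_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# 8800 GT: 256-bit bus, ~1800 MHz effective GDDR3
print(bandwidth_gbps(256, 1800))  # ~57.6 GB/s
# 8800 GTX: 384-bit bus, ~1800 MHz effective
print(bandwidth_gbps(384, 1800))  # ~86.4 GB/s
```

So width alone doesn't decide performance: the memory clock, and how efficiently the GPU uses the bandwidth, matter just as much.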
November 6, 2007 10:08:03 PM

Well, I'm not going to claim to be even halfway knowledgeable in this subject matter, but I will offer my opinion.

I think the reason the new cards are coming out as 256-bit is that there really needed to be a release of a good midrange card. If you think about it, the whole mid-range segment has been pretty much bare for about a year now.

On that note, if the bus widths were 320-bit, 512-bit, etc., then we would see an introduction of very good-performing cards (the 8800 GT performs very well even at 256-bit)... thus just making another high-end, expensive card.

Also, it may have something to do with a die shrink. It may not. I don't really know what that is, lol.

Hope I helped,
-Adam
November 7, 2007 7:37:23 PM

I'm no expert, but I'm sure there is an economy to be made in manufacturing something that's been around a while (a 256-bit bus) rather than the wider ones. Or maybe that's BS.

Just my two pence worth.
November 7, 2007 8:13:43 PM

Four fewer memory chips? It's a possibility (and it supports crusoe's argument).
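A quick sanity check on the chip-count idea: with the 32-bit-wide GDDR3 devices common on these boards, the chip count is just bus width divided by per-chip interface width. A sketch (the 32-bit chip width is an assumption; boards can also pair narrower chips):

```python
# Number of memory chips needed = bus width / per-chip interface width.
# 32-bit-wide GDDR3 chips are assumed here; actual board layouts vary.

def chips_needed(bus_bits: int, chip_bits: int = 32) -> int:
    return bus_bits // chip_bits

print(chips_needed(384))  # 12 chips on a 384-bit board (e.g. 8800 GTX)
print(chips_needed(256))  # 8 chips on a 256-bit board (e.g. 8800 GT)
print(chips_needed(384) - chips_needed(256))  # four fewer chips at 256-bit
```

Fewer chips means a cheaper bill of materials and a simpler PCB, which fits the mid-range-cost argument.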
November 7, 2007 8:29:39 PM

Current textures aren't really exceeding the 256-bit bus.

The 8800 GT (256-bit) has nearly the performance of the 8800 GTX (384-bit),

and the 8800 GT (256-bit) beats the 8800 GTS (320-bit) and the ATI 2900 XT (512-bit) handily.

So for current games and resolutions, 256-bit seems to be enough, and the 320-, 384-, and 512-bit buses seem more like a marketing ploy.

That said, I expect the next-gen high-end cards to be 512-bit.

Why? Because it sounds cooler, duh...

Just like going to PCI-E 2.0 or SATA-2 before the bandwidth of the prior generation is even close to being saturated.
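Putting rough numbers on that comparison: on paper the wider buses do deliver more peak bandwidth, which is exactly why the benchmark results above suggest bandwidth isn't the bottleneck. A sketch with approximate launch specs (the bus widths and effective clocks here are illustrative assumptions, not authoritative figures):

```python
# Peak bandwidth in GB/s = (bus bits / 8) * effective clock in GHz.
# Specs are approximate launch figures, listed for illustration only.
cards = {
    "8800 GT (256-bit)": (256, 1.80),   # ~1800 MHz effective GDDR3
    "8800 GTS (320-bit)": (320, 1.60),
    "8800 GTX (384-bit)": (384, 1.80),
    "2900 XT (512-bit)": (512, 1.66),
}

for name, (bits, ghz) in cards.items():
    print(f"{name}: {bits / 8 * ghz:.1f} GB/s")
```

The 2900 XT has by far the highest peak bandwidth in this table yet loses to the 8800 GT in games, which is the point being argued: for this generation, something other than bus width is the limiting factor.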
November 7, 2007 9:13:54 PM

jwolf24601 said:
Current textures aren't really exceeding the 256-bit bus.

The 8800 GT (256-bit) has nearly the performance of the 8800 GTX (384-bit),

and the 8800 GT (256-bit) beats the 8800 GTS (320-bit)


Well, the problem with that comparison is that the G80 has poor texture/memory management, so even a refresh of the 320-bit design would've offered improvements in that area. Also, the texture addressing units have increased, so you've improved the performance of the TUs as well as removing a defect. So it's not a good comparison.

Quote:
and the ATI 2900 XT (512-bit) handily.


Totally different architecture, with much lower texture power and much more efficient memory management. So once again, not a good comparison. The HD 3800 may offer a different look at the HD 2900's 512-bit memory interface, because that's one of the few differences between them.

Quote:
So for current games and resolutions, 256-bit seems to be enough, and the 320-, 384-, and 512-bit buses seem more like a marketing ploy.


That's fine for some situations, but look at how the GT drops off a cliff in Crysis at not even ultra-high resolution or AA (just enabling HQ AF), and it seems that bit width could easily run out in comparison.
http://www.bit-tech.net/hardware/2007/11/02/nvidia_geforce_8800_gt/8



Quote:
Just like going to PCI-E 2.0 or SATA-2 before the bandwidth of the prior generation is even close to being saturated.


However, there are situations where both are saturated, but they are specific and rare, and in this case we're talking about cards aimed at that specific and rare user.

The main thing is that the cost of a 320+ bit interface is high, either in wire-trace / PCB complexity or in transistor count for the ring-bus-style compensation that avoids the wire-trace issue. Neither situation warrants the cost for a 'mid-range' card, especially when the benefits aren't even being fully exploited yet by this segment of users.
November 7, 2007 9:30:04 PM

So is the G92 version of the 8800 GTS going to have a 320-bit interface? 384-bit? 512-bit? Just curious if anyone has seen any indicators.
November 7, 2007 9:33:25 PM

The G92-based GTS is rumoured to remain 256-bit. There's no way it's 512-bit, 384-bit is almost as unlikely, and even 320-bit, while mildly possible, is a long shot.