How high can the 320MB cards go in resolution vs 640MB?

jbrownos

Distinguished
Feb 22, 2006
I'm looking at buying a 22in widescreen monitor that natively runs at 1680x1050. I'm also wanting to upgrade my current 7800GT to match. I'm trying to decide whether the 320MB or 640MB version of the 8800GTS is the best choice. I've never been one to buy a lower-end choice, but it will be stretching my budget buying both a GPU and monitor. And if there really is no foreseeable difference, then the choice seems obvious.

The VGA charts actually show the 320MB 8800GTS (barely) outperforming the 640MB, in some cases even at 1920 resolutions. So no real answer there (or maybe there is?). I'm just curious if and when the memory difference becomes an issue. I'd definitely like to know if it will be an issue at 1680x1050 (I suspect it won't), but I'm also curious how high you have to go to see a significant difference.
 
Both are good, solid cards. I always say go for the 8800GTS 640MB if you play at resolutions higher than 1600x1200 or 1680x1050 (and that includes your resolution), so I'd say get the 640MB one. There are some games, like GRAW 1, GRAW 2, and Doom 3, which need more than 512MB to run at ultra-high settings.
 

rammedstein

Distinguished
Jun 5, 2006
I'm saying the 320MB. It's not the memory size you need, it's the bandwidth, and the 320MB overclocks its memory more easily because less "drive strength" is required to switch the transistors (it has roughly half as many to drive), so there's a better chance of higher overclocks.
 

At that resolution the 8800GTS 320MB should deliver nearly equal FPS and will save you about $100. As for 1920, you would want to go for the 8800GTX, maybe even two in SLI. I see very little room for the 8800GTS 640MB: at any resolution that needs 640MB, you would also need the extra power of the GTX.
 

jbrownos

Distinguished
Feb 22, 2006
I just wanted to express my appreciation for all of the advice. I'm leaning towards the 320MB at this point, I think. As I said, I usually tend toward the higher end of any two choices, but in this case I'm just not sure I'd get $100 worth of difference. And since these are still first-gen DX10 cards, I figure I'm better off not spending too much. Then I won't feel too bad about upgrading once the new API version goes mainstream.