Which one is the wiser choice for the long term/future-proofing?
Another question I came up with is:
If I decide on the 2GB GPU, will I be able to SLI the 2GB card with a 1GB card if, for some reason, the 2GB card becomes hard to find in the future (say, the next 1-2 years)?
More memory is needed if you have a seriously large output, let's say 1920/2560x?? or two or three large displays. But a huge chunk of that memory will sit unused if the GPU chip itself can't deliver the performance needed to balance it out.
Still, the card you mentioned is a beast indeed. Personally, I would choose the 1GB variant. It should be enough for a single display or a dual-display setup. I'm not sure about adding a 3rd one, though.
Theoretically, a 1GB card would still have some memory to spare even at 1920. No geek calculations needed for this: if you use Win7, there's a tool called GPU Observer. You can see for yourself how much impact larger screens have on memory usage. And I still think 1GB is more than enough for your setup.
However, if the $30 gap is not much of a problem, going for 2GB is also a wise decision if you're aiming for a much larger video output, or at least expecting one in the future.
For more perspective on this, read Tom's review of the latest HD 7000 series, where each card in the lineup carries a significant boost in memory over the last-gen card it replaces, at the same price tag.
Look Majestic, I know you want help, but you have to take the help... Rush is correct here. If you're running on one screen, 42 inches isn't that bad; it depends on your resolution. "1920" alone isn't a full resolution, do you mean 1920x1080? If so, yeah, it should be OK. But I would go with a GTX 570, not because of a bit more VRAM but because the GPU itself is better. And if you can hold out, the mainstream Kepler cards are supposed to have 2GB native, from what I've read.
The real benefit of 2GB+ buffers for gaming purposes will come when the next generation of consoles arrives. Nowadays, devs work with 256MB buffers on consoles and design the game to run on 512MB buffers on the PC side. Obviously, some games will go over that VRAM budget, and some devs will release high-def texture packs on the PC, but that's still far from using 2GB.
There's definitely a need for more than 1GB of VRAM, even at 1080p. With high-res textures and AA, you use a ton of VRAM storing all of that plus frame buffers. To elaborate a bit: you store all in-use textures in VRAM, plus rendered frames (and probably some other stuff). Usually your GPU pre-renders a bunch ahead, so you'll have maybe 2-3 frames stored. AA takes samples from pixels, compares them to nearby pixels, and tries to blend them to reduce "jaggies". Higher levels of AA kill GPU performance, but they can also use more VRAM. In particular, supersampling actually renders the frame at a really high resolution (like 4x), samples it, blends it, then scales it back down to your output resolution. This is, of course, the most demanding (and best) form of AA.
On the other hand, you need to have a powerful GPU to even use a lot of AA without killing performance, regardless of how much VRAM you have.
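To make the memory cost of resolution and AA concrete, here's a rough back-of-envelope sketch in Python. The byte counts, frame count, and the `frame_buffer_mb` helper are my own simplifying assumptions, not how any actual driver allocates memory (real GPUs use compression and extra intermediate buffers), and textures come on top of these numbers:

```python
# Back-of-envelope render-target memory at a given resolution.
# Assumptions (simplified, hypothetical): 4 bytes/pixel color plus
# 4 bytes/pixel depth, MSAA multiplies per-pixel sample storage,
# SSAA renders at a scaled-up resolution, and ~3 frames are in flight.
# Treat the results as ballpark figures only.

def frame_buffer_mb(width, height, msaa_samples=1, ssaa_scale=1, frames=3):
    pixels = (width * ssaa_scale) * (height * ssaa_scale)
    bytes_per_pixel = (4 + 4) * msaa_samples  # color + depth, per sample
    return pixels * bytes_per_pixel * frames / (1024 ** 2)

print(round(frame_buffer_mb(1920, 1080), 1))                  # no AA
print(round(frame_buffer_mb(1920, 1080, msaa_samples=4), 1))  # 4x MSAA
print(round(frame_buffer_mb(1920, 1080, ssaa_scale=2), 1))    # 2x2 supersampling
```

Even under these crude assumptions, 4x MSAA or 2x2 supersampling at 1080p pushes buffer memory alone toward a couple hundred MB, which is why textures plus heavy AA can crowd a 1GB card.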
Personally, with OCed CrossFire 5850s, I do run into the 1GB barrier (1920x1080). Particularly with Skyrim plus HD mods and some config tweaks, the game easily maxes out my VRAM, and I need to tweak and remove mods to get it reasonable again. BF3 tends to run great at Ultra with 2x MSAA on the original maps, but the Back to Karkand maps for some reason hog more VRAM, so I run them at 0x MSAA and High texture size. Even then I sometimes see the VRAM bottleneck: it consistently runs at 90% full, and although my FPS average is above 70, the issue is that when the VRAM needs to swap textures or purge stuff, you get stuttering.
Like rockdpm said, 42" screen or 22" screen makes absolutely zero difference to the GPU. What matters is pixels, and your screen is 1920x1080.
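A quick sanity check of that point, just arithmetic, nothing GPU-specific: the workload is fixed by the resolution, so the physical panel size never enters into it. (The `pixel_count` helper is mine, purely for illustration.)

```python
# Pixel count depends only on resolution, not on physical screen size:
# a 42" and a 22" panel at 1920x1080 give the GPU the exact same workload.
def pixel_count(width, height):
    return width * height

print(pixel_count(1920, 1080))  # 1080p, identical on any panel size
print(pixel_count(2560, 1600))  # a 2560x1600 panel is ~2x the pixels
```

Only a higher resolution (more pixels) makes the GPU work harder, which is why moving to a 2560x1600 display roughly doubles the load while moving to a bigger 1080p TV changes nothing.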
Let's put it this way: if you plan on using the card for 1-2 years, buy the 1GB version, since you'll probably replace it with a new generation anyway.
If you plan on keeping it for 5 years, better get the 2GB, since games will most likely use more by then... but then again, you might have to change your whole configuration to get the full extent of the technology.
Playing Skyrim at 1920x1080 with a GTX 560 Ti 448 Core 1.2GB, I noticed the memory utilization always hovers above 1GB. It seems that a 1080p display wants more than 1GB of memory if you are playing Skyrim at Ultra.
It looks like the 2GB card wins out in this "future-proofing" race, so to speak; future technologies aside, it should cover the immediate span of the next 2-4 years.
On my current GTS 450 1GB (no OC):
- Skyrim on Ultra @ 1920 (the cutscenes will hitch once in a while, but gameplay is smooth): 35 +/- FPS
- GTA IV @ 1/3 overall max settings @ 1920: 29 +/- FPS
- Saints Row: The Third, max settings @ 1920: 42 +/- FPS
- Metro 2033, max settings @ 1920 (EVERYone here @ TH told me I would not be able to play this game on this GPU): 28+ FPS
- Lost Planet 2, max settings (minus shaders at minimum, no Vsync): 27+ FPS
Put it this way: everything I throw at my GTS 450 plays and looks beautiful. The most beautiful games I have played to date were Dirt 3, Battlefield 3, Medal of Honor 2010, and Call of Duty MW3.
I am just looking to be sure this 560 Ti is going to be worth the extra $30 for 2GB vs 1GB and will produce 60+ FPS in the games I mentioned above. And it looks as if it will, in fact, do that for me, thanks to all of your wonderful information and knowledgeable insights.