What effect do you guys think that the next-gen consoles will have on the hardware utilization of future games?
Specifically, the use of VRAM on the GPU (and memory bandwidth).
I plan on buying a new GPU in two months; at the moment my choice is a GTX 770 (or maybe a high-end AMD 8xxx, but I've had much better experiences with Nvidia in the past). But I'm not sure whether to get a 2GB or a 4GB model.
I'll be playing on a 1080p monitor.
Now, I know, I know, 2GB is supposed to be more than enough for a 1080p monitor; I've read that a hundred times (okay, that might be a slight exaggeration).
But don't you think that since the next-gen consoles (I'm not talking about the Wii U) will have 8GB of unified memory, game developers will become more memory-hungry (higher-quality textures, for example)?
Please feel free to correct me if I'm wrong.
I'm a graphics freak; I just want a game to look the best it possibly can.
And for those who like specs:
CPU: Intel Core i7-3960X
CPU cooler: Corsair H80
Motherboard: Asus P9X79 Pro
GPU: Asus GTX 560 Ti (1GB)
RAM: 16GB (4x4GB) Kingston HyperX @ 1600MHz
SSD: 120GB OCZ Vertex 2
HDD: 2TB WD Green (5400 rpm)
PSU: 650W Corsair TX650M
I've always had better experiences with Nvidia too. And I haven't read that hundreds of times. I've read it 80 billion times =^b But seriously, I hear it all the time :-) I know Max Payne 3 uses 2030MB when maxed out at 1080p. A bit too close for comfort for my liking. My feeling is that 4GB is more than is needed, but 2GB isn't quite enough for a bit of future-proofing. Since there's no 3GB option, I'd take the 4GB if it doesn't cost more than maybe 50 extra over the 2GB version.
My guess is that the main difference will be better multithreading and fewer poor console ports.
I'd actually speculate that AMD cards will have a significant performance advantage over Nvidia for the next few years, because everything's going to be built for them (both the Xbox One and the PS4 use AMD CPUs and GPUs).