The problem with VRAM is that the majority of folks (users and reviewers) who use GPU-Z to prove how much VRAM a game needs think that what you see in GPU-Z is what the game is using or needs... hence the misconception that we all need oodles of VRAM even at low resolutions.
1. GPU-Z does not tell you how much the game is using or how much it needs. Kinda like when a teenager asks Dad for gas money and he responds by asking "How much ya need?" The answer is always exaggerated. A better analogy is when you go for a car loan and the bank pulls a credit report: you may only owe $500 on your credit card ($5,000 limit), but the agency says you have a $5,000 liability. Here's the technical explanation:
http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x
GPU-Z claims to report how much VRAM the GPU actually uses, but there’s a significant caveat to this metric. GPU-Z doesn’t report how much VRAM the GPU is actually using; instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly, whether it’s GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”
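If you want to see the "requested vs. used" difference for yourself, here's a rough sketch of my own (not from the article), assuming an NVIDIA card and the CUDA toolkit. It reserves 1 GiB of VRAM and never touches a byte of it, yet the driver's free-memory counter, the same pool GPU-Z and Afterburner report from, drops by roughly 1 GiB anyway:

```
/* Rough illustration only -- assumes the CUDA runtime and an NVIDIA GPU.
 * cudaMemGetInfo() reports the driver's view of free/total VRAM, which is
 * the same "allocated" figure monitoring tools surface as "usage". */
#include <cuda_runtime.h>
#include <stdio.h>

int main(void) {
    size_t free_b, total_b;

    cudaMemGetInfo(&free_b, &total_b);
    printf("before malloc: %zu MiB free of %zu MiB\n", free_b >> 20, total_b >> 20);

    /* Reserve 1 GiB and never read or write it. */
    void *buf = NULL;
    cudaMalloc(&buf, (size_t)1 << 30);

    /* Free VRAM drops by ~1 GiB: it was requested, never actually used. */
    cudaMemGetInfo(&free_b, &total_b);
    printf("after malloc:  %zu MiB free of %zu MiB\n", free_b >> 20, total_b >> 20);

    cudaFree(buf);
    return 0;
}
```

Game engines do the same thing on a bigger scale: per the quote above, they request more simply because the memory is available, so a card with more VRAM will always "show" higher usage running the identical scene.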
If ya read the article, they show that no game, even at 4k, suffers from having less than 4GB of VRAM. Not that it couldn't benefit from having more, but at the resolution and settings you'd need to break the 4 GB barrier, the GPU simply isn't capable of delivering frame rates above 30 fps. If ya can't play it anyway, what does it matter that you'd benefit from more VRAM if you could?
The subject comes up again with each new generation, but no one seems to take notice:
6xx series https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
9xx series http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
Yes, they are old cards and in some cases old games, but a) some of those games are as demanding as current ones, and b) look at the resolutions they are using. AlienBabelTech tested 40 games with the 7xx series and couldn't find significant differences up to 5760 x 1080. You can search YouTube for their test but it's in Russian or something, tho you can read the charts.
There have been exceptions... AC: Unity was an extremely poor console port that sucked up VRAM like crazy. Newer games in DX12 are showing performance gains with more VRAM, but at this point is that driver immaturity with respect to the new API or an actual VRAM issue? And yes, you might be able to run out of VRAM loading the high-resolution textures that come with games designed for 2k and 4k monitors, but I don't really see the logic in using them on a 1080p screen.
Perhaps I am jaded by this issue being bandied about thru the 6xx, 7xx and 9xx series and never seeing it survive responsible testing. At this point, especially with new APIs in the picture, it's hard to draw conclusions about whether it's the game engine, poor API implementation, or something else.
The 6GB choice is better in the sense that a move to 1440p is a reasonable expectation. But if money is tight and you know you won't be escaping 1080p for the life of the build, then I'd have no concerns about 3 GB.