Be aware that there is no way to actually measure what VRAM is being used. It's kinda like your credit rating: when you get a credit card with a $5,000 limit and owe $500 on it, then apply for a car loan, the liability the credit agency reports is the full $5k ... even though you only "use" $500 of it.
Same thing ... a game looks at what's available and says, OK, you've got 8, so give me 5 ... "just in case I need it" (a hypothetical sketch of this pattern follows the link below). Alienbabeltech first made this case when they tested 2 GB and 4 GB 770s and could find no difference in performance between the two models at 5760 x 1080 across 40+ games. Max Payne wouldn't even install at that setting ... after they installed the 4 GB card, the installer allowed 5760 x 1080 ... then they swapped the 2 GB card back in and it ran at the same fps with the same image quality. The only performance differences they found were that the 4 GB card outperformed the 2 GB card when settings were pushed so high the game was unplayable anyway (well under 30 fps). Extremetech, and others, have repeated this testing over the years with the same results. To date, the only cases where they have been able to observe a problem are when:
a) Running a high-demand game at 4K with settings maxed: they were able to observe performance differences between 4 GB and 8 GB, but **only** when settings were turned up so danged high that the game was simply unplayable either way. Does it really matter if 8 GB gets you 22 fps and 4 GB only 17 fps? This confirmed what Alienbabeltech found with the 7xx-series cards years before.
b) Working hard to create situations that lie way outside the realm of normal usage. This was common when folks tried to make an issue of the 970's 3.5 + 0.5 GB memory arrangement, but the claim was dispelled when test sites tried to duplicate the problems and couldn't without trying really hard ... and when they did, the 4 GB 980 produced the same result, which again blew the "deficient 970" claim outta the water.
http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html
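To make that "request what's available" behavior concrete, here's a purely hypothetical sketch ... the function name, the 75% fraction, and the reserve figure are all made up for illustration, not taken from any real engine ... of an engine sizing its texture pool off whatever VRAM it detects rather than what the scene needs:

```python
# Hypothetical illustration of the "give me more, just in case" pattern:
# the pool is sized from whatever VRAM is present, not from what the
# scene actually needs, so a bigger card "uses" more for the same game.
def texture_pool_budget_mb(total_vram_mb: int, os_reserve_mb: int = 512) -> int:
    """Request a fixed fraction of whatever memory is available."""
    return int((total_vram_mb - os_reserve_mb) * 0.75)

for vram_mb in (2048, 4096, 8192):
    print(f"{vram_mb} MiB card -> engine requests {texture_pool_budget_mb(vram_mb)} MiB")
```

Same game, same scene ... the 8 GB card simply asks for more because more is there.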
If you're assuming you can judge how much VRAM a particular game uses by firing up a utility like GPU-Z, you've been misinformed.
http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x
GPU-Z: An imperfect tool
GPU-Z claims to report how much VRAM the GPU actually uses, but there's a significant caveat to this metric: GPU-Z doesn't report how much VRAM the GPU is actually using; instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly, whether it's GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."
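You can watch this yourself. Here's a minimal sketch, assuming an NVIDIA card and the pynvml bindings (pip install pynvml), that prints the same "used" figure GPU-Z shows ... and per the quote above, that figure is allocated/requested memory, not what the GPU is actively touching:

```python
import pynvml  # NVIDIA Management Library bindings (pip install pynvml)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes: total / free / used

# "used" here means ALLOCATED framebuffer memory, the same number
# GPU-Z, Afterburner, etc. report, not the active working set.
print(f"total:     {info.total / 2**20:,.0f} MiB")
print(f"allocated: {info.used / 2**20:,.0f} MiB")
print(f"free:      {info.free / 2**20:,.0f} MiB")

pynvml.nvmlShutdown()
```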
In GTAV, we see that at no time does the game even **allocate** above 4 GB of VRAM, and while the Titan and 980 Ti kick butt at 1080p and 1440p, the Fury X has the higher performance at 4K despite having only 4 GB.
https://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x/4
So how can we determine how much VRAM a game is using? We can't, because there's simply no tool available that provides this information. We do know that any assumption of VRAM usage based on what GPU-Z or any other utility tells us is not "real". And we know that claims of "X GB is not enough" have been proven wrong time and time again when actually tested ... games reported to need more VRAM turned out to show no observed performance difference in fps or image quality. Alienbabeltech refuted this with the 7xx series; Extremetech and others refuted it with the 9xx series (two more below):
http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
There are games that break this mold ... notably poor console ports, which consume VRAM like free peanuts in a beer hall.
Here's GTAV with twin (*3.5* GB) 970s up against the 1070 ... same 5% difference.
Now, all that being said, I am **not** saying VRAM makes no difference... I am saying that reviewers who test with GPU-Z and then claim a certain card is no good at a certain resolution, because they tested an 8 GB card and it **used** more than 4 GB, are simply misinformed: the utility they're basing these claims on is not capable of reporting what they think it is.
But here's a good way to look at this... the 1060 3 GB and 1060 6 GB differ by more than the amount of VRAM: the 3 GB model has 1152 shaders and the 6 GB model has 1280. With more shaders, the 6 GB card should be faster than the 3 GB card regardless of memory, so how can we compare performance between 3 GB and 6 GB "all things being equal"? Let me propose the following. I don't think anyone would select either card for GTAV at 4K, and if VRAM were the bottleneck, it stands to reason that as we move from 1440p to 2160p (4K), the advantage of the 6 GB card over the 3 GB card should widen substantially. We'll rule out 1080p because I think all would agree 3 GB is enough for 1080p, and the factory OC will have an impact there.
So looking at the performance of the 3 GB MSI card in GTAV at 1440p, the 6 GB reference card delivers 72.7 fps to the 3 GB MSI card's 69.9, which gives the 6 GB card a 4% speed advantage. Now if we take this to 4K, where logic (and GPU-Z) dictates GTAV **needs** more than 3 GB of VRAM to perform, we should see this lead widen substantially. It doesn't.
The 6 GB card delivers 36.1 fps, the 3 GB card 34.6 fps ... the exact same 4% advantage. Clearly, GTAV has no issue with 3 GB of VRAM; going from 3 GB to 6 GB brings nothing to the table in this game, even at 4K. Now if we look at later releases, we start to see impacts. ROTR, for example, shows a big difference between the 3 GB and 6 GB cards. However, the 6 GB card is 32% faster than the 3 GB card at 1080p, yet only 15% faster at 1440p.
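If you want to sanity-check the scaling argument, the arithmetic is trivial (the numbers are the review figures quoted above):

```python
# 6 GB vs 3 GB 1060 in GTAV, figures quoted above. If 3 GB were the
# bottleneck at 4K, the 6 GB card's lead should widen there. It doesn't.
fps = {
    "1440p": {"6GB": 72.7, "3GB": 69.9},
    "2160p": {"6GB": 36.1, "3GB": 34.6},
}

for res, f in fps.items():
    lead_pct = (f["6GB"] / f["3GB"] - 1) * 100
    print(f"{res}: 6 GB lead = {lead_pct:.1f}%")  # ~4% at both resolutions
```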