How many 1080p and 1440p displays can 4GB of VRAM power?


emdea22

Distinguished
According to Shadow of Mordor, which requires 6GB for ultra textures at 1080p - none.
If you plan on playing Quake 3, then the sky is the limit - as many displays as you can hook up (up to a point, of course).

You see where I'm going with this? It depends on the game and the settings. If you're talking about most AAA games of 2013-2014, you can probably run 2x 1440p monitors at high-ultra textures, BUT that's only regarding memory size. Depending on settings you might not even get 30fps in some games.

Personally I would not run more than 1x 1440p display for gaming, but you could also run 2x 1080p displays, although I don't see why you would. For general Windows productivity you can probably drive 6x 1440p monitors as long as the card supports it - I'm not sure how many monitors it can actually use; that depends on the number of DisplayPort/HDMI outputs, drivers, etc.
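To put rough numbers on the display side of that (a back-of-envelope sketch only: 32-bit color and double buffering assumed, game assets ignored entirely), the desktop framebuffers themselves barely dent 4GB:

```python
# Back-of-envelope: VRAM used just by the display framebuffers,
# assuming 32-bit color (4 bytes/pixel) and double buffering.
# Game textures, geometry, and render targets are NOT counted here.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / 1024**2

displays = {"1080p": (1920, 1080), "1440p": (2560, 1440)}

for name, (w, h) in displays.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB per display")

# Output: 1080p ~16 MB, 1440p ~28 MB per display.
# Even six 1440p desktops stay under ~170 MB of the 4GB, which is
# why productivity use is limited by the card's output count and
# drivers rather than by memory.
```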
 

Eggz

Distinguished
You can connect up to 4 displays running up to 4K on the gtx 970: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications

It's not powerful enough to run demanding 4K games very well, but nothing is.

VRAM won't limit you on that card. Before that becomes a factor, the processor will become unable to keep up. Don't worry about it. Having 4GB will be good for anything you'll realistically want to use that card with.
 

Legolas8181

Distinguished


The most I reckon I will want to do is have a game up on a 1440p display
 

emdea22

Distinguished


You are wrong. And I've proven people wrong on this subject time and time again. Most games are ports from the PS4 and Xbone, which have 8GB of memory that can be used as VRAM. Companies with experience and a lot of money can optimize ported games to use some VRAM and some system RAM, but more and more games will require 6 to 8GB of texture memory. Take Shadow of Mordor for example - a relatively small budget - and it needs 6GB of VRAM for ultra textures at 1080p... Watch Dogs needs 4GB; CoD:AW needs 4GB; and we are only scratching the surface of the so-called "next gen" games. Expect an 8GB VRAM requirement within the next year...

There's a reason high-end GPUs with 8GB of VRAM have started showing up. Keep in mind that compressed textures can still look crisp, but color quality suffers. So these high requirements mostly come from textures that are only lightly compressed.
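For a rough sense of why texture packs are what blow the budget (illustrative numbers only - a single 4096x4096 texture and standard block-compression bit rates, not figures from any particular game):

```python
# Illustrative texture-size arithmetic, assuming a 4096x4096 texture
# and common GPU block-compression formats (BC1 = 4 bits/pixel,
# BC7 = 8 bits/pixel); a full mip chain adds roughly 33%.

PIXELS = 4096 * 4096

formats_bytes = {
    "Uncompressed RGBA8 (32 bpp)": PIXELS * 4,
    "BC7 (8 bpp)": PIXELS,
    "BC1/DXT1 (4 bpp)": PIXELS // 2,
}

for name, base in formats_bytes.items():
    with_mips = base * 4 / 3  # ~1.33x for the mip chain
    print(f"{name}: ~{with_mips / 1024**2:.0f} MB with mipmaps")

# Roughly 85 MB vs 21 MB vs 11 MB per texture. Keeping a few hundred
# of the lightly compressed versions resident at once is how an
# "ultra" texture pack climbs into the multi-GB range, while a more
# aggressively compressed "high" set fits in far less.
```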
 

Eggz

Distinguished


You'll be fine. Just avoid downloading super-duper ultra texture add-ons, and you'll have great detail with high frame rates :)



Damn, someone's touchy :ouch:

You totally don't need that much VRAM. The 780 Ti, which only has 3GB of VRAM, can run 4K games on par with the Titan and the 980, which both have more VRAM - 6GB and 4GB, respectively. Games like Battlefield 4 CAN USE more than 3GB (click for video showing BF4 using more than 5GB of VRAM), but it doesn't affect performance noticeably. Look it up, and also check out the graph below, particularly the single-card performance between the 780 Ti and Titan Black. You'll see that performance was the same even though the Titan Black has double the VRAM.



Also, you're talking about using settings that have no perceptible difference. Here is a report from PC Gamer, which threw 4 GTX Titans - each with 6GB of VRAM - at the Ultra texture pack in Shadow of Mordor. While the game was playable with 4-way SLI Titans, they couldn't even tell the difference, and they used $4,000 of GPUs to run something with no perceptible improvement in textures.

 

emdea22

Distinguished
If I see another Battlefield 4 benchmark I'm going to shave the dog I don't have! It's like saying one game sets the rules for all the others.
This "couldn't tell the difference" thing is such an old console peasant excuse that it's laughable. It's been used to justify gaming at 30fps and 720p with low textures because, hey - "you just can't tell the difference". That's just bollocks.

Times have changed, and while your argument would have been spot on a couple of years ago, it's no longer true today. VRAM requirements have spiked and will keep rising until they hit the 8GB mark.
 

Eggz

Distinguished


I guess you didn't click on the link showing several benchmarks with similar results. VRAM is not as big of a factor right now because the GPU cores aren't fast enough to push that many pixels anyway. Maybe that will change once GPUs speed up, but the mere fact that games can use a lot of VRAM doesn't mean they need it for sizeable improvements.

(Benchmark images: bioshock4k.jpg, Crysis34k.jpg, heaven4k.jpg - 4K benchmark results.)



If you can tell the difference, then by all means, go ahead and spend $3,000+ on graphics cards alone. Also, no one is talking about 30 fps at 720p - not sure where that came from. My point is that having 4GB of VRAM per card gives you adequate memory for high-end use. I'm not saying that VRAM doesn't matter at all, just that it's not as critical as you're making it out to be after a certain point. For instance, Nvidia's current flagship card (GTX 980 w/ 4GB VRAM) will give people a very good experience. To say otherwise - which you are doing - is to say that no card exists that can give a satisfactory experience because VRAM is such a limiting factor. If there's any truth to that whatsoever, you're vastly overstating it. The data fully supports that.
 