Do I need more than 2GB of VRAM for 1080p gaming?

Tags:
  • Gtx
  • Gaming
  • Graphics
October 11, 2013 4:50:05 AM

Hi guys,
I wanted to buy a GTX 690, but then I realized it's not 4GB, it's 2x2GB, which effectively makes it 2GB I guess.

So do I need more than 2GB for 1080p games at the highest settings? Or is that enough for the next 3-4 years of playing games at the highest settings?


October 11, 2013 4:52:04 AM

To play on high, no. But to not have problems in the future, or if you ever want to upgrade to QHD, you will want 3 or 4GB.
October 11, 2013 5:01:47 AM

Luka Prebil Grintal said:
To play on high, no. But to not have problems in the future, or if you ever want to upgrade to QHD, you will want 3 or 4GB.



When you said future, did you mean 3-4 years or 10 years?
October 11, 2013 5:43:36 AM

For the next few years, 2GB won't be a serious bottleneck for 1080p gaming. That said, games right now are already capable of using over 2GB at 1080p (I have seen it on my rig with Crysis 3), so if you want to max out settings I suggest you get something with more than 2GB.
The GTX 780 or the upcoming R9 290X are good options for you, I think, and both will perform on par with or better than that 690.
October 11, 2013 6:02:29 AM

omidelf said:
Luka Prebil Grintal said:
To play on high, no. But to not have problems in the future, or if you ever want to upgrade to QHD, you will want 3 or 4GB.



When you said future, did you mean 3-4 years or 10 years?

I meant in about 4 years' time. As texture resolutions inevitably go up and 4K displays gain in popularity, those 2GB won't be adequate anymore. But if you plan on upgrading again within that timeframe, go for a 2GB card; these days it's hard to get it to bottleneck.
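To put rough numbers on that texture-size point, here is a back-of-the-envelope sketch in Python. The per-texture figures are generic assumptions (uncompressed RGBA, full mip chain), not numbers from this thread, but they show why higher-resolution texture packs push VRAM use up fast:

    # Rough texture memory estimate (illustrative assumptions only).
    # Uncompressed RGBA8 is 4 bytes per pixel; a full mip chain adds about 1/3.

    def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
        size = width * height * bytes_per_pixel
        if mipmaps:
            size = size * 4 // 3
        return size / (1024 ** 2)

    for res in (1024, 2048, 4096):
        print(f"{res}x{res}: {texture_mib(res, res):.0f} MiB each")

    # ~5 MiB at 1024x1024, ~21 MiB at 2048x2048, ~85 MiB at 4096x4096 (before
    # compression). A scene streaming a few hundred 2K-4K textures, plus
    # framebuffers and geometry, can climb past 2 GiB quickly.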
October 11, 2013 6:11:09 AM

Nope, though BF4 is pushing the limits on my machine - 1670MB with Ultra textures and no MSAA... As long as you don't use a lot of MSAA, you'll be fine for a few years.
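If you want to check a figure like that on your own card, here is a minimal sketch, assuming the pynvml package (Python bindings for NVIDIA's NVML library) is installed:

    # Query current VRAM usage on the first NVIDIA GPU via NVML.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # values are in bytes

    print(f"Used : {info.used / 1024**2:.0f} MiB")
    print(f"Total: {info.total / 1024**2:.0f} MiB")

    pynvml.nvmlShutdown()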
October 11, 2013 6:24:37 AM

What will happen if a game needs like 2.5GB of VRAM and I only have 2GB? Lower FPS?
October 11, 2013 6:27:50 AM

You will get a stutter and then it picks back up. In essence, the frames will stop drawing for a split second or so. But don't worry, I run a 2GB setup and have no problems with 1080p.
October 11, 2013 7:16:13 AM

Then your FPS will drop, as the game engine is forced to call files directly from the HDD, which is a far slower storage medium than VRAM.
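For a very rough sense of why that hurts, here is an illustrative Python sketch; the bandwidth figures are assumed, typical 2013-era ballpark values, not measurements:

    # How long a hypothetical 500 MB overflow takes to move over each link.
    overflow_mb = 500

    bandwidth_gb_s = {
        "GDDR5 VRAM": 190.0,                  # high-end card memory bandwidth
        "PCIe 3.0 x16 (system RAM)": 16.0,
        "Mechanical HDD": 0.12,
    }

    for medium, gbps in bandwidth_gb_s.items():
        ms = overflow_mb / 1000 / gbps * 1000
        print(f"{medium:28s} ~{ms:8.1f} ms to move {overflow_mb} MB")

    # A few ms from VRAM, tens of ms over PCIe, seconds from an HDD: streaming
    # assets mid-frame from the slower links is what shows up as stutter or a
    # sustained FPS drop.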
October 11, 2013 7:19:16 AM

2GB is more than enough for one 1080p monitor. Very, very few games demand more than that. You should be more concerned about maintaining a smooth (or at least playable) frame rate than about whether your VRAM ceiling is high enough (because 2GB will be).
October 11, 2013 7:21:56 AM

^ Right now it is, but who can say in a year?
3GB cards are coming down in price; just before this I saw a 7950 going for $200, so you might as well.
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

EDIT: and the OP has the budget for a 690. Quite simply, with that budget he shouldn't be looking at anything with only 2GB of VRAM. The 690 is just irrelevant in the face of the Titan, 780 and the R9 290X.
October 11, 2013 7:26:56 AM

manofchalk said:
^ Right now it is, but who can say in a year?
3GB cards are coming down in price; just before this I saw a 7950 going for $200, so you might as well.
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

EDIT: and the OP has the budget for a 690. Quite simply, with that budget he shouldn't be looking at anything with only 2GB of VRAM. The 690 is just irrelevant in the face of the Titan, 780 and the R9 290X.

I hadn't looked at his budget, and I agree with your point that we can't tell what the demands of future games will be. If he can get 3GB, then he definitely should. The 7950 is, IMO, the best price-to-performance card as of the new GPUs' launch.

However, one can still manage with 2GB if not hoping for the best settings in every game. Depends on the user, I guess.