GTX 980 CLASSIFIED or GTX 970 FTW+ GAMING ACX 2.0+ (for 1440p ultra 60fps)


11sphere92

Distinguished
Definitely the GTX 980.

The GTX 970 only has 3.5 GB of full-speed VRAM (primary), plus a slower secondary 512 MB segment, which can cause stutter at higher resolutions and with dual monitors.

Anyway, the GTX 980 will simply outperform the GTX 970 in every aspect!
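
For a rough sense of why that 3.5 GB split matters, here is a back-of-the-envelope sketch in Python. It assumes the widely reported segment figures (seven memory controllers behind the fast 3.5 GB segment, one behind the last 512 MB, roughly 28 GB/s each), and the blended-bandwidth model is deliberately pessimistic, since the driver tries to keep hot data in the fast segment:

```python
# Back-of-the-envelope look at the GTX 970's segmented VRAM.
# Figures are the widely reported ones (7 Gbps GDDR5 on 32-bit
# controllers); treat them as approximations, not measurements.

FAST_SEGMENT_GB = 3.5        # served by 7 memory controllers in parallel
PER_CONTROLLER_GBPS = 28.0   # per 32-bit controller

fast_bw = 7 * PER_CONTROLLER_GBPS   # ~196 GB/s
slow_bw = 1 * PER_CONTROLLER_GBPS   # ~28 GB/s for the last 512 MB

def worst_case_bandwidth(used_gb: float) -> float:
    """Naive blended bandwidth if accesses were spread evenly across
    everything allocated. Real drivers keep hot data in the fast
    segment, so this is a pessimistic lower bound."""
    if used_gb <= FAST_SEGMENT_GB:
        return fast_bw
    slow_share = (used_gb - FAST_SEGMENT_GB) / used_gb
    return (1 - slow_share) * fast_bw + slow_share * slow_bw

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f} GB in use -> ~{worst_case_bandwidth(gb):.0f} GB/s worst case")
```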
 
Solution
Neither. I have 970 SLI (overclocked) and run at 1440p, and in heavily GPU-dependent games like The Witcher 3 I get about 80 FPS average at ultra settings with 4x AA enabled. I've never come across even 3 GB of VRAM in use in any game, BTW, so it's a non-issue for stutters. When testing in single-card mode with a 970, even one like my two that are overclocked to within 90% of 980 performance, FPS dips down into the 40s at the minimums.

You will need 970 SLI or a single 980 Ti to do what you want. And I personally would recommend a single-card solution, since several recent game releases have shipped without official SLI support.
 
The 980 is dead in the water... since the 980 Ti came out, it has been selling poorly. The Classified is special in that it is a very high-end design, but to be frank, since the 7xx series, due to the voltage limits nVidia imposes on its partners, it isn't bringing home significantly more FPS.

What does "GTX 970 FTW+ GAMING ACX 2.0+" mean?

Are you planning to SLI a 970 FTW with a 970 ACX? Twin 970s toast the 980 by 40%. You're not going to hit that 60 fps mark in today's AAA games like The Witcher 3, Dragon Age and Tomb Raider with a single 980 of any sort.

[Benchmark charts: Tomb Raider at 2560x1600, Dragon Age: Inquisition at 2560x1440, The Witcher 3 at 2560x1440]


nVidia even lowered the throttling point of the 970 to 80C even though it has a max temp of 98C... whereas the 980 and the Ti (92C max temp) throttle at 85C. Why throttle 7C below the max on one and 18C below on the other? Because without that artificial limit the 970 gets too close in performance to the 980, and there would have been no reason to buy a 980.

And no, as has been well established by dozens of test sites, there is no issue with the 970s at 1440p.

http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

Thing is, the quantifying fact is that nobody really has massive issues; dozens and dozens of media outlets have tested the card with in-depth reviews like the ones here on my site. As for replicating the stutters and stuff you see in some of the videos, to date I have not been able to reproduce them unless you do crazy stuff, and I've been on this all weekend.

At the best settings and WQHD we tried Alien: Isolation, Alan Wake, BioShock Infinite, Hitman: Absolution, Metro: Last Light, Thief, Tomb Raider, Assassin's Creed: Black Flag...

Let me clearly state this: the GTX 970 is not an Ultra HD card. It has never been marketed as such, and we never recommended even a GTX 980 for Ultra HD gaming either. So if you start looking at that [4K] resolution and zoom in, then of course you are bound to run into performance issues, but so does the GTX 980. These cards are still too weak for such a [4K] resolution combined with proper image quality settings. Remember, Ultra HD = 4x 1080p. Let me quote myself from my GTX 970 conclusion: "it is a little beast for Full HD and WQHD gaming combined with the best image quality settings", and within that context I really think it is valid to stick to a maximum of 2560x1440, as 1080p and 1440p are the real domain for these cards. Face it, if you planned to game at Ultra HD, you would not buy a GeForce GTX 970.
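
The "Ultra HD = 4x 1080p" arithmetic is easy to verify with a quick Python check (standard display modes only, nothing assumed beyond them):

```python
# Pixel-count sanity check behind "Ultra HD = 4x 1080p".
resolutions = {
    "1080p (Full HD)":  (1920, 1080),
    "1440p (WQHD)":     (2560, 1440),
    "2160p (Ultra HD)": (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.2f}x Full HD)")
# Prints 1.00x, 1.78x and 4.00x: Ultra HD really is four times 1080p,
# while 1440p sits at under twice the Full HD pixel load.
```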

So the two titles that do pass 3.5 GB (without any tricks) are Call of Duty: Advanced Warfare and, of course, the one most reported to stutter, Middle Earth: Shadow of Mordor. We measured, played and fragged with COD, and there is just NOTHING to detect with the graphics memory fully loaded and in use.

Overall you will have a hard time pushing any card over 3.5 GB of graphics memory usage with any game unless you do some freaky stuff. The ones that do pass 3.5 GB are mostly poor console ports, or situations where you game in Ultra HD or DSR Ultra HD rendering. In that situation I cannot guarantee that your overall experience will be trouble-free; however, we have had a hard time detecting and replicating the stuttering issues some people have mentioned.
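
To illustrate why Ultra HD and DSR Ultra HD are the cases that push past 3.5 GB, here is a rough Python sketch of how render-target memory alone scales with pixel count. The six-target G-buffer and 4 bytes per pixel per target are assumptions chosen for illustration; textures dominate the real budget, so the absolute numbers are illustrative and the scaling is the point:

```python
# Rough model of render-target memory versus resolution and DSR.
# Ignores textures, compression and driver overhead entirely.

BYTES_PER_PIXEL_PER_TARGET = 4   # e.g. an RGBA8 or D24S8 surface (assumed)
NUM_TARGETS = 6                  # assumed deferred G-buffer + depth + color

def render_target_mb(w: int, h: int, dsr: float = 1.0) -> float:
    scale = dsr ** 0.5           # DSR factor multiplies pixel count (area)
    w, h = int(w * scale), int(h * scale)
    return w * h * BYTES_PER_PIXEL_PER_TARGET * NUM_TARGETS / 2**20

cases = {
    "1440p native":   (2560, 1440, 1.0),
    "4K native":      (3840, 2160, 1.0),
    "1440p + 4x DSR": (2560, 1440, 4.0),
}
for label, (w, h, dsr) in cases.items():
    print(f"{label}: ~{render_target_mb(w, h, dsr):.0f} MB of render targets")
# 4x DSR at 1440p renders internally at 5120x2880, quadrupling every
# per-pixel buffer; texture streaming scales up alongside it.
```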

So yes, if you do some "freaky stuff" you can **create** an issue, but it's hard work, and it is not something you will experience in normal gameplay.

The other thing is that peeps use GPU-Z or some other utility, read that the system has allocated X GB of VRAM, and assume that that number is "real". It's not. If you open a program in Windows, the system might set aside a certain amount of RAM that is "allocated" for the program's usage; that does not mean it ever gets used. The amount allocated depends on the amount of free system RAM available... so if, when the program is opened, there is 6 GB available, it might allocate 4 GB...

http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x

GPU-Z: An imperfect tool

GPU-Z claims to report how much VRAM the GPU actually uses, but there's a significant caveat to this metric. GPU-Z doesn't report how much VRAM the GPU is actually using; instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly, whether it's GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."
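
For anyone who wants to query that number themselves, a minimal sketch using NVIDIA's NVML bindings (the pynvml package) is below. Note that the "used" figure NVML reports is driver-level allocation on the card, so it carries the same allocated-versus-used caveat Bell describes:

```python
# Query VRAM figures via NVML (pynvml bindings). The "used" value is
# memory the driver has allocated on the card, not proof that a game
# is actively touching all of it.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"total:                      {info.total / 2**20:.0f} MiB")
    print(f"allocated (reported 'used'): {info.used / 2**20:.0f} MiB")
    print(f"free:                       {info.free / 2**20:.0f} MiB")
finally:
    pynvml.nvmlShutdown()
```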