GTX 980 TI SLI 4K Gaming

GamerKing

Reputable
Sep 3, 2015
1
0
4,510
Hi Guys,
I'm planning to upgrade my current custom gaming rig in order to play in 4K. My current setup is the following:
i7-2600K, 16 GB 1600 MHz, GTX 590, 1000 W power supply

My initial idea was to buy two GTX 980 Tis, though I'm unsure whether my current CPU or RAM might end up bottlenecking the cards, as they're a bit outdated. Thanks in advance!
 
Why does it matter? If you're intent on buying the two 980 Tis in SLI, simply buy and game on them. If your CPU maxes out, then you know you'll need to upgrade it. If that happens, I'd suggest the X99 platform and an i7-5820K processor. I have that and it's great. The amount of RAM you have is great, and your wattage is good. Would you mind stating the mfg/model of your power supply? Regardless, that's my recommendation. I'm super excited for you. Second question, out of curiosity: what resolution are you planning on gaming at, and what games do you play? I have a 3440 x 1440 @ 60 Hz monitor and only one 980 Ti, but I have plans to add a second one within the next month or two.
 
1. Two 980 Tis don't cut it at 4K ... at least not if you're looking for a consistent 60 fps @ max settings. If that's what you're after, then nothing else in the build will matter.

2. If using 2 GFX cards, in gaming the 4790K (or, if you wanna jump in early, the 6700K) has an edge in fps. If you go to 3 or 4 GFX cards, then the 40-lane 5930K w/ X99 is the way to go.

3. At this point GFX card technology supports 1440p quite well ... 144 Hz monitors run just great starting with a pair of 970s, and w/ 980 Tis you can pretty much run in ULMB mode consistently. I was at my son's house last week, where they have twin 970s with an Acer Predator 144 Hz, IPS, G-Sync, ULMB. The other machine has twin 980 Tis on a curved 3440 x 1440. The 980 Tis edge the 970s in fps, but w/o ULMB the motion isn't quite as smooth. I was surprised; the Predator is the 1st IPS monitor that I've truly found impressive.

I won't think about 4K until technology arrives that can fully support it at 144 Hz. Even if there were cards that could do that, no cable exists that could deliver it to the monitor. I don't expect we'll see 4K "getting there" before Xmas 2016.
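For what it's worth, the cable complaint checks out on the back of an envelope. A rough sketch with 2015-era DisplayPort figures (blanking overhead is ignored, so the real requirement is even higher):

```python
# Back-of-the-envelope: bandwidth needed for uncompressed 4K @ 144 Hz
# versus what DisplayPort cables of the day could actually carry.
width, height, refresh, bits_per_pixel = 3840, 2160, 144, 24

required_gbps = width * height * refresh * bits_per_pixel / 1e9
dp12_effective_gbps = 17.28   # DP 1.2 payload after 8b/10b coding
dp13_effective_gbps = 25.92   # DP 1.3 (HBR3) payload after 8b/10b coding

print(f"4K @ 144 Hz needs ~{required_gbps:.1f} Gbps uncompressed")
print(f"DP 1.2 carries {dp12_effective_gbps} Gbps, DP 1.3 {dp13_effective_gbps} Gbps")
```

Even DP 1.3's effective payload falls short of uncompressed 4K @ 144 Hz before you account for blanking intervals.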
 

atheus

Distinguished
Aug 2, 2010
669
0
19,160
rcald's response is dead on. You'll be out $1300 or so for your twin 980 Ti's. If you wind up having to update your CPU/Motherboard/Memory you're looking at another $700-800. It's not nothing, but compared to the cost of the cards it's not too bad.

As for predicting whether your CPU/mobo will be a bottleneck, that's pretty hard to call. You're probably on a motherboard with PCIe 2.0, which means that your x8 slots are only equivalent to x4 slots on today's boards. From what I have seen, GTX Titan SLI was not affected by 2.0 vs. 3.0 at x8, but I don't know if that still applies to the 980 Ti, which is quite a bit faster.
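The x8-on-2.0 equals x4-on-3.0 equivalence follows from the per-lane signaling rates and encoding overheads in the PCIe specs; a quick sketch:

```python
# Per-lane usable bandwidth, per the PCIe specs:
# PCIe 2.0: 5 GT/s with 8b/10b encoding   -> 500 MB/s per lane
# PCIe 3.0: 8 GT/s with 128b/130b coding  -> ~985 MB/s per lane
pcie2_lane = 5e9 * (8 / 10) / 8 / 1e6      # MB/s
pcie3_lane = 8e9 * (128 / 130) / 8 / 1e6   # MB/s

print(f"PCIe 2.0 x8: {8 * pcie2_lane / 1000:.1f} GB/s")
print(f"PCIe 3.0 x4: {4 * pcie3_lane / 1000:.1f} GB/s")
```

Both land at roughly 4 GB/s, which is why an x8 slot on an older 2.0 board behaves like an x4 slot on a 3.0 board.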

Apologies, OP, but I have to ask rcald2000: are you having trouble keeping up 60 FPS with your single 980 Ti at 3440x1440? I find that a bit shocking.
 
The 980 Ti, overall, is certainly not "quite a bit faster" than the Titan X ... comparing reference to reference, it's about 3% slower.

[Chart: TechPowerUp relative performance at 2560 x 1440]


The non-reference, aka overclocked, cards certainly do better, but both cards can be overclocked. EVGA's Titan X, for example, is sold at a core clock of 1170 MHz ... MSI's 980 Ti is sold at a core clock of 1178 MHz.

As for maintaining 60 fps ... Witcher 3 w/ one 980 Ti gets 46.7 fps at 1440p ... about 35 at 3440 x 1440.

GTA V is 48.6 @ 1440p ... 36 at 3440 x 1440

On the curved twin 980 Ti rig described above, Witcher 3 at max settings does about 53 fps.
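Those ultrawide estimates line up with simple GPU-bound scaling by pixel count; a rough sketch (real-world scaling is rarely perfectly linear):

```python
# Naive GPU-bound estimate: fps drops roughly in proportion to the
# increase in rendered pixels when moving from 2560x1440 to 3440x1440.
pixels_1440p = 2560 * 1440
pixels_uw    = 3440 * 1440

def scale(fps_1440p):
    """Estimate ultrawide fps from a 2560x1440 benchmark result."""
    return fps_1440p * pixels_1440p / pixels_uw

print(round(scale(46.7)))  # Witcher 3: ~35 fps at 3440 x 1440
print(round(scale(48.6)))  # GTA V:    ~36 fps at 3440 x 1440
```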
 
You can still buy Titans? ... Haven't seen one in ages. When I hear Titan, I (like most, I think) usually assume we're talking about the current offering ... I wish they'd call them Titan 7, Titan 9, or whatever so that we could distinguish them more easily. After all, the 980 Ti shares the same GM200 processor with the Titan X.

If the 900 series is faster than the 700 series, then I'd think Z would be faster than X, right :) ? And where the heck did "Black" come from?


 
Whether or not two 980 Tis can maintain a stable 60 FPS at fully maxed settings in 4K may be debatable, but does it really matter if he has to tone down settings a tad? It's certainly the best option for 4K gaming in my eyes, short of four 980 Tis. I don't think he'll be dissatisfied with the performance at all.
 
Yes, that will depend on the individual ... but that assumes there is some value in going to 4K, and I just don't see it. Given the choice of a free 4K @ 60 Hz versus 1440p at 144 Hz w/ ULMB, I wouldn't even think about fps / settings. (Just to be clear, enabling ULMB drops ya to 120 Hz.)

"Hey this is 4k and I wanna be part of your life"

"Call me when ya can do 144 Hz and Display Port 1.3 cables arrive."
 


I do agree. I run 1440p with a 750 Ti, and I think 1440p is the sweet spot personally. I can't see any pixels; the PPI is plenty high.
 

atheus

Distinguished
Aug 2, 2010
669
0
19,160

Oh, you know, there's always a black model. Intel will be making a "black" something or other any day now. I think they're the last ones to join the black party.

Yeah, I wish there were a more recent comparison to see, but the only one I could find on PCIE 2.0 vs. 3.0 was from back in the day when there was but one GTX Titan.
 
@Atheus, in answer to your question of whether I have trouble maintaining 60 FPS @ 3440 x 1440 with a single 980 Ti: I don't know. For example, when I run the GTA 5 benchmark, I'll see performance between 30 and 60 FPS. Please keep in mind that I've not overclocked my 5820K nor my EVGA 980 Ti 4992 (reference cooler). The reason I haven't yet overclocked my 980 Ti is that until two weeks ago I was gaming at 1080p @ 60 Hz. The reason I haven't yet overclocked my 5820K is that I came across a peculiar problem with my memory, described below.

@Atheus, now I have a question. How do I view the CPU and GPU utilization during game play? I see people with these tools all the time during Youtube and Twitch game play but I'm not sure how they do it. If you tell me how, I can get a better idea of my performance.
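(For a quick check outside of any in-game overlay, NVIDIA's nvidia-smi tool, installed with the driver, can report GPU load from the command line. A minimal sketch; the helper name here is made up for illustration:)

```python
# Sketch: poll GPU utilization via nvidia-smi (ships with the NVIDIA
# driver). Returns None gracefully on machines without the tool.
import shutil
import subprocess

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,temperature.gpu",
    "--format=csv,noheader",
]

def gpu_utilization():
    """Return the raw CSV line, or None if nvidia-smi isn't installed."""
    if shutil.which("nvidia-smi") is None:
        return None
    return subprocess.check_output(QUERY, text=True).strip()

print(gpu_utilization())
```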

*** build ***

EVGA GTX 980 Ti | Intel 750 AIC 800 GB | EVGA 1200 P2 | i7-5820k | Corsair 64 GB 2400 DDR4

*** problem ***

I installed this G.Skill 64 GB kit in my computer and logged into the BIOS. All 8 DIMMs initially appeared at 2133 MHz, which I believe is normal. I enabled XMP mode, and that's when the problem started. Only 6 out of 8 DIMMs were registered; the other two DIMMs showed as NA. I opened tickets with both Asus and G.Skill. Asus technical support asked me the brand and model of the memory. Turns out that model # F4-2666C16Q2-64GRB is not supported in XMP mode, according to the QVL (Qualified Vendor List).

*** qualified vendor list ***

http://dlcdnet.asus.com/pub/ASUS/mb/LGA2011/SABERTOOTH-X99/SABERTOOTH_X99_DRAM_QVL_20150108.pdf

* Most people aren't aware of this issue because most people don't install 64 GB of RAM. There are only six models (not manufacturers) of memory that support 64 GB on my Asus Sabertooth X99 board. Of those, one is a model that doesn't exist on NewEgg, four are extremely expensive, and the last is reasonable and available on NewEgg. That is the memory I purchased to replace the G.Skill:

CORSAIR Vengeance LPX 64GB (8 x 8GB) 288-Pin DDR4 SDRAM DDR4 2400 (PC4-19200) Memory Kit Model CMK64GX4M8A2400C14

* It's currently listed on NewEgg for six hundred dollars (less one penny). I ordered it on 8/28/15, and 30 minutes later it sold out. This Corsair memory sells out on NewEgg all the time, so just set an auto-notify and also look at other sources.

 
@Turkey, I wasn't talking smack. I'm honestly fascinated. You should consider making a YouTube video with those specs. If you monetize, you might actually make a few extra dollars. I doubt there are many people with a 750 Ti playing at 1440p. I have a friend who posted a video playing GTA 4 with an MTA bus (New York) mod, and he makes $400 - $500 per month. Just something to think about.
 


I think you read the wrong response :p my "what on earth..." response was in reply to jerdle, not you. And you're right, I should make some vids. I always laugh when people say things like "A 750 Ti is too weak for 1080p." I know the games I play are a few years old, but realistically one can always turn down settings. Some people just hate turning down settings, though; I don't know what it is. Low settings these days are just as good as ultra was back in the day.
 


 
No offense taken on either side. I wish I still had my 970 to give to you; I returned it to Amazon when I upgraded to my current card. That's awesome that you can push that card to achieve so much.

Anyone know the name of the utility to display CPU and GPU specs in the upper left hand corner? Is that a component of shadow play?
 


1. I don't live in that universe.

[Chart: TechPowerUp relative performance at 3840 x 2160]


2. The FuryX overclocks a whopping 5.1% (108.1 / 102.9) over the reference card
http://www.techpowerup.com/reviews/AMD/R9_Fury_X/34.html

3. The 980 Ti overclocks 31% (134.8 / 102.6) over the reference card
http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/33.html

So with the stock Fury X only matching the 980 Ti at 4K once you eliminate the games it has trouble with, we have a card that overclocks 31% versus one that overclocks 5% ... the 980 Ti OC'd is 24% faster than the Fury X OC'd.

If this was a horse race, when 980 Ti crossed the finish line of the Belmont Stakes, it would be a win by 191 lengths.
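The percentages above can be reproduced directly from the cited TechPowerUp numbers; a quick sketch:

```python
# Reproducing the overclock percentages from the TechPowerUp pages cited above.
fury_x_oc  = 108.1 / 102.9 - 1   # Fury X OC over reference: ~5.1%
ti_oc      = 134.8 / 102.6 - 1   # 980 Ti OC over reference: ~31%
ti_vs_fury = 134.8 / 108.1 - 1   # OC'd 980 Ti vs OC'd Fury X

print(f"Fury X OC gain: {fury_x_oc:.1%}")
print(f"980 Ti OC gain: {ti_oc:.1%}")
print(f"OC'd 980 Ti lead over OC'd Fury X: {ti_vs_fury:.1%}")
```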

 


I don't know if it's simply a matter of games getting exponentially more graphically intense these last few years or what, but the games I play look great and I max them all out. Modern-day AAA games don't look 3x better to me than games of 3 years ago, yet they seem to be that much rougher on the GPU. It all started when the Xbox One and PS4 came out, it seems.
 

atheus

Distinguished
Aug 2, 2010
669
0
19,160

Probably AIDA 64. Regarding all the other questions, I'm afraid I'll have to hold back on answering them in this thread.