Highest Quality GTX 980

Seraph21

Distinguished
Mar 24, 2014
51
0
18,530
Looking for the highest quality 980. I am going to be water cooling these cards in a dual SLI configuration, so I don't really care about the stock coolers all that much. I want the highest quality as far as the parts go; price doesn't matter.
 
Solution
^ Anyway, back to your actual question: since you don't rely on the stock coolers and only need the PCB, a Galaxy HOF or an EVGA Classified will do. Both have great overclocking potential, and compatible water blocks are available for them.
The Classified or the Lightning are for sure the most advanced models, built to the highest possible quality. One of the two will hold the single-card world record after they go under the knife, solder, and LN2.

MSI might be skipping a GM204 Lightning and instead waiting for GM210, since they got screwed over so badly the last time around this rodeo.
 

frag06

Honorable
Mar 17, 2013
1,353
0
11,960


Yep.

MSI reps have stated that they are not making a Lightning GM204 model (from what I've seen on OCN), so they must be saving it for GM200.
 


Yeah, honestly, if you're going to go that extreme, you really should wait for GM210 as well. Cards like the Kingpin 780 Ti were able to hit 1500 MHz under an EK block with a 240 mm radiator and a D5 pump. At those speeds, the Kingpin was effectively about as fast as reference 770s in SLI in gaming FPS (considering SLI scaling is about 92% at best).

It's always a game of chasing the best, but I personally would hate to see single-card overclocks nearly matching my SLI setup less than a year later. If money is no object, you can just sell the 980s and blocks and keep your water-cooling loop for the GM210 cards when they come out, if you choose. But it's generally not advisable to put water blocks on non-flagship chips.
 

Seraph21

Distinguished
Mar 24, 2014
51
0
18,530
Let me ask you another question since I have you guys here. Like I said, I'll be running these cards in SLI with water blocks. My question is: how well will they run a two-monitor 4K setup with graphics settings maxed out?
 

Seraph21

Distinguished
Mar 24, 2014
51
0
18,530
I don't think I have heard anything from really anyone about the Inno3D cards. And what is the biggest issue with trying to run 4K on a decent computer setup? Is it just a graphics bottleneck, or are the drivers not caught up, or what?
 

Seraph21

Distinguished
Mar 24, 2014
51
0
18,530
That, to me, is crazy. I mean, these cards are not weak rendering machines, and when you SLI them, either 2-way or 3-way, it's crazy that they can't run higher than they do. I thought it might be something on the game designers' side, or the drivers, more than the actual hardware. I know Rome: Total War 2 is a badly written game, which makes it difficult to play.
 
No, it's just that most people are on the Haswell refresh, with CPUs such as an i5-4690K or i7-4790K or AMD equivalents, and the boards that can house these can only run SLI at x16/x8, or sometimes even only x8/x8, thus not getting the best performance from the dual cards. Broadwell boards, on the other hand, can actually utilise both cards at their full potential, at x16/x16. So it's not the cards themselves, but the boards not being able to take advantage of their power.
 


What? No, of course not. But I'm just saying that boards don't tend to use all of the GPUs' power, so it holds them back at 4K when running SLI. That's why a lot of wealthy gamers just get Broadwell, for the x16/x16, to use their GPUs at their maximum potential. The VII Hero runs SLI at x16/x8 or x8/x8, I think.
 

frag06

Honorable
Mar 17, 2013
1,353
0
11,960
What? Broadwell hasn't been released yet, are you talking about Haswell-E?

And like I said before, there is virtually no difference between SLI at x8 and x16. You only need the extra PCI-E lanes that Haswell-E offers if you are going tri- or quad-SLI.
 


^ Yes, thank you for correcting me. Why would x8 be just as good as x16 in SLI? Please explain, or do you have a source?
 

frag06

Honorable
Mar 17, 2013
1,353
0
11,960


Because current GPUs aren't powerful enough to take advantage of the full bandwidth that PCI-E 3.0 or 2.0 x16 offers. Look here for a comparison.
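To put rough numbers on the bandwidth point above, here's a back-of-envelope sketch (my own numbers, not from this thread) of per-direction PCIe link bandwidth. PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, and PCIe 3.0 at 8 GT/s with 128b/130b encoding, so even a 3.0 x8 slot offers roughly the same usable bandwidth as a 2.0 x16 slot:

```python
# Approximate one-way PCIe bandwidth, GB/s (back-of-envelope figures):
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding   -> 0.5  GB/s per lane
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~0.985 GB/s per lane

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Usable one-way bandwidth in GB/s for a PCIe gen 2 or gen 3 link."""
    per_lane = {
        2: 5.0 * (8 / 10) / 8,     # encoding overhead, then bits -> bytes
        3: 8.0 * (128 / 130) / 8,
    }
    return per_lane[gen] * lanes

for gen, lanes in [(3, 16), (3, 8), (2, 16), (2, 8)]:
    print(f"PCIe {gen}.0 x{lanes}: {pcie_bandwidth_gbs(gen, lanes):.2f} GB/s")
# PCIe 3.0 x16: 15.75 GB/s
# PCIe 3.0 x8:   7.88 GB/s
# PCIe 2.0 x16:  8.00 GB/s
# PCIe 2.0 x8:   4.00 GB/s
```

Since a single GPU rarely saturates even ~8 GB/s of frame-to-frame transfer, dropping from x16 to x8 on PCIe 3.0 costs only a few percent in real games, which is the comparison frag06 is pointing at.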