ThatNoob90

Honorable
Apr 18, 2012
11
0
10,510
Hi there! I've built a pretty lovely system, and before I order everything I had to ask: I'm starting with one GTX 680, and eventually I'll buy a second for SLI. Until I've got the second card, how would I go about doing a triple-screen setup? Do I need a TripleHead2Go? I'm really curious about this and I can't seem to find a definite answer. :(
 

ThatNoob90

Honorable
Apr 18, 2012
11
0
10,510
So as long as they're DisplayPort, HDMI, or DVI I can? And if the answer to that is yes, then.. Do I connect two via DVI and one via HDMI/DisplayPort or..? >.> I'm sorry lol
 
For 3xHD 5760x1080 (5900x1080 with bezel correction) you need at least 2-WAY SLI GTX 680, if not 3-WAY for very high details. Also, there are 4GB versions of the GTX 680 coming out 'soon'. While vRAM is not so much an issue on, e.g., a 30" 2560x1600 monitor, on 3xHD it is an issue for several games, especially with high details; vRAM bottleneck.
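
Just to put those resolutions in perspective, here's a rough pixel-count comparison (a quick back-of-the-envelope sketch in Python; the numbers are only the example resolutions mentioned above):

```python
# Back-of-the-envelope pixel counts for the resolutions discussed (illustration only).
setups = {
    "1920x1080 (single HD)":    1920 * 1080,
    "2560x1600 (30-inch)":      2560 * 1600,
    "5760x1080 (3xHD)":         5760 * 1080,
    "5900x1080 (3xHD + bezel)": 5900 * 1080,
}
base = setups["1920x1080 (single HD)"]
for name, pixels in setups.items():
    print(f"{name}: {pixels / 1e6:.1f} MP (~{pixels / base:.1f}x a single 1080p screen)")
```

So a 3xHD surround desktop pushes roughly three times the pixels of a single 1080p screen every frame, which is where the SLI and vRAM concerns come from.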

edit/BTW - if you're looking to do 3D Vision then you want at least 3-WAY SLI GTX 680, if not 4-WAY, for very high details. There are games we cannot play in 3D with decent details, and the vRAM bottleneck is an issue. Running in 3D you lose roughly half your frame rates.

It depends on the games you play.
 

varasha

Honorable
Apr 18, 2012
1
0
10,510
I run 3 screens no problem; you just need to use Surround in the Nvidia Control Panel. Works like a charm. I use the DVI, DVI, and HDMI ports. It basically makes your desktop one big screen. I play everything on ultra with it, no problem, works great.
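
If you want to double-check that Windows actually sees all three monitors before you enable Surround, a quick script like this works (a rough, Windows-only sketch using the standard Win32 EnumDisplayDevices call via Python's ctypes; it isn't Nvidia-specific):

```python
# List the display outputs Windows currently knows about (rough sketch, Windows only).
import ctypes
from ctypes import wintypes

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001  # output is part of the desktop

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb",           wintypes.DWORD),
        ("DeviceName",   wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags",   wintypes.DWORD),
        ("DeviceID",     wintypes.WCHAR * 128),
        ("DeviceKey",    wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32
active, i = 0, 0
while True:
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break
    attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    active += attached
    print(f"{dev.DeviceName}: {dev.DeviceString} (active: {attached})")
    i += 1

print(f"{active} output(s) currently driving the desktop")
```

If all three outputs show up here, Surround in the Nvidia Control Panel can combine them into one big desktop.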
 

At first I thought that was what he meant, but then I realized he's talking about the power needed to run all games at "very high" settings, at least I think so.
 
Yeah sure it does...try it - totally unplayable frame rates! I've been running 3D Vision since it came out. Look at others running the same resolutions and see what they think.
 

I've seen some of my games drop to 8 FPS and turn into total chop even with 3D Vision off, which is exactly why I'm waiting for the 4GB GTX 680s; the plan is 3-WAY SLI.

I must note, 3D Vision requires 120Hz monitors, but that's a separate issue.
 

4745454b

Titan
Moderator
I know he's talking about horsepower, but the OP didn't ask that. He clearly wrote:

For 3xHD 5760x1080 (5900x1080 bezel) you need at least 2-WAY SLI GTX 680

Which is wrong. A single GTX 680 is all that's needed. If the OP had asked about 3D or even maxing out newer games at that res, then I'd totally agree. But the OP didn't. He just wanted to hook it up.
 

Here's a vRAM bottleneck with 4-WAY SLI with GTX 680's - http://www.youtube.com/watch?v=S0-xcxAvu54
 


I can only assume you are referring to BF3, because I have no game that requires more than 2GB.

I haven't tested a lot of games, but from what I've tested, 3D Vision doesn't seem to require any more vRAM than not using it, or only very little more.

I'm assuming the issue you ran into could be more of a specific game issue. I've seen 2D Surround tests with BF3 on the 680 and 7970, and they don't seem to mind the reduced vRAM. It even seemed to run better on the 680 than on the 3GB 7970.

http://www.hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/10
 
Listen, I hate to debate or argue. If you have ANY resolution less than 5760x1080 you won't experience vRAM bottlenecks (that I can think of, but I don't run 2560x1600). However, the OP is asking about running 3 monitors, so it is an issue in plenty of games.

From your own link, Skyrim (0 FPS) - http://www.hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/11

Therefore, IMO wait as I am for the 4GB GTX 680(s).

//The quote shows a typo I fixed, I meant 3D Vision and caught the error//
 


It also had a min FPS of 0 with the 3GB 7970. Just because a game hits 0 FPS on the min doesn't always mean it is due to vram. You are making an assumption there.

Skyrim performed so well at 2560x1600 and 1920x1200 we expected it might also do well at 5760x1200. It certainly did that. With both video cards we found 5760x1200 to be no trouble at all. We were able to play with every in-game setting at its highest value and then found we could enable high levels of AA as well.

We found that with both video cards 8X MSAA plus FXAA turned on at the same time was playable on both video cards. This came rather as a shock to us, as we know the GTX 680 has a smaller framebuffer, and less memory bandwidth. Yet, here we were able to turn on 8X MSAA and FXAA. On top of that, the performance was also faster than the Radeon HD 7970.

The GeForce GTX 680 was 19% faster than the Radeon HD 7970 at 5760x1200 8X AA+FXAA, despite the HD 7970 having a memory capacity and bandwidth advantage.

I can only assume that 0 FPS was a zone change, given their glowing review of how smooth it was.

The reason I brought up 3D Vision was that the quote I made earlier was about 3D Vision, not 2D or 3D Surround. Anyway, 2GB may become a minor issue in the future, but from what I've seen so far it is not an issue except in an extreme example with BF3 (which didn't show up in other reviews).
 
Yeah I know what it says, and when you experience low frame rates you know the things to avoid. There's a difference between 'reading' a review and actually experiencing it first hand - repeatably and annoyingly.

I have no desire to repeat problems and I know on 5900x1080 I will run into vRAM bottlenecks with 2GB -- no doubts.

Therefore, I can tell the OP all kinds of flowery & rose colored BS or warn them ahead of time, and I choose the latter. I'd be running 3-WAY 2GB GTX 680's now if I hadn't been convinced to wait for the 4GB versions.

The OP has a choice, and the 'good thing' about SLI is you can add a 2nd, 3rd or even a 4th GPU *IF* your MOBO supports it. Personally, for this setup I'd either go for SB-E/LGA 2011 (e.g. ASUS R4E) or IB/LGA 1155 (e.g. ASUS P8Z77 WS). The day a decent EVGA/ASUS 4GB GTX 680 comes out I'm personally adding '3 to cart'; otherwise you 'might' be stuck waiting...
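
FWIW, the rough math behind my 5900x1080 worry looks something like this (a very hand-wavy sketch: render targets only, 32-bit color and depth assumed, and it completely ignores textures, which are usually what actually fills the vRAM):

```python
# Very rough render-target math for a 5900x1080 surround desktop (illustration only).
# Real usage is engine-dependent; textures and other buffers usually dominate.
width, height   = 5900, 1080
bytes_per_pixel = 4            # 32-bit color
msaa_samples    = 8            # e.g. 8X MSAA

color   = width * height * bytes_per_pixel * msaa_samples
depth   = width * height * 4 * msaa_samples      # 32-bit depth/stencil
resolve = width * height * bytes_per_pixel       # resolved back buffer

total_mb = (color + depth + resolve) / (1024 ** 2)
print(f"~{total_mb:.0f} MB of MSAA render targets alone on a 2048 MB card")
```

That's only the render targets; stack high-resolution textures on top and a 2GB card doesn't leave much headroom, which is why I'm waiting on the 4GB versions.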
 


There is a difference between two 680's in SLI vs 3 or 4. With 3 or 4, you'll be enabling every last setting and AA. I can see that hitting the limits. With every last item turned up, including AA, VRAM is used a lot more.

That is one cool thing about the HardOCP reviews. They don't just pick one set of settings and test everything; they test every setting and show you the best settings possible with the card. If they run into a VRAM issue, they'll show you which settings to disable and what to use instead.

So anyways, if you are going extreme and plan to have everything maxed in 2D Surround, then I understand your need for 4GB. If that's not required, you can do OK with 2GB, as the reviews show.
 

4745454b

Titan
Moderator
Or, as the OP said, buy one and add a second for SLI later when you get the chance, money, etc. That's the reason I didn't bother warning him about possible "eyefinity" issues. He just asked how to hook them up. He seems to understand any possible horsepower issues, given his SLI comment. I'm not sure what jaquith's issue is, other than that he seems to be a jerk, as shown in the IB sticky.

For the OP: you should have your answer by now. Congrats on your card and have fun. Sorry to everyone else for my "flowery & rose colored BS". I didn't realize that running two of the fastest available cards was such a bad idea. (And again I'd like to point out that the OP NEVER mentioned 3D; that came from someone else...)
 
Solution

ThatNoob90

Honorable
Apr 18, 2012
11
0
10,510
Thank you all for your time. I -was- just asking about connecting them together, as I was a little confused about that, but now, if I'm not mistaken, Nvidia has pretty much designed TripleHead2Go INTO their cards. ;P Thanks! ^^