4-way SLI NVIDIA GTX 780 Ti

rexusforte

Honorable
Nov 13, 2013
7
0
10,510
Hi guys, I'm planning to build a hardcore PC rig. I'm just wondering if 1 (or 2) Xeon E5-2697 v2 CPUs will be enough to handle 4 NVIDIA GTX 780 Ti cards in SLI mode. Thank you in advance. Hoping to see your replies :)
P/s: budget is not my problem. So... it won't be my problem :sarcastic:
 

rvilkman

Distinguished
I am a little worried about the per-core performance. You might actually be better off for gaming just running a 4960X, for example. The hex-core, clocked properly, should be able to handle things, because there is a limit on how many threads games actually run.
 

rexusforte

Honorable
Nov 13, 2013
7
0
10,510


Ummm... actually I have thought about that. But it seems like the 4960X (even overclocked) will bottleneck 4-way SLI cards.
 

rvilkman

Distinguished
That might be, but unless the game is able to take advantage of more than 6 threads, the Xeon will just make things worse.

Getting a previous-generation CPU might help get higher overclocks though, which might again help. So the 3960X might be the better of the two.
 
The Xeon is just not a gaming CPU. It is slower than an i7 and doesn't OC like the i7. Games rarely use more than 4 cores, and when they do, 6 is about the limit anyway. All the extra cores on the Xeon will go to waste. On top of that, games are not optimized with a Xeon in mind.

Now you also must realize that how much of that GPU power you can actually use is tied to the resolution you plan to play at. At 1080p, you'd be bottlenecked in pretty much any game with just 2 780 Ti's, but with a 4K screen, you'd rarely bottleneck even an i5.

The last unknown is whether the 780 Ti will even work in 4-way SLI. Officially, Nvidia cards other than their dual cards (like the 590 or 690) do not support 4-way SLI. The Titan does unofficially, but many of their cards are artificially blocked from 4-way SLI.
 
Solution

rexusforte

Honorable
Nov 13, 2013
7
0
10,510

Umm... I'm sorry, but this is just a plan :) Even though budget isn't my problem, I just don't want to waste money on worthless things. So I just need your advice & experience to choose whatever suits me the most :) But thanks for the help anyway :)

 

rexusforte

Honorable
Nov 13, 2013
7
0
10,510

Thanks for the enlightenment. But can you explain the 4-way SLI part to me in more detail? My thought was that 4-way SLI would combine 4 cards & make them act like one GIANT, POWERFUL graphics card. Actually, I have read some posts & they also said that 4-way SLI was not recommended. Then I thought that it might be the CPU that was bottlenecking the graphics cards. So in the end, I just thought that 2 of the most powerful CPUs on Earth would solve this problem. But since you said that games are optimized for gaming-oriented CPUs rather than heavy-duty workstation CPUs, I just don't know what will be suitable enough to squeeze the best out of the 4 cards.
You might ask why I'm so obsessive about these things. That's because I want to play hardcore games at their maximum settings (what I mean is REAL maximum settings, things like the highest level of AA, with every other setting maxed out). People have been saying that those settings are just for "feeling" & experiencing, but I still want to actually play with them, rather than turning them on, watching for a moment, "feeling" them, and then turning them off.
Anyway, thanks for the post :) I really appreciate you spending your time :)
 
I do not know if 4-way SLI works with the 780 Ti. It does not work with the 780, and it is unofficial with the Titan. I do not know about the 780 Ti.

Now, you still haven't told us your resolution. Depending on your resolution, the 3rd and 4th card may not help, or may barely help. Look at these benchmarks: at 1200p, 3-way SLI Titans are barely better than 2-way SLI, but at 1600p the gain is a fair bit larger. Now imagine adding a 4th. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/22.html

Now about this: "My thought is that 4-way SLI will combine 4 cards & make them act like one GIANT, POWERFUL graphic card."

SLI does not make the cards act as 1 big card. It doesn't make any of them faster, and they do not share VRAM. Each card is tasked with delivering one frame on its own: the CPU tasks the next GPU with creating the next frame, then tasks the 3rd card with the frame after that, and so on and so forth. They do not act as a single powerful GPU. They act as a team of GPUs, each working independently on its own frame.
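That round-robin frame hand-off (Alternate Frame Rendering) can be sketched in a few lines of Python. This is just an illustration of the scheduling idea, not actual driver behavior:

```python
# Toy sketch of Alternate Frame Rendering (AFR), the scheduling SLI uses:
# frames are handed out round-robin, and each GPU renders its frame alone.
def assign_frames_afr(num_frames, num_gpus):
    """Return (frame, gpu) pairs under round-robin AFR."""
    return [(frame, frame % num_gpus) for frame in range(num_frames)]

schedule = assign_frames_afr(8, 4)
# frame 0 -> GPU 0, frame 1 -> GPU 1, ..., frame 4 wraps back to GPU 0
```

Note how no GPU ever sees another GPU's frame, which is also why VRAM is not shared: every card needs its own full copy of the textures.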

In most cases, a single CPU core is tasked with prepping frames for the GPUs. This work is largely serial and difficult to spread across threads, which is why a CPU with very fast individual cores is better than one with 12+ slower cores. If that one core cannot keep up with 4 GPUs, you are not going to get the full benefit from those GPUs.

If the GPUs take a long time to finish their frames, the CPU can keep up; but if they zip through each frame, the CPU cannot keep up and slows everything down. Your resolution determines whether or not the GPUs will be too fast for the CPU.
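The "whichever side is slower wins" idea can be put into a toy model. The numbers below are made up for illustration, not benchmark data:

```python
# Toy model (my own simplification): delivered frame rate is capped by the
# slower side -- the single CPU core prepping frames, or the GPUs rendering.
def delivered_fps(cpu_prep_fps, single_gpu_fps, num_gpus, scaling=0.9):
    # SLI rarely scales perfectly; 'scaling' discounts each extra GPU.
    gpu_fps = single_gpu_fps * (1 + scaling * (num_gpus - 1))
    return min(cpu_prep_fps, gpu_fps)

# At 1080p a 780 Ti finishes frames fast, so the CPU caps you early:
low_res = delivered_fps(cpu_prep_fps=120, single_gpu_fps=90, num_gpus=4)   # 120, CPU-bound
# At 4K each GPU is much slower per frame, so 4 cards still help:
high_res = delivered_fps(cpu_prep_fps=120, single_gpu_fps=25, num_gpus=4)  # 92.5, GPU-bound
```

At the lower resolution, adding a 3rd and 4th card changes nothing because the CPU was already the limit, which is exactly the pattern in the Titan SLI charts linked above.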

What is your resolution?
 

mindroid

Honorable
Nov 14, 2013
10
0
10,510
Nvidia tested the GTX 780 Ti and it can handle 4-way SLI WAYYY better than the GTX Titan. Also, I did more research and found out that the GPU will not affect CPU bottlenecking; in fact, having a better GPU relieves the CPU of rendering work. The CPU is not just handing the GPU things to process; they are working together, and the GPU is doing most of the work. I will help you choose which CPU you should get.
If you are using the rig for gaming, it is better to go with the 4960X, since no PC game can stress all 12 cores of a Xeon. GHz is more important at this point, since it makes each core faster.
If you are using it for 3D rendering, go with the 12 cores of the Xeon.

Don't hate me, but I kinda used Wikipedia:
http://en.wikipedia.org/wiki/Rendering_%28computer_graphics%29

 

I don't know about the 780ti ability to 4-way SLI, so I will take your word about it, though a link would be better.

You are, however, quite wrong about the CPU not having an issue with bottlenecking. Though it may not be an issue, depending on the resolution he is planning on.

While the CPU and GPU work together, they have defined tasks. The CPU uses draw calls to set up what is to be rendered, as well as handling physics and AI. The GPU renders images on the screen based on the draw calls the CPU presents it. If the CPU cannot set up frames as fast as the GPUs render them, it will cause the GPUs to wait. This is known as a bottleneck.

Go through the benchmarks in the link I gave above: you'll find several games where 3 Titans do not deliver more FPS than 2 at lower resolutions, and even at higher resolutions in some cases. This is an example of a bottleneck. When there is a bottleneck, more GPUs can actually hurt performance, because they add extra overhead on the CPU, the very thing causing the bottleneck.

Examples from the review's charts: Assassin's Creed 3 at 2560x1600, Arkham City at 5760x1080, Borderlands 2 at 5760x1080, and Crysis at 2560x1600.


There are many more in the review here: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/22.html
 

mindroid

Honorable
Nov 14, 2013
10
0
10,510
Thank you, bystander,
I was trying to come at it in a positive way, but yes, you are right: the GPU can potentially bottleneck the CPU. I think that's what happened in the charts posted, since it's a Core i7-3820, which is kinda old and more of a budget pick. Current high-end CPUs will EVENTUALLY bottleneck when better graphics cards come out, but for the time being, they are going to keep up with 4 GTX 780 Ti's.
With past games, more SLI is worse, since just two cards already brought the game to its max; the game makers didn't make the graphics demanding enough to need more than 2 GTX Titans, so the third card just weighed the processing down.
By the way, rexusforte, if you just want to run a single screen, 4 GTX 780 Ti's is overkill and the charts above might be the case for you; go for triple screens (7680 by 1600) or 4K. The i7 4960X is good, unless you still would like to use a Xeon.

 


The 780 Ti is about as much faster than the Titan as the 4960X is than the 3820, and those charts only cover 2-way vs. 3-way Titan. Imagine how often 4-way will cause a bottleneck.

Of course, if he plans on a 4K resolution, it will happen less often. It all boils down to the resolution he plans to play at. Even 5760x1080 will often not benefit from the 4th 780 Ti, unless that is in 3D Vision as well.
 

rexusforte

Honorable
Nov 13, 2013
7
0
10,510
Thank you guys for your advice & your time. Perhaps I'll wait for the Haswell-E or Broadwell-E series, then. But by that time, NVIDIA may have released some kind of monstrous graphics card. I guess that what I want is not possible (at least at this moment). Again, I really appreciate your advice & your time, guys :)

P/S: oh, I forgot to mention: I would really, really like to play games in 4K resolution. Even though 4K monitors are limited to 60 fps, NVIDIA's G-Sync technology has solved that problem. Triple 4K monitors would be nice for me :)
 

mindroid

Honorable
Nov 14, 2013
10
0
10,510
I'm going to hate saying this because I'm a HUGE Nvidia fan, but if you're going 12K gaming (3 4K monitors, or 11,520 x 2,160 pixels), it is better to use AMD GPUs.

The reason they are using AMD GPUs instead of NVIDIA GPUs is simple: at the time of writing this post, the PN-K321 (the best 4K screen currently on the market) does not work with NVIDIA GPUs at 4K @ 60 Hz due to NVIDIA's artificially crippled driver.

In a pointless attempt to protect the revenues of their very expensive professional Quadro GPUs, NVIDIA artificially cripples their Windows (assuming this is your OS) driver to prevent users from using features like Surround 2x1 and 2x2 configurations, 10-bit color, and multiple-display stereoscopic 3D in OpenGL (Quad Buffer Stereo). All GeForce cards are capable of these features, and they are accessible when using the Linux GeForce driver, but they are artificially disabled in the Windows driver.

Due to silicon limitations, this display requires DisplayPort MST to operate at 4K @ 60 Hz. The display actually appears as two tiled 1920x2160 monitors which is why this monitor is capable of doing 4K @ 60 Hz over 2 HDMI cables.
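The arithmetic behind that tiling is easy to check. These figures are my own, not from the thread:

```python
# Why MST tiling: 4K @ 60 Hz pushes roughly 498 million pixels per second,
# about double what the single-link modes of the day could carry, so the
# panel presents itself as two side-by-side 1920x2160 tiles.
full_4k = 3840 * 2160 * 60   # pixels per second for full 4K @ 60 Hz
tile    = 1920 * 2160 * 60   # pixels per second per MST tile
assert tile * 2 == full_4k   # two tiles together make the full image
```

Each tile is an ordinary half-width stream, which is why the OS and driver see the monitor as two displays joined at the hip, and why Surround-style multi-monitor support matters even on a single physical screen.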

With the introduction of this monitor, NVIDIA was left with a choice: either support Surround 2x1 and 2x2 configurations properly, so anyone could play a 3D game with any variety of 2 or 4 monitors, or write some sort of hack in the driver to support these specific monitors while still withholding 2x1 and 2x2 Surround support from Windows users.

NVIDIA is the lone GPU maker without 2x1 and 2x2 monitor configuration support. AMD has supported it forever with Eyefinity and now even Intel supports these configurations with their integrated GPUs using their Collage feature.

So you can probably guess what NVIDIA decided to do: instead of supporting Surround 2x1 properly, they hacked their drivers. They created an EDID white-list so they could detect these kinds of monitors and support their unique 2x1 capability, while still disabling general Surround 2x1 support for any other pair of monitors.

NVIDIA had a pre-production version of this display and updated their driver based on that. However, when Sharp finally shipped the monitor, they changed the EDID data from the pre-production version. This change caused the display to fail NVIDIA's EDID white-list check and not be allowed to operate at 4K @ 60 Hz. So now NVIDIA is in the process of adding the correct EDID data to the white-list in the driver, and soon the monitor will finally work with NVIDIA GPUs.

NVIDIA could have avoided all this by just giving everyone proper 2x1 and 2x2 surround support.

This comment is a summary of the following massive 9 page thread with comments directly from NVIDIA confirming the above:

https://forums.geforce.com/default/topic/539645/nvidia-surround/2-monitor-gaming-/1

So, I said it. Sorry to ruin your dreams, but it's fact and I can't fight that :/
 

rexusforte

Honorable
Nov 13, 2013
7
0
10,510

Thanks a lot for your knowledge :D Well, either NVIDIA or AMD is fine with me, since I just want to play games :) I said that I would really like to play games in 4K resolution, but if that's not possible, then 2K, or even 1080p, will still be fine with me :)
 


That is an old thread. Nvidia does support 4K now.
http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-4.html

Notice all the 780 Ti reviews that are done at 4K? However, if you are serious about 4K Eyefinity across 3 screens, then AMD is the only choice, as Nvidia does not support 6-screen Surround. 4K monitors are really 2 monitors in 1, so 3 of them is like having 6 monitors.

Of course, 3 4K monitors would be awful for gaming, as it is just too many pixels to drive with the power of current cards.
 

mindroid

Honorable
Nov 14, 2013
10
0
10,510



Although AMD is better at 12K gaming, I believe Nvidia is better for 4K.

 

mindroid

Honorable
Nov 14, 2013
10
0
10,510


Think again about the 12K possibilities; this was posted 4 months ago...

http://blogs.windows.com/windows/b/extremewindows/archive/2013/07/25/pushing-the-12k-pc-gaming-boundary-at-1-5-billion-pixels-per-second.aspx
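The "1.5 billion pixels per second" in that headline checks out; three 4K panels side by side give:

```python
# Three 4K (3840x2160) panels side by side = the "12K" surface in the article.
width, height, refresh = 3840 * 3, 2160, 60
pixels_per_frame = width * height             # 11,520 x 2,160 = 24,883,200
pixels_per_second = pixels_per_frame * refresh
# pixels_per_second is 1,492,992,000 -- the ~1.5 billion in the headline
```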

 


I don't see how that showed anything against what I said.

It took an easy-to-max-out game at reduced settings, with 3-way CrossFire, to manage playable FPS.
 

You know that 4-way SLI delivers FPS about equivalent to 2 1/2 cards? I think anything more than 2-way SLI is a waste of money!
CPU-wise, an i7 4930K wouldn't come anywhere near choking the GPU performance.
Before spending that sort of money, I'd do a hell of a lot of research. Not just here.
But 4-way 780 Ti's just seems like you're dreaming. Rich knuckleheads might spend that sort of money for bragging rights, but not serious gamers. When 4K monitors are readily available, maybe SLI 780s would be useful.
 

mindroid

Honorable
Nov 14, 2013
10
0
10,510


That's 3 screens of 4K; CrossFire across them IS 12K gaming (11,520 x 2,160), and you can run a game at 12K with current GPUs on ultra settings and still get 50-60 fps.
 


No, that wasn't what they did. They played Dirt 2 on high settings, and even several years ago it was a game easily able to hit over 100 FPS on ultra on a modest computer.

It is cool they could do it, but you can't expect to play modern titles with good FPS.