Solved

4-way SLI NVIDIA GTX 780 Ti

Tags:
  • Gtx
  • Nvidia
  • SLI
  • Graphics
Last response: in Graphics & Displays
November 13, 2013 9:07:51 PM

Hi guys, I'm planning to build a hardcore PC rig. I'm just wondering if one (or two) Xeon E5-2697 v2 CPUs will be enough to handle four NVIDIA GTX 780 Ti cards in SLI. Thank you in advance. Hoping to see your replies :)
P/S: budget is not a problem. So... it won't be a problem :sarcastic:


November 13, 2013 9:19:28 PM

I am a little worried about the per-core performance. You might actually be better off for gaming just running a 4960X, for example. The hex-core, clocked properly, should be able to handle things, since there is a limit on how many threads games actually run.
November 13, 2013 10:10:52 PM

rvilkman said:
I am a little worried about the per-core performance. You might actually be better off for gaming just running a 4960X, for example. The hex-core, clocked properly, should be able to handle things, since there is a limit on how many threads games actually run.


Ummm... actually I have thought about that. But it seems like the 4960X (even overclocked) will bottleneck 4-way SLI.
November 13, 2013 11:11:17 PM

That might be, but unless the game can take advantage of more than 6 threads, the Xeon will just make things worse.

Getting a previous-generation CPU might allow higher overclocks, though, which could help. So the 3960X might be the better of the two.
November 14, 2013 8:00:10 PM

Either the Xeon E5-2697 or the 4960X will handle 4-way SLI. To help you choose between the two, here is a comparison link:

http://www.cpu-world.com/Compare/33/Intel_Core_i7_Extre...

P.S. Can you post a 3DMark score of the 4-way SLI GTX 780 Ti, please?

P.P.S. You could also go overkill with two Xeons.

Best solution

November 14, 2013 9:44:32 PM

The Xeon is just not a gaming CPU. It is slower than an i7 and doesn't overclock like one. Games rarely use more than 4 cores, and when they do, 6 is about the limit anyway. All the extra cores on the Xeon will go to waste. On top of that, games are not optimized with a Xeon in mind.

Now you also must realize that how much of that power gets used depends on the resolution you plan to play at. At 1080p you'd be bottlenecked in pretty much any game with just two 780 Tis, but with a 4K screen you'd rarely bottleneck even an i5.

The last unknown is whether the 780 Ti will even work in 4-way SLI. Officially, Nvidia cards other than their dual-GPU cards (like the 590 or 690) do not support 4-way SLI; the Titan does unofficially, but many of their cards are artificially blocked from it.
November 15, 2013 12:35:29 AM

mindroid said:
Either the Xeon E5-2697 or the 4960X will handle 4-way SLI. To help you choose between the two, here is a comparison link:

http://www.cpu-world.com/Compare/33/Intel_Core_i7_Extre...

P.S. Can you post a 3DMark score of the 4-way SLI GTX 780 Ti, please?

P.P.S. You could also go overkill with two Xeons.

Umm... I'm sorry, but this is just a plan :)  Even though budget isn't a problem, I just don't want to waste money in worthless places. So I just need your advice and experience to choose whatever suits me best :)  But thanks for the help anyway :)

November 15, 2013 12:52:03 AM

bystander said:
The Xeon is just not a gaming CPU. It is slower than an i7 and doesn't overclock like one. Games rarely use more than 4 cores, and when they do, 6 is about the limit anyway. All the extra cores on the Xeon will go to waste. On top of that, games are not optimized with a Xeon in mind.

Now you also must realize that how much of that power gets used depends on the resolution you plan to play at. At 1080p you'd be bottlenecked in pretty much any game with just two 780 Tis, but with a 4K screen you'd rarely bottleneck even an i5.

The last unknown is whether the 780 Ti will even work in 4-way SLI. Officially, Nvidia cards other than their dual-GPU cards (like the 590 or 690) do not support 4-way SLI; the Titan does unofficially, but many of their cards are artificially blocked from it.

Thanks for the enlightenment. But can you explain 4-way SLI to me in more detail? My thought was that 4-way SLI would combine the 4 cards and make them act like one GIANT, POWERFUL graphics card. I have read some posts that also said 4-way SLI was not recommended, and I thought it might be the CPU bottlenecking the graphics cards. So in the end I figured that two of the most powerful CPUs on Earth would solve the problem. But since you said games aren't optimized for heavy-duty CPUs rather than gaming-oriented ones, I just don't know what will be enough to squeeze the best out of the 4 cards.
You might wonder why I'm so obsessive about this. It's because I want to play hardcore games at their maximum settings (I mean REAL maximum settings: the highest level of AA, everything else maxed out). People keep saying those settings are just for "feeling" and experiencing, but I still want to actually use them, rather than turning them on, watching for a moment, "feeling" them, and then turning them off again.
Anyway, thanks for the post :)  I really appreciate you spending your time :)
November 15, 2013 7:32:25 AM

I do not know if 4-way SLI works with the 780 Ti. It does not work with the 780, and it is unofficial with the Titan.

Now, you still haven't told us your resolution. Depending on it, the 3rd and 4th card may not help, or barely help. Look at these benchmarks: at 1200p, 3-way SLI Titans are barely better than 2-way SLI, but at 1600p the gap is a fair bit bigger. Now imagine adding a 4th. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_T...

Now about this: "My thought is that 4-way SLI will combine 4 cards & make them act like one GIANT, POWERFUL graphic card."

SLI does not make the cards act as one big card. It doesn't make any of them faster, and they do not share VRAM. Each card is tasked with delivering one frame on its own: the CPU tasks the next GPU to create the next image, then the 3rd card, and so on. They do not act as a single powerful GPU; they act as a team of GPUs working independently, each on its own frame.

In most cases a single CPU core is tasked with prepping frames for the GPUs. This work is largely serial and difficult to spread across threads. That is why a CPU that is super fast per core is better than one with 12+ cores. If that one core cannot keep up with 4 GPUs, you will not get the full benefit of those GPUs.

If the GPUs take a long time on their frames, the CPU can keep up; but if they zip through each frame, the CPU cannot, and it slows everything down. Your resolution determines whether the GPUs will be too fast for the CPU.
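This back-and-forth can be put into a toy model. A minimal sketch (the millisecond timings are made-up illustrative numbers, not measurements): one CPU core preps frames serially while each GPU renders whole frames on its own, so whichever side is slower sets the frame rate.

```python
# Toy model of SLI alternate-frame rendering: one CPU core preps frames
# serially; each of n GPUs renders a complete frame on its own.
def afr_fps(cpu_prep_ms, gpu_render_ms, n_gpus):
    cpu_rate = 1000.0 / cpu_prep_ms               # frames/s the CPU can feed
    gpu_rate = n_gpus * 1000.0 / gpu_render_ms    # frames/s the GPUs can draw
    return min(cpu_rate, gpu_rate)                # the slower side sets the FPS

# Low resolution: frames render quickly, so the CPU caps the frame rate.
print(afr_fps(cpu_prep_ms=8, gpu_render_ms=10, n_gpus=4))  # 125.0 (CPU-bound)
# High resolution: frames take much longer, so the GPUs cap it.
print(afr_fps(cpu_prep_ms=8, gpu_render_ms=50, n_gpus=4))  # 80.0 (GPU-bound)
```

Note that in the CPU-bound case, adding a 5th GPU changes nothing; only a shorter per-frame prep time (faster per-core CPU) raises the cap.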

What is your resolution?
November 15, 2013 6:24:50 PM

Nvidia tested the GTX 780 Ti, and it can handle 4-way SLI WAY better than the GTX Titan. Also, I did more research and found that the GPU will not affect CPU bottlenecking; in fact, having a better GPU relieves the CPU of rendering work. The CPU is not just handing the GPU things to process; they work together, and the GPU does most of the work. I will help you choose which CPU to get.
If you are using the rig for gaming, it is better to go with the 4960X, since no PC game can stress all 12 cores of a Xeon. Clock speed is more important at that point, since it makes the processor faster.
If you are using it for 3D rendering, go with the 12 cores of the Xeon.

Don't hate me, but I kinda used Wikipedia:
http://en.wikipedia.org/wiki/Rendering_%28computer_grap...

November 15, 2013 7:12:41 PM

mindroid said:
Nvidia tested the GTX 780 Ti, and it can handle 4-way SLI WAY better than the GTX Titan. Also, I did more research and found that the GPU will not affect CPU bottlenecking; in fact, having a better GPU relieves the CPU of rendering work. The CPU is not just handing the GPU things to process; they work together, and the GPU does most of the work. I will help you choose which CPU to get.
If you are using the rig for gaming, it is better to go with the 4960X, since no PC game can stress all 12 cores of a Xeon. Clock speed is more important at that point, since it makes the processor faster.
If you are using it for 3D rendering, go with the 12 cores of the Xeon.

Don't hate me, but I kinda used Wikipedia:
http://en.wikipedia.org/wiki/Rendering_%28computer_grap...


I don't know about the 780 Ti's ability to do 4-way SLI, so I will take your word for it, though a link would be better.

You are, however, quite wrong about the CPU not being a bottlenecking issue, though it may not be, depending on the resolution he is planning on.

While the CPU and GPU work together, they have defined tasks. The CPU uses draw calls to set up what is to be rendered, and also handles physics and AI. The GPU renders images to the screen based on the draw calls the CPU presents it. If the CPU cannot set up work as fast as the GPUs render it, the GPUs have to wait. This is known as a bottleneck.

Go through the benchmarks in the link I gave above and you'll find several games where 3 Titans do not deliver more FPS than 2 at lower resolutions, and even at higher resolutions in some cases. That is a bottleneck in action. When there is a bottleneck, more GPUs can actually hurt performance, because they add overhead on the CPU, the very thing causing the bottleneck.

Examples: (benchmark charts omitted)
There are many more in the review here: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_T...
November 15, 2013 8:03:34 PM

Thank you, bystander.
I was trying to come at it in a positive way, but yes, you are right: the CPU can potentially bottleneck the GPUs. I think that's what happened in the charts posted, since they used a Core i7-3820, which is kind of old and more of a budget chip. Current high-end CPUs will eventually bottleneck when better graphics cards come out, but for the time being they can cope with four GTX 780 Tis.
With past games, more SLI was worse, since just two cards already brought the game to max settings; and since the game makers didn't make the graphics demanding enough to need more than two GTX Titans, the third card weighed the processing down.
By the way, rexusforte, if you just want to run a single screen, four GTX 780 Tis are overkill and the charts above might apply to you; go for triple screens (7680x1600) or 4K. The i7-4960X is good, unless you still would like to use a Xeon.

November 15, 2013 8:16:10 PM

mindroid said:
Thank you, bystander.
I was trying to come at it in a positive way, but yes, you are right: the CPU can potentially bottleneck the GPUs. I think that's what happened in the charts posted, since they used a Core i7-3820, which is kind of old and more of a budget chip. Current high-end CPUs will eventually bottleneck when better graphics cards come out, but for the time being they can cope with four GTX 780 Tis.
With past games, more SLI was worse, since just two cards already brought the game to max settings; and since the game makers didn't make the graphics demanding enough to need more than two GTX Titans, the third card weighed the processing down.
By the way, rexusforte, if you just want to run a single screen, four GTX 780 Tis are overkill and the charts above might apply to you; go for triple screens (7680x1600) or 4K. The i7-4960X is good, unless you still would like to use a Xeon.



The 780 Ti is about as much faster than the Titan as the 4960X is than the 3820, and that was just 2-way vs 3-way Titan. Imagine how often 4-way will cause a bottleneck.

Of course, if he plans on a 4K resolution, it will happen less often. It all boils down to the resolution he plans to play at. Even 5760x1080 will often not benefit from the 4th 780 Ti, unless it is in 3D Vision as well.
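The resolutions being compared in this thread differ enormously in raw pixel count, which is what decides how long each frame takes the GPUs. A quick arithmetic check:

```python
# Raw pixel counts of the resolutions discussed in this thread
resolutions = {
    "1080p":              (1920, 1080),
    "1600p":              (2560, 1600),
    "3x1 1080p surround": (5760, 1080),
    "4K UHD":             (3840, 2160),
    "3x 4K ('12K')":      (11520, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name:20s} {w * h:>10,} px  ({w * h / base:.1f}x 1080p)")
```

4K is exactly 4x the pixels of 1080p, and three 4K screens are 12x, which is why the same GPU setup can be CPU-bound at one resolution and GPU-bound at another.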
November 17, 2013 2:06:17 AM

Thank you guys for your advice and your time. Perhaps I'll wait for the Haswell-E or Broadwell-E series. But by then, NVIDIA may have released some kind of monstrous graphics card. I guess what I want is not possible (at least at this moment). Again, I really appreciate your advice and your time, guys :)

P/S: Oh, I forgot to mention: I would really like to play games at 4K resolution. Even if 4K monitors are limited to 60 fps, NVIDIA's G-Sync technology has addressed that problem. Triple 4K monitors would be nice for me :)
November 17, 2013 11:33:25 PM

I'm going to hate saying this because I'm a HUGE Nvidia fan, but if you're going 12K gaming (three 4K monitors, or 11,520 x 2,160 pixels), it is better to use AMD GPUs.

The reason people use AMD GPUs instead of NVIDIA GPUs is simple: at the time of writing, the PN-K321 (the best 4K screen currently on the market) does not work with NVIDIA GPUs at 4K @ 60 Hz, due to NVIDIA's artificially crippled driver.

In a pointless attempt to protect the revenue of their very expensive professional Quadro GPUs, NVIDIA artificially cripples their Windows driver (assuming that is your OS) to prevent users from using features like Surround 2x1 and 2x2 configurations, 10-bit color, and multi-display stereoscopic 3D in OpenGL (quad-buffer stereo). All GeForce cards are capable of these features, and they are accessible in the Linux GeForce driver, but they are artificially disabled on Windows.

Due to silicon limitations, this display requires DisplayPort MST to operate at 4K @ 60 Hz. The display actually appears as two tiled 1920x2160 monitors, which is how it manages 4K @ 60 Hz.
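The tiling can be sanity-checked with trivial arithmetic: two 1920x2160 tiles side by side reconstruct the full UHD frame.

```python
# The MST display advertises itself as two side-by-side 1920x2160 tiles.
tile_w, tile_h = 1920, 2160
full_w, full_h = 2 * tile_w, tile_h   # tiles sit left/right of each other
print(full_w, full_h)                 # 3840 2160 -- standard UHD "4K"
```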

With the introduction of this monitor, NVIDIA was left with a choice: either support Surround 2x1 and 2x2 configurations properly, so anyone could play a 3D game across any 2 or 4 monitors, or write some sort of hack into the driver to support these specific monitors while still withholding 2x1 and 2x2 Surround from Windows users.

NVIDIA is the lone GPU maker without 2x1 and 2x2 monitor configuration support. AMD has supported it for ages with Eyefinity, and now even Intel supports these configurations on their integrated GPUs with their Collage feature.

You can probably guess what NVIDIA decided to do: instead of supporting Surround 2x1 properly, they hacked their drivers. They created an EDID whitelist so they could detect these kinds of monitors and support their unique 2x1 capability while still disabling general Surround 2x1 support with any other pair of monitors.

NVIDIA had a pre-production version of this display and updated their driver based on it. However, when Sharp finally shipped the monitor, they changed the EDID data from the pre-production unit. This caused the display to fail NVIDIA's EDID whitelist check and refuse to operate at 4K @ 60 Hz. NVIDIA is now in the process of adding the correct EDID data to the whitelist, and soon the monitor will finally work with NVIDIA GPUs.

NVIDIA could have avoided all this by just giving everyone proper 2x1 and 2x2 surround support.

This comment is a summary of the following massive 9-page thread, with comments directly from NVIDIA confirming the above:

https://forums.geforce.com/default/topic/539645/nvidia-...

So, there, I said it. Sorry to ruin your dreams, but it's fact and I can't fight it :/
November 18, 2013 2:21:02 AM

mindroid said:
I'm going to hate saying this because I'm a HUGE Nvidia fan, but if you're going 12K gaming (three 4K monitors, or 11,520 x 2,160 pixels), it is better to use AMD GPUs.
[...]
So, there, I said it. Sorry to ruin your dreams, but it's fact and I can't fight it :/

Thanks a lot for your knowledge :D  Well, either NVIDIA or AMD is fine with me; the fact is I just want to play games :)  I said I would really like to play games at 4K resolution, but if that's not possible, then 2K or even 1080p will still be fine with me :)
November 18, 2013 7:36:08 AM

mindroid said:
I'm going to hate saying this because I'm a HUGE Nvidia fan, but if you're going 12K gaming (three 4K monitors, or 11,520 x 2,160 pixels), it is better to use AMD GPUs.

The reason people use AMD GPUs instead of NVIDIA GPUs is simple: at the time of writing, the PN-K321 (the best 4K screen currently on the market) does not work with NVIDIA GPUs at 4K @ 60 Hz, due to NVIDIA's artificially crippled driver.


That is an old thread. Nvidia does support 4K now.
http://www.tomshardware.com/reviews/geforce-gtx-780-ti-...

Notice all the 780 Ti reviews that are done at 4K? However, if you are serious about 4K Eyefinity across 3 screens, then AMD is the only choice, as Nvidia does not support 6-screen Surround. 4K monitors are really two monitors in one, so three of them is like having six monitors.

Of course, three 4K monitors would be awful for gaming, as it is just too much area to drive with the power of current cards.
November 18, 2013 9:16:24 PM

rexusforte said:
mindroid said:
I'm going to hate saying this because I'm a HUGE Nvidia fan, but if you're going 12K gaming (three 4K monitors, or 11,520 x 2,160 pixels), it is better to use AMD GPUs.
[...]

Thanks a lot for your knowledge :D  Well, either NVIDIA or AMD is fine with me; the fact is I just want to play games :)  I said I would really like to play games at 4K resolution, but if that's not possible, then 2K or even 1080p will still be fine with me :)



Although AMD is better at 12K gaming, I believe Nvidia is better for 4K.

November 18, 2013 9:19:26 PM

bystander said:
mindroid said:
I'm going to hate saying this because I'm a HUGE Nvidia fan, but if you're going 12K gaming (three 4K monitors, or 11,520 x 2,160 pixels), it is better to use AMD GPUs.
[...]

That is an old thread. Nvidia does support 4K now.
http://www.tomshardware.com/reviews/geforce-gtx-780-ti-...

Notice all the 780 Ti reviews that are done at 4K? However, if you are serious about 4K Eyefinity across 3 screens, then AMD is the only choice, as Nvidia does not support 6-screen Surround. 4K monitors are really two monitors in one, so three of them is like having six monitors.

Of course, three 4K monitors would be awful for gaming, as it is just too much area to drive with the power of current cards.


Think again about 12K possibilities; this was posted 4 months ago...

http://blogs.windows.com/windows/b/extremewindows/archi...

November 18, 2013 9:26:41 PM

Sorry, I accidentally added another answer.
November 18, 2013 10:20:31 PM

mindroid said:
bystander said:
That is an old thread. Nvidia does support 4K now.
http://www.tomshardware.com/reviews/geforce-gtx-780-ti-...
[...]

Think again about 12K possibilities; this was posted 4 months ago...

http://blogs.windows.com/windows/b/extremewindows/archi...



I don't see how that showed anything against what I said.

It took an easy-to-max-out game at reduced settings, with 3-way CrossFire, to manage playable FPS.
November 20, 2013 1:21:26 PM

rexusforte said:
Hi guys, I'm planning to build a hardcore PC rig. I'm just wondering if one (or two) Xeon E5-2697 v2 CPUs will be enough to handle four NVIDIA GTX 780 Ti cards in SLI. Thank you in advance. Hoping to see your replies :)
P/S: budget is not a problem. So... it won't be a problem :sarcastic:

You know that 4-way SLI delivers FPS roughly equivalent to 2.5 cards? I think anything more than 2-way SLI is a waste of money!
CPU-wise, an i7-4930K wouldn't come anywhere near choking the GPU performance.
Before spending that sort of money, I'd do a hell of a lot of research, not just here.
But 4-way 780 Tis just seems like dreaming. Rich knuckleheads might spend that sort of money for bragging rights, but not serious gamers. When 4K monitors are readily available, maybe SLI 780s would be useful.
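The "4 cards roughly equal 2.5 cards" claim is a diminishing-returns curve. A rough sketch (the per-card marginal gains are made-up illustrative numbers chosen to match that claim, not measured data):

```python
# Hypothetical marginal gain contributed by each additional card in SLI;
# the first card is the 1.0x baseline, later cards add less and less.
marginal_gain = [1.00, 0.85, 0.40, 0.25]

effective = 0.0
for n, gain in enumerate(marginal_gain, start=1):
    effective += gain
    print(f"{n}-way SLI: ~{effective:.2f}x a single card")
```

With these numbers, 2-way lands at ~1.85x while 4-way only reaches ~2.5x, which is why the 3rd and 4th cards are such poor value unless the resolution is high enough to keep them busy.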
November 20, 2013 5:34:55 PM

bystander said:
mindroid said:
Think again about 12K possibilities; this was posted 4 months ago...

http://blogs.windows.com/windows/b/extremewindows/archi...

I don't see how that showed anything against what I said.

It took an easy-to-max-out game at reduced settings, with 3-way CrossFire, to manage playable FPS.


That's 3 screens of 4K in CrossFire. It IS 12K gaming (11,520 x 2,160), and you can run a game at 12K with current GPUs on ultra settings and still get 50-60 fps.
November 20, 2013 5:41:59 PM

mindroid said:
That's 3 screens of 4K in CrossFire. It IS 12K gaming (11,520 x 2,160), and you can run a game at 12K with current GPUs on ultra settings and still get 50-60 fps.


No, that isn't what they did. They played DiRT 2 on high settings, and several years ago it was a game that could easily hit over 100 FPS on ultra on a modest computer.

It is cool that they could do it, but you can't expect to play modern titles with good FPS.

December 3, 2013 11:30:33 AM

rexusforte said:
Hi guys, I'm planning to build a hardcore PC rig. I'm just wondering if one (or two) Xeon E5-2697 v2 CPUs will be enough to handle four NVIDIA GTX 780 Ti cards in SLI. Thank you in advance. Hoping to see your replies :)
P/S: budget is not a problem. So... it won't be a problem :sarcastic:


I would go with three 780 Tis, but remember you still need a powerful CPU and a motherboard that supports them. And 4-way SLI? Come on, you might as well wait for the new GTX 790 or the Titan Ultra. I can't wait to get the Titan Ultra; I hope it has something like 6-8 GB.
December 4, 2013 6:43:52 AM

If you're going to spend that much money, just wait a few more months for the new 790 or Titan Ultra. Either that or give me your money, fancy pants ;)
December 31, 2013 6:43:16 AM

mindroid said:
I'm going to hate saying this because im a HUGE Nivida fan, but if your going 12k gaming (3, 4k monitoirs or 11,520 x 2,160 pixels) it is better to use AMD GPU's.

The reason why they are using AMD GPUs instead of NVIDIA GPUs is simple, at the time of writing this post, the PN-K321 (best 4k screen in the currently in the market) does not work with NVIDIA GPUs at 4K @ 60 Hz due to NVIDIA's artificially crippled driver.

In a pointless attempt to protect the revenues of their very expensive professional Quardro GPUs, NVIDIA artifically cripples their Windows (assuming this is your OS) driver to prevent users from using features like Surround 2x1, 2x2 configurations, 10-bit color and multiple display stereoscopic 3D in OpenGL (Quad Buffer Stereo). All Geforce cards are capable of these features and they are accessible when using the Linux GeForce driver but they are artificially disabled in the Windows driver.

Due to silicon limitations, this display requires DisplayPort MST to operate at 4K @ 60 Hz. The display actually appears as two tiled 1920x2160 monitors which is why this monitor is capable of doing 4K @ 60 Hz over 2 HDMI cables.

With the introduction of this monitor, NVIDIA was left with a choice, either support Surround 2x1, 2x2 configurations properly so anyone with any pair of monitors could play a 3D game with any variety of 2 or 4 monitors or they could write some sort of hack in the driver to support these types of monitors specifically while avoiding giving Windows users 2x1 and 2x2 Surround support.

NVIDIA is the lone GPU maker without 2x1 and 2x2 monitor configuration support. AMD has supported it forever with Eyefinity and now even Intel supports these configurations with their integrated GPUs using their Collage feature.

So you can probably guess what NVIDIA decided to do, instead of supporting Surround 2x1 properly, they decided to hack their drivers. They created an EDID white-list so they could detect these kinds of monitors and support their unique 2x1 capability while still disabling general Surround 2x1 support with any pair of monitors.

NVIDIA had an pre-production version of this display and updated their driver based on that. However when Sharp finally shipped this monitor, they changed the EDID data from the pre-production display. This change caused the display to fail the NVIDIA EDID whitelist check and not allow it to operate at 4K @ 60 Hz. So now NVIDIA is in the process of adding the correct EDID data to the white-list in the driver and soon the monitor will finally work with NVIDIA GPUs.

NVIDIA could have avoided all this by just giving everyone proper 2x1 and 2x2 surround support.

This comment is a summary of the following massive 9-page thread, with comments directly from NVIDIA confirming the above:

https://forums.geforce.com/default/topic/539645/nvidia-...

So, I said it. Sorry to ruin your dreams, but it's a fact and I can't fight that :/ 


If you're going with 3x 4K monitors: A) you're rich, and B) wait, because no GPUs out now can handle that. Also, with that many pixels your ONLY option would be the Titan; you would run out of VRAM in many, many more games otherwise. I game at 4680x2560, which is more than 4K, and most games struggle to hold 60 fps on max settings with 4 Titans, overclocked to the gills. That's real talk from someone who actually HAS the hardware they're talking about.

Try running 3x that resolution and you get the picture. It's an impossibility with today's hardware. Sorry.
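The raw pixel counts back this up. A quick comparison (framebuffer size assumed at 4 bytes per pixel, RGBA8, which is only a floor — textures and render targets dwarf it in practice):

```python
# Pixel-count comparison: 1080p vs. the poster's 4680x2560 setup vs. 3x 4K.
# 4 bytes/pixel (RGBA8) is an assumed, illustrative framebuffer cost.

def megapixels(w: int, h: int) -> float:
    return w * h / 1e6

single_1080p = megapixels(1920, 1080)      # ~2.1 MP
poster_setup = megapixels(4680, 2560)      # ~12.0 MP
triple_4k    = megapixels(3 * 3840, 2160)  # ~24.9 MP

for label, mp in [("1080p", single_1080p),
                  ("4680x2560", poster_setup),
                  ("3x 4K", triple_4k)]:
    print(f"{label:>10}: {mp:5.1f} MP, ~{mp * 4:.0f} MB per framebuffer")
```

Triple 4K pushes more than twice the pixels of the setup that already struggles on four overclocked Titans, which is the whole argument in numbers.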
December 31, 2013 8:56:13 AM

Maybe he could do it with Quadros? I think the K6000 has 12 GB of RAM, but it's not powerful enough to hold 60 fps at 3x 4K.
January 13, 2014 9:48:38 AM

bystander said:
The Xeon is just not a gaming CPU. It is slower than an i7 and doesn't OC like the i7. Games rarely use more than 4 cores, and when they do, 6 is about the limit anyways. All the cores on the Xeon will go to waste. On top of that, games are not optimized with a Xeon in mind.

Now you also must realize that how much of that power you use is tied to the resolution you plan to play at. At 1080p you'd be bottlenecked in pretty much any game with just two 780 Tis, but with a 4K screen you'd rarely bottleneck even an i5.

The last unknown is whether the 780 Ti will even work in 4-way SLI. Officially, Nvidia cards other than their dual-GPU cards (like the 590 or 690) do not support 4-way SLI; the Titan does unofficially, but many of their cards are artificially blocked from 4-way SLI.


OK, now this is going to sound mad: how about 4K 3D Surround? That's what I have in mind, and that's when I'd want the 4 780 Ti cards.

Anyway, I currently run 1080p Surround on 2 780s with a 4770K, with no bottlenecks whatsoever in my experience, so even a 4930K would be a bit of overkill for 2 780s; maybe it would hit 30% load max? I don't know, but if you are rich as hell, planning to swap to the newest cards on release, and can't be bothered to swap motherboard and CPU, then maybe a 4960X. So please (to the original question-asker), consider where you're spending your hard-earned money.

Also, here is my build now:
-4770K
-16 GB Corsair Vengeance Pro
-2x 780 SLI (ASUS)
-ASUS Maximus VI Formula
-Corsair AX1200i
-Corsair 900D
-1x 1TB 840 EVO
-2x 4TB WD Black drives (no RAID)
-1x Blu-ray drive
-Corsair H100i

And some tips on custom watercooling would help a tonne, thanks.


January 13, 2014 10:34:24 AM

h20deli said:
OK, now this is going to sound mad: how about 4K 3D Surround? That's what I have in mind, and that's when I'd want the 4 780 Ti cards.


I do not believe there are any 3D monitors beyond 1080p. A couple of 1440p 120 Hz TN displays fast enough to handle 3D are coming this spring, but apparently they aren't 3D Vision capable.

If you meant 2D Surround, that may be possible now, but only if it isn't one of these monitors that handles 4K by stitching two tiles into a single frame.

January 13, 2014 1:16:16 PM

bystander said:
h20deli said:
OK, now this is going to sound mad: how about 4K 3D Surround? That's what I have in mind, and that's when I'd want the 4 780 Ti cards.


I do not believe there are any 3D monitors beyond 1080p. A couple of 1440p 120 Hz TN displays fast enough to handle 3D are coming this spring, but apparently they aren't 3D Vision capable.

If you meant 2D Surround, that may be possible now, but only if it isn't one of these monitors that handles 4K by stitching two tiles into a single frame.


Yes, that is true, but the idea of over-spending is basically to future-proof your next system so you don't have to keep buying parts along the way. So when 4K 3D displays do come out, you can probably plug and play (probably via DisplayPort); and if DP can't handle the data stream, then a triple 4K setup should start pushing the system over 60% load anyway.
May 12, 2014 9:47:33 PM

rexusforte said:
rvilkman said:
I am a little worried about the per core performance. You might actually be better off for gaming just running a 4960X for example. The hex core clocked properly should be able to handle things just because there is a bit of a limitation on how many threads games run.


Ummm... actually i have thought about that. But it seems like 4960X (even overclocked) will bottleneck 4-way SLI cards.


The processing capability of even the best CPU is dwarfed by the massive processing power of the GPU. An extra CPU will significantly increase the cost and make hardly any difference in processing ability. To optimize a system for processing or gaming, concentrate on a good motherboard with fast buses and a fast set of GPUs.

For example, my 2011 custom-built PC with extremely fast multi-channel memory clearly beats my 2014 build on a low-end motherboard. And if it's raw processing power you want, multiple GPUs, each with their own memory, beat a single GPU of approximately equivalent power; but for gaming, a single GPU with the same power as multiple GPUs in SLI is faster.

My 4-core i7 beats my 6-core Extreme. Games are not optimized for many CPU cores, so don't waste money on multiple CPUs or extra cores for the sake of gaming. For scientific apps designed for them, it's different.

For the fastest system for the buck, concentrate on the fastest single GPU, fast multi-channel memory, and a motherboard with fast PCIe buses.

May 12, 2014 11:33:59 PM

Please don't bump posts.