Will 1080 SLI be worth the hassle?

DomDom97

Honorable
Jan 1, 2014
82
0
10,660
Hey

I want to upgrade to a GTX 1080, but I'm not sure whether to get one card or two. I know one card is already very powerful, but I'm the kind of person who would rather have both high FPS and all the eye candy. From the few 1080 SLI benchmarks available, it seems like they scale quite well, but what's the real story with SLI? I've heard bad things about it. Is it really that bad?

Thanks for any replies and thoughts :)
 

DomDom97

I have an ultrawide 2560x1080 now, but I'd like to make that my multimedia monitor and get a 1440p 144Hz G-Sync display (when my wallet allows all that). Also a VR headset (probably the second gen when it comes out), which will be at least 2K if not more (I'm guessing).
 

Anyway, you'll want to wait until more cards come out so that prices make sense. AMD is known to force NVIDIA to lower prices, so you might as well wait; it might happen again. There are big events going on right now, so we might hear an announcement, considering there's no way they're going to let NVIDIA take 100% of the market on the enthusiast side.
 

DomDom97


I was thinking something similar. With Polaris being announced tomorrow, it should be interesting to watch the prices...
Also, I'll be going to Taiwan at the end of July, and I'll be checking prices then. It's almost guaranteed to be cheaper there than here in New Zealand. Here in NZ we get a 300 NZD bump in price just because New Zealand. -_- Anyway, that means I can't wait too long or I'll be paying another big chunk. I hope the prices drop before then.
 


It will help, but at the end of the day, SLI and CF are limited by bandwidth. You're always going to have latency problems because of communication between the cards.
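To make the bandwidth point concrete, here's a back-of-envelope sketch. The ~2 GB/s inter-card link figure is an assumption for illustration, not an official spec:

```python
# Rough estimate: can an inter-card bridge move a full frame
# within a 144 Hz frame budget? All figures are illustrative assumptions.
BYTES_PER_PIXEL = 4               # 32-bit RGBA
width, height = 2560, 1440        # a 1440p frame
frame_bytes = width * height * BYTES_PER_PIXEL   # 14,745,600 bytes

bridge_bandwidth = 2e9            # assumed ~2 GB/s card-to-card link
transfer_ms = frame_bytes / bridge_bandwidth * 1000

budget_ms = 1000 / 144            # ~6.94 ms per frame at 144 Hz
print(f"transfer: {transfer_ms:.2f} ms, budget: {budget_ms:.2f} ms")
# The transfer alone already eats the entire frame budget, which is
# why two cards can't cheaply pass full frame data back and forth.
```

Under these assumed numbers, just shipping one finished frame across the bridge takes longer than the 144 Hz frame budget, before either GPU has done any rendering.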

A single 1080 is pretty much overkill for now.
 

DomDom97


Probably true. Scary to think that we're at the point where performance is almost limited by the CPU rather than by a single GPU...
 


GPUs can still get 40% gains by adding more transistors, because what they do is almost entirely parallel. That said, there are one, maybe two, more die shrinks before that goes away as well. GPUs are nearing the end of their ability to significantly improve performance by simply adding more transistors. In three to four years, when 10nm GPUs come out, we'll basically be at EOL for performance improvements.

In short: we're nearing peak gaming performance.
 

DomDom97


It will be interesting to see what they do after that. If we can't make a single chip better, do we just add more chips? A 30 cm board full of GPUs, Voodoo style, anyone? :p
 


You run into major latency issues when introducing multiple cards like that, though; that's the main reason all multi-card configurations use Alternate Frame Rendering (AFR) rather than letting two or more cards co-process a single frame.
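The AFR idea above can be sketched in a few lines (the helper name is hypothetical, just for illustration):

```python
def assign_frames_afr(num_frames, num_gpus=2):
    """Alternate Frame Rendering: whole frame i goes to GPU i % num_gpus.

    No frame is ever split across cards, so per-frame render latency is
    unchanged; only throughput (FPS) improves, and frames must still be
    presented in order, which is where micro-stutter can creep in.
    """
    return [i % num_gpus for i in range(num_frames)]

schedule = assign_frames_afr(6)
print(schedule)  # -> [0, 1, 0, 1, 0, 1]: GPU 0 even frames, GPU 1 odd
```

This is why SLI can nearly double FPS on well-behaved titles while doing nothing for the latency of any individual frame.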

In theory, optical chips could be clocked insanely high, but the main downside is that generating the light beam costs a LOT of power, making them unusable in most environments.

DARPA is spending a TON on finding ways to shrink vacuum tubes down to usable sizes. They have a prototype vacuum-tube-based CPU running at about 400 GHz or so. If they can shrink the tech down and fix those pesky reliability issues, you could actually see CPUs revert to it, simply because you can clock them to near infinity. DARPA is also looking into using relays to replace transistors, simply for the inherent power savings.

And of course, Quantum Computing is always on the horizon, but I foresee the issue being software developers like me simply having no idea how to actually write quantum software :/