GTX 1080... is SLI useless at 1080p?

Cyber_Akuma

Distinguished
Oct 5, 2002
456
12
18,785
Normally, when building or upgrading a gaming rig, I try to get the best single card I can afford, then about a year or so later buy a second one used cheap to SLI/Crossfire it, so I can keep the system going for a few years more until the next single card can outperform that SLI setup.

Currently I have two GTX 670s in SLI and am looking to upgrade. The 900 series cards came close, but only barely beat out my SLI setup unless you were looking at Titan cards or (at the time) $800 Ti versions.

Needless to say, the 1080 obliterates it, and I am considering an upgrade.

Thing is, my monitor is 1080p, and I am not really interested in upgrading to 4K, mainly because it is a 144Hz monitor that supports Nvidia 3D Vision (yes, I actually use the 3D feature), and pushing 4K at 120 FPS would be a challenge even for two 1080 cards... not to mention I don't think any monitor even supports 120+Hz and 3D Vision at 4K... and if one does, it would cost a fortune right now.

However... I looked at benchmarks of GTX 1080 cards in SLI and the results were... disheartening.

www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/

They performed well at 4K; practically all the games they tested saw a clear FPS boost in SLI at 4K... but it's the 1080p results that worry me. The majority of the games saw practically no boost at all, or worse: a few of them actually ran SLOWER than a single card would!

Granted, this benchmark was made a while ago, which is why I am asking here. Has this situation improved at all? Was it some driver or lack-of-support issue? Or do two GTX 1080s still barely bring any performance increase at 1080p?

Admittedly, I didn't realize games needed to have an SLI profile, otherwise they run on a single card. This wasn't really an issue back when the 670s were new, since SLI profiles were being updated for pretty much all major games all the time in Nvidia's drivers, but lately they seem to be putting far less effort into the whole SLI thing and its support. (Strange; you would think Nvidia would WANT people to buy more of their hardware.)

As I said, though, I am not going to just buy two 1080s right off the bat. I am going to replace my dual 670s with a single 1080, and if a second 1080 isn't a pointless waste, add one in later and stick with that setup for a few years instead of upgrading to the 1180 when it comes out. I don't upgrade my cards every time a new one is out.
 
Solution

Cyber_Akuma

Distinguished
Oct 5, 2002
456
12
18,785
Even for a few years down the line? As I said, I am still on GTX 670 cards; I wouldn't have been able to keep them going this long on a single card.

Mostly, I wanted to know if there would be pretty much zero performance gain, or even a performance drop, if I went SLI in a few months to a year from now.
 

MyNewRig

Honorable
May 7, 2013
278
0
10,860


Having two 1080s in SLI and a 1080p monitor is not the wisest plan; a better upgrade path is:

Get only ONE 1080, or even one 1070, for your 1080p monitor > your next upgrade is a better monitor, 1440p or 4K > then you sell the 1080 and get the next best thing at that time, probably next year's Volta card or a 1080 Ti.

Why would you get two 1080s to keep your system running on them for a few years when you can upgrade your GPU every year or two by selling what you have and getting the next best thing you can afford? Both solutions would cost you the same in the end, but one is better than the other.

1080 SLI makes sense only if you have a 4K monitor NOW and want the maximum performance you can get out of it NOW. When talking about the future, it is always better to sell what you have and upgrade to what you need when you actually need it :)
 

Cyber_Akuma

Distinguished
Oct 5, 2002
456
12
18,785


4K monitors cost an arm and a leg for anything decent, and that's at 60Hz. Plus, very, very few support Nvidia 3D Vision, which I do use. And my monitor isn't that big; I don't have room for a larger one, so insanely high resolutions would be a waste.

I have no interest in getting a GTX 1080 and then upgrading to a GTX 1080 Ti just a few months later; that's a pretty big waste of money. Might as well just wait to get a Ti in that case, but the Ti is just too expensive.



No, it's actually quite a lot cheaper. Again, until the 1000 series cards came out, the two 670s I have kept me going until now, past the 700 and 900 series. The 670 cost me $400 when I got it back when those were new, and once the 700 series was out, a second 670 cost me $200 used on eBay; that's a total of $600. The GTX 770 launched at $400, so buying a 670, then a 770, would have cost me $800, and the two 670s in SLI outperform a single 770.
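If it helps, here's that same math written out (a trivial sketch; the prices are just the ones I quoted above, not current market data):

```python
# Back-of-the-envelope cost of the two upgrade paths, using the
# prices quoted above (launch and used-eBay prices, not current ones).

sli_path = 400 + 200     # GTX 670 at launch + second 670 used a year later
single_path = 400 + 400  # GTX 670 at launch + GTX 770 at launch

print(f"SLI path:           ${sli_path}")                 # $600
print(f"Single-card path:   ${single_path}")              # $800
print(f"Saved by going SLI: ${single_path - sli_path}")   # $200
```

And the SLI pair outperformed the single 770 on top of costing less.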

But the 1080 seems to offer NOTHING for SLI at 1080p (though I am not sure if that's just the case with current games, or will continue in the future), and Nvidia themselves seem to care less about SLI nowadays than they did before.
 

MyNewRig

Honorable
May 7, 2013
278
0
10,860
Well, if you have a 1080p 144Hz monitor and want to run everything on ultra with DSR or 8x MSAA, then a GTX 1080 might be useful. 4K does not cost all that much nowadays (especially when it comes to TVs), and 1440p monitors are reasonably priced. Regardless, if you think going SLI is your thing, you can always add another 1080 even one or two years down the road; nothing is stopping you from doing that.

If you can afford the 1080 and tend to keep cards that long, then I do not see why not; you can always have the best visuals you can get thanks to DSR. You could even get a 1070 and SLI it later, which would be cheaper and should also be a great setup for 1080p.

To me, though, 4K is amazing and totally worth it. Since I started gaming in 4K I could never go back to 1080p, to be honest; I just feel the picture is not sharp enough at 1080p after getting used to gaming in 4K. Just something for you to consider.
 
Why not do the same as you did before?
Get a GTX 1070.
And once the card doesn't perform well enough for you anymore, get a second one.
If that system worked for you before, why not copy it again?
A 1080 for 1080p is not really necessary; the 1070 will be slightly overkill as of now.
 

IDProG

Distinguished
+Cyber_Akuma I personally love 4K TVs (not monitors). Here are the reasons:
1) You can upscale most content and get great scaling (except 480p and 1440p, as far as I know), while 1440p upscales the most common content (1080p) poorly (see the sketch after this list).
2) It's big, which means it's great for watching content.
3) With a big screen size and resolution comes advanced multitasking.
4) If you say 4K games are hard to run, standard 1440p @ 144Hz and ultrawide 1440p @ 100Hz ones are even harder.
5) If you demand more than 60Hz, you can buy a 4K TV that supports 1080p @ 120Hz and you'll get close to 4K quality (thanks to 4K upscaling) at 120Hz.
6) A 4K TV can be set to a 21:9 aspect ratio (if you don't mind black bars, which I don't) and get a resolution a little higher than an ultrawide 1440p monitor.
The ONLY drawback I know of is the input lag, which I don't really care about because, according to www.displaylag.com, the input lag is still great.
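Here's a quick sketch of the scaling point in 1). My simplification: a source upscales cleanly when 3840x2160 is an integer multiple of it; real TV scalers are more sophisticated than this, so treat it as an illustration only:

```python
# Check which common resolutions divide evenly into 4K UHD (3840x2160).
# Integer factors map each source pixel to a clean NxN block of pixels;
# fractional factors force interpolation, which looks softer.
# (Simplified model; actual TV scaler behavior varies.)

TARGET_W, TARGET_H = 3840, 2160  # 4K UHD

sources = {
    "480p (16:9)": (854, 480),
    "720p":        (1280, 720),
    "1080p":       (1920, 1080),
    "1440p":       (2560, 1440),
}

for name, (w, h) in sources.items():
    fw, fh = TARGET_W / w, TARGET_H / h
    clean = fw.is_integer() and fh.is_integer()
    verdict = "integer (clean)" if clean else "fractional (interpolated)"
    print(f"{name}: scale {fw:.2f} x {fh:.2f} -> {verdict}")
```

1080p comes out to an exact 2x in both directions (and 720p to 3x), while 480p and 1440p land on fractional factors.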
 

Cyber_Akuma

Distinguished
Oct 5, 2002
456
12
18,785


That's what I want to do; I am asking if I even can do that anymore. According to that benchmark, there was practically no performance gain at 1080p between a single GTX 1080 and two of them in SLI; a few games even ran worse.

I wanted to know if that was some bug or bad-driver issue, or if it still actually performs like that.

And I heard rumors that Nvidia doesn't care much about supporting SLI anymore; is there any truth to that?
 
Because the 1080 has too much power for a 1080p display as it is.
If you can't finish one 750g steak, ordering two won't bring any improvement.

Therefore I suggest you go for a 1070, which, as it stands, can't really be maxed out @ 1080p.
So going SLI now doesn't make much of a difference, but this might change with games yet to be released.
Is 2x 1080 worth it? I doubt it. I'd get a 1070 and pick up another if it's necessary at some point.
 

MyNewRig

Honorable
May 7, 2013
278
0
10,860


I second that suggestion; it's the best plan for your situation.
 

Cyber_Akuma

Distinguished
Oct 5, 2002
456
12
18,785
I was considering going 1070, but since I will be stuck with it for a few years, I figured it would be best to go 1080 instead. Again, I am shooting for a minimum of 120 FPS on max settings, not 60.
 

Cyber_Akuma

Distinguished
Oct 5, 2002
456
12
18,785
Hmm... someone mentioned the Asus PG278Q monitor, and now I am not sure if I want to upgrade. It would be nice to have a monitor that supports 3D Vision 2 and G-Sync (though it can't do both at the same time), but since it's WQHD, according to that benchmark I would need two 1080s to get 120 FPS in some of the newer games at 2560x1440... a few games seem to have trouble even at 60 FPS at that resolution with a single 1080... but two GTX 1080s AND that monitor are very, very expensive.

Not sure if it's worth it considering that 4K monitors will likely drop in price in the next few years. Then again, a 4K 120 or 144Hz monitor that supports 3D Vision and G-Sync would likely still be even more expensive when prices do drop...

(Plus, my current monitor is 24 inches while that one is 27, so I am not sure if it would even fit. It would have been nice to have HDMI too so I could connect other devices to it, but eh, I am just going to be using my PC on it 99.99% of the time anyway... hmm, I am guessing trying to find a used one on eBay would be a bad idea?)
 

escanthon

Honorable
Jun 29, 2013
76
0
10,660
I'm a budget gamer, so I don't have much room to talk; feel free to ignore me. You clearly want a dual-card setup at some point, when the prices drop. Right now, a single 1080 would do what you need: 120 FPS ultra at 1080p, no problem. But there's no telling what the future holds, right? Crazy polygon counts, whole new ways of rendering, etc. Maybe three years from now a new game will come out that your 1080 can only manage 90 FPS on. Most of us would never notice that drop, but since you said 120 FPS minimum, you might.

As for scaling, I believe it'll get better. Eventually the new multi-card standards in DirectX 12 will fill in SLI's gaps, I'm sure. It's impossible to say for certain; nobody knows what's going to happen. I've never even seen a card that performs on that level, so I especially have no ground to stand on, but the emerging technologies and APIs look promising to me, especially for multi-card computers.

Just my two cents, thank you for your time.
 

Karadjgne

Titan
Ambassador
You are kind of playing hopscotch with timelines. Back when the 670 was new, there was plenty of room left for 1080p to grow, and it has grown quite a bit. So a 670 then, and a second one later, made a lot of sense for 1080p. Now, however, modern-day 1080p is equivalent to the 720p of back then, and today's 1440p is the new 1080p. 1080p is about packed; it really has no room left for expansion or growth.
Now, while you are considering a GTX 1080, that's going to do substantially better at 1080p than SLI 670s. But that's 1080p, and you just maxed it out. I doubt there will ever really be any use at 1080p for 1080 SLI, other than something like heavily modded GTA V. The best use for SLI 1080s will be 1440p/144Hz. Anything lower and you just wasted money on the second card; it's no different from running those SLI 670s on a CRT.
 


Given your posts, just get one GTX 1080 for now. A 1070 would get the job done, but because these new cards have just released, new games will start to take advantage of them and you'll eventually wish you had the faster card. Enjoy the single-card experience: games will be smoother, and the system quieter.

You can see clearly that SLI works just fine with the 1080s at 4K, which means that at 1080p you are seeing bottlenecks. It's not that 1080p can't handle two GTX 1080s; it's that your CPU can't handle the FPS two GTX 1080s can achieve (please don't take this as a challenge to go get a faster CPU; there is only so much CPUs can do today). When devs start adding higher-end IQ settings, the 1080s will be put to use at lower resolutions again.

Now to the next big issue: the desire for a minimum of 120 FPS. It can't be done in many, if not most, games. Not because GPUs aren't fast enough, since you could simply lower settings and do it, but because CPUs are usually not capable of it. Devs do not design their games to run at those frame rates in most cases; they pack in more work for the CPU and just make sure it will reach 30 or 60 FPS, depending on the developer and the game. When you want to go to 120 FPS, you are going well beyond what the devs intended, so you never know what will happen.
 

king3pj

Distinguished
Part of the reason you don't see much improvement from SLI at 1080p is that a single 1080 already averages 132 FPS in PC Gamer's benchmark suite at that resolution.

[Chart: PC Gamer benchmark suite averages at 1080p]


When you go with lower resolutions and high framerates the CPU starts to become the limiting factor in some games. If a game is already getting 130 FPS on a single 1080 the CPU might not be capable of going much higher than that. In that case a second 1080 in SLI wouldn't get you up to 175-200 FPS if the CPU is only capable of 140 FPS.

This is why you see SLI scaling improve at higher resolutions. The GPU demand increases at higher resolutions and the CPU is no longer the limiting factor. It makes sense that a second GPU would improve framerates in that situation.
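Here's a toy model of that bottleneck, using the hypothetical numbers from above (real SLI scaling is messier than a simple min(), but the shape is right):

```python
# Toy bottleneck model: the frame rate you actually see is capped by
# whichever of the CPU or GPU(s) finishes its per-frame work last.
# All numbers are illustrative, not measurements.

def effective_fps(cpu_cap, single_gpu_fps, gpus=1, sli_scaling=0.9):
    """Assume a 2nd GPU adds ~90% of a card in a well-scaled title."""
    gpu_total = single_gpu_fps * (1 + sli_scaling * (gpus - 1))
    return min(cpu_cap, gpu_total)

CPU_CAP = 140  # the hypothetical CPU limit from the example above

# 1080p: one card already sits near the CPU's ceiling
print(effective_fps(CPU_CAP, 130, gpus=1))  # 130.0
print(effective_fps(CPU_CAP, 130, gpus=2))  # 140, not ~247 -> CPU wall

# 4K: the GPU is the bottleneck, so the second card actually helps
print(effective_fps(CPU_CAP, 55, gpus=1))   # 55.0
print(effective_fps(CPU_CAP, 55, gpus=2))   # 104.5
```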

As games become more demanding over the next 5 years you will see the benefits of SLI 1080s at 1080p increase assuming the majority of games still support SLI then. The steak example from earlier was a good representation of what is happening here. There just isn't enough graphics processing demand to support 2 1080s at 1080p right now.

I personally don't think the extra $200 for a 1080 is worth it over a 1070 considering that you are only gaining 15% more performance at 1080p and you still won't get 120 FPS in every game. You would be better off buying a 1070 now and upgrading to something else in a couple years.

A single 1080 is still going to cost you about $650. Assuming you can get a used one for $200 in a couple years, which would be a steal, you are still up to $850. You could buy a 1070 now and an 1170 or 1270 in the future for about the same price and that isn't even counting what you could sell your 1070 for. Not only that but the 1170 and 1270 will outperform your SLI 1080 setup.
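Putting rough numbers on that (only the $650 and $200 come from what I wrote above; the 1070 and future-card prices are my guesses for illustration):

```python
# Rough cost of the two paths. Only the 1080 prices come from the post;
# the 1070 and hypothetical 1170 prices are guesses.

path_1080_sli = 650 + 200        # 1080 now + used 1080 in a couple of years
path_1070_then_1170 = 450 + 400  # guessed 1070 price + guessed 1170 price

print(f"1080 SLI path:  ${path_1080_sli}")        # $850
print(f"1070 then 1170: ${path_1070_then_1170}")  # $850, before selling the 1070
```

And that second path ends with a newer single card that should beat the SLI pair.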

 

Cyber_Akuma

Distinguished
Oct 5, 2002
456
12
18,785


Well, no, that's the thing. I want to know if a dual-card setup would even DO anything and whether I should even bother at 1080p, or just stick with a single card.



But wouldn't games get more demanding down the line even at lower resolutions? I mean, a card that could max out a game at 720p back when I built this PC in 2012 is probably going to struggle to max out a game at 720p today, isn't it? It's not like performance at 1080p will always remain the same just because it's 1080p... or will it? If it does, then I have no need to waste money on SLI.



Yes, I am definitely starting out (or will, once the prices stop being insanely inflated) with a single 1080 for now. If I even go SLI, it won't be for several months, likely when the next round of cards is out and people are dumping their 1080s on eBay.





Wait, the CPU? I thought in modern systems the CPU had very little overhead from, or interaction with, the processing the GPUs perform? I know the CPU obviously still has to handle game logic, AI, physics, etc., but I thought it didn't really have much overhead from the cards themselves anymore?

My CPU is a little old; it's an i7-3770K. I know a CPU can limit a game's performance, but I didn't think a CPU could limit a GPU's raw performance anymore, and figured the 3770K would still be fine for maxing out games for now. (Considering I would need a new motherboard, RAM, and OS license to upgrade my CPU, and how big of a mess replacing the entire motherboard would be, I was waiting for CPUs to improve some more before bothering to upgrade; the current ones still don't seem worth it to me.)

----------------------

One thing I would like to point out that nobody talked about, though I can't blame them because it was kind of a quick blurb in my last post:

Also, I mentioned that I am not sure if I want to upgrade my monitor to an Asus ROG Swift PG278Q... mainly because the damn thing is SO expensive, and I assume I am taking a risk if I try to get a used one cheap on eBay. But if I do go that route: the monitor's native resolution is 2560x1440. Would a single 1080 still be able to max out games at 120 FPS at that res? The benchmarks showed an SLI setup struggling to hit 120 FPS at that resolution in one of the games, and with how ridiculously expensive that monitor and two 1080s would be, I wouldn't want to just BARELY manage 120 FPS, only to fall short a few months later when games become more demanding.

Would that monitor look worse if I ran it at 1080p, since that's not its native resolution? It would be nice to have a monitor with more even backlighting than my current one that supports not only 3D Vision 2 but G-Sync as well; my current one only has 3D Vision 2 support, and V-Sync is MURDER on performance if the framerate ever dips.
 


If CPUs couldn't bottleneck performance, this chart would not show clear improvements as the CPU gets faster:
[Chart: GTA V frame rates scaling with CPU speed]


The CPU not only handles things like AI and game logic; it also instructs the GPU on what to do via draw calls. CPUs do limit GPUs quite often when going for high FPS. If a minimum of 144 FPS is your goal, the CPU is going to be the reason you fail to achieve it.
 

Karadjgne

Titan
Ambassador
Because of pixel size, density, resolution, etc., there's only so much a GPU can throw up on a screen. You won't get the same high detail, straight lines, shadow shading, etc. from 720p as you would from 1080p. So you could use the most devastatingly GPU-intensive game possible, and it'll still look like Minecraft on a 720p screen (an overboard example).

Think of it like this. Imagine a square block with just 9 pixels, where each pixel can hold only one solid color. To draw a diagonal line you'd have to use either just 3 pixels at the corners or 6 pixels; it can't step smoothly, because there aren't enough pixels. So the picture looks blocky and, at a distance, muddy. That's 720p. Make it 81 pixels and you get a very straight line that steps between pixels easily. That's 1080p. The side effect is that the GPU now has to fill a much larger number of pixels, so it works harder.

In the past, games were simpler and less detailed, but 1080p had room to accommodate more than what a game/GPU could put on the screen, so the SLI 670s were what it took to max out that resolution for those games. Now games are considerably more detailed, CPUs stronger, GPUs faster, and those massive amounts of detail exceed what 1080p can show. The difference between very high and ultra is next to nothing to look at, but a lot of work for the GPU. In many games it's sometimes impossible to tell ultra from very high, not because the GPU can't put out the detail, but because the resolution is already so packed that the monitor itself can't reproduce it. This is why you move up in resolution/frequency: to get a panel with enough pixels and speed to let the GPU actually show all the detail at a finer grain while maintaining clarity.
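To put plain numbers on the pixel-count part (just arithmetic; shading cost isn't perfectly linear with pixel count, but it's a fair first approximation):

```python
# Raw pixels per frame: a rough proxy for how much fill work the GPU
# does at each resolution. (Real cost also depends on shaders, etc.)

resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # use 1080p as the reference point
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>6}: {px:>9,} pixels ({px / base:.2f}x the 1080p workload)")
```

1440p is about 1.78x the pixels of 1080p and 4K is exactly 4x, which is why the same pair of cards that sits half-idle at 1080p has plenty to chew on at higher resolutions.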
 

king3pj

Distinguished


To add to this, even with a brand new i7-6700K overclocked and a GTX 1080, some games still aren't going to reach 144 FPS maxed out. The target for most developers (and most gamers) is 60 FPS. Just about anyone would tell you that your current i7-3770K is still fine for maxing out games; they just aren't thinking about 144 FPS when they make that claim.

Here are some benchmarks from the TechPowerUp Asus Strix 1080 review, paired with an i7-6700K @ 4.5GHz. I'll cherry-pick some of the most popular examples that don't reach 144 FPS at 1080p. Keep in mind that if you look at the full review, most games don't get there; only 6 of the 16 games in their benchmark average 144 FPS at 1080p. I just didn't want to bring over every one of their charts.

144Hz with max settings just isn't a realistic goal for current hardware in some games.

https://www.techpowerup.com/reviews/ASUS/GTX_1080_STRIX/

[Chart: The Witcher 3, 1920x1080]

[Chart: Hitman, 1920x1080]

[Chart: Crysis 3, 1920x1080]

 


The reason 720p was used is so you can test the CPU. Higher resolutions rarely increase the demand on the CPU, while lowering the resolution takes the demand off the GPU. This way you can see just how fast the CPU is capable of going in a particular game.