1080 or 1070 SLI?

Toxic_Cobra

Honorable
Jan 9, 2016
243
0
10,710
I just built a new PC with a 1070, and pretty soon I'm getting an ultrawide 1440p monitor. Can my 1070 Strix OC edition play on max settings with an ultrawide? If not, should I buy another 1070, or just buy a 1080 and try to get $300 for my 1070? Would I be able to play at 75+ FPS on ultra with 1070 SLI?
 

Ethanh100

Honorable
I run 5760x1080, which is 6,220,800 pixels, while 3840x1440 is 5,529,600 pixels, so my setup pushes roughly 13% more pixels than that ultrawide would. I play AAA games at Ultra settings and average 50-70 FPS, but I do get dips into the 40s frequently. 1070 SLI will destroy a 1080 even if the game doesn't scale too well. That being said, a 1080 is probably enough power to get 60+ FPS all the time, without dips, and would be cheaper. So I would say sell the 1070 for $300-350 and get a 1080.
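If you want to check the math yourself, it's just quick back-of-the-envelope arithmetic using the resolutions above:

```python
# Rough pixel-count comparison (resolutions taken from the post above).
surround = 5760 * 1080      # 6,220,800 pixels
ultrawide = 3840 * 1440     # 5,529,600 pixels

extra = (surround - ultrawide) / ultrawide
print(f"Surround pushes {extra:.1%} more pixels than the ultrawide")
# -> about 12.5%, so the "roughly 13%" figure above is in the right ballpark
```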
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160
I recently got a pair of Gigabyte Xtreme 1080's in SLI (my other specs below). For 1440p G-Sync, minus AA but everything else Ultra, I average 90-150 FPS. This is tested with Witcher 3, GTA V, ROTTR, No Man's Sky, Crysis 3, Shadow of Mordor. In cinema 4K (4096x2160) I average 54-80 with the same settings (fixed refresh with v-sync on).

A pair of 1070's would be more economical for you and should only be around 10-15% lower FPS. In your place I would recommend the 2nd 1070, unless you really have the money to spare. I was upgrading from a pair of G1 970's.

P.S. I recently got an EVGA HB SLI bridge and had issues. Still using 2 ribbons, but I have an actual Nvidia one on the way... will update on any difference, since supposedly 4K / 1440p 120Hz+ is the turning point for HB.
 

Ethanh100

Honorable


2 1070s wouldn't be more economical. He would have to buy another 1070 for ≈$430, vs. selling a 1070 for $300 and buying a 1080 for ≈$650, coming out paying $350. Also, SLI 1070s will be much faster than a 1080 in games that scale well, and in games that still don't scale well they will likely outperform it anyway. The 1080 is only ahead of the 1070 by a little bit.
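Rough math on the two routes (the prices are just the ballpark figures above):

```python
# Net cost of each upgrade path, using the ballpark prices from the post above.
second_1070 = 430                      # buy a second 1070 for SLI
gtx_1080 = 650                         # buy a 1080 outright
resale_1070 = 300                      # sell the existing 1070

sli_path = second_1070                 # keep the 1070, add another
upgrade_path = gtx_1080 - resale_1070  # sell the 1070, buy the 1080

print(f"Second 1070: ${sli_path}, 1080 after resale: ${upgrade_path}")
# -> $430 vs $350, so the single 1080 is the cheaper route on these numbers
```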
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160


I misread the original post and was thinking he was considering 1080's in SLI vs 1070 SLI. My bad. The only real issues are:

1. At a time when supply is beginning to meet demand: are you really going to be able to get someone to pay $300 for a used card vs. $430 for a new one, in order to upgrade to a 1080?

2. With any SLI it's usually recommended you have a PSU rated around double your actual usage, so it runs in its optimal efficiency range. I didn't see a post mentioning the PSU being used. For two 1070's you're probably looking in the 750-900W range, and for one 1080 around 650-750W, and that's all relative to what other components (CPU, mobo) are being used.
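To put very rough numbers on that (the TDP figures and the ~75W platform allowance are assumptions on my part, not measured draw):

```python
# Rough PSU-load estimate. The TDP figures are typical reference values and
# the platform overhead is an assumption; real draw varies with overclocks.
GPU_TDP = {"GTX 1070": 150, "GTX 1080": 180}
CPU_TDP = 140   # i7-6800K
PLATFORM = 75   # motherboard, RAM, drives, fans (assumed)

def estimated_load(gpus, psu_watts):
    draw = sum(GPU_TDP[g] for g in gpus) + CPU_TDP + PLATFORM
    return draw, draw / psu_watts

for gpus, psu in [(("GTX 1070", "GTX 1070"), 850), (("GTX 1080",), 650)]:
    draw, pct = estimated_load(gpus, psu)
    print(f"{' + '.join(gpus)}: ~{draw}W draw, ~{pct:.0%} of a {psu}W unit")
# -> two 1070s land around 515W and one 1080 around 395W, i.e. roughly 60% of
#    those suggested PSU sizes, keeping the unit near its efficiency sweet spot
```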

With a quick search I didn't find any SLI Strix 1070 reviews or ultrawide ones, but here are a few showing the roughly 5-10% difference I mentioned in both 4K and 16:9 1440p. This obviously doesn't show the scaling potential, but it does show some consistent performance numbers for single cards, thus avoiding possible SLI complications.

http://www.bit-tech.net/hardware/graphics/2016/06/21/asus-geforce-gtx-1070-strix-review/9
http://www.guru3d.com/articles_pages/asus_geforce_gtx_1070_strix_gaming_review,21.html
https://www.bjorn3d.com/2016/07/asus-rog/6/

If you already have the card and the system is put together and running, why not simply wait and see what happens when you get your new display? Unless there's a time-frame relating to when you can sell the card.

At any rate, SLI gives at most close to double, and that's sometimes pretty rare; it's more like 1.75x or 1.5x the performance. There are some exceptions, but that's pretty close to accurate. I've personally done SLI with 560 Ti's, 970's, 980m's, and now these 1080's. Even a game as horrible as Batman AK gets 80-90 FPS at 2560x1440 with all settings maxed (including GameWorks) even in non-SLI use, and 50-60 at 4K.

 

Toxic_Cobra

Honorable
Jan 9, 2016
243
0
10,710


I have a 750 watt Gold+ PSU. I think you're right about not being able to sell my 1070 for $300 in a few months, so I'm likely going to SLI it and hope to get decent frames. What SLI bridge should I use? (1440p ultrawide 100Hz.) Does it matter which slot the second card is in? Like, should I do first and second slot, or first and third (so the first card can get some fresh air)?
 

Toxic_Cobra

Honorable
Jan 9, 2016
243
0
10,710


I was thinking about keeping my setup as it is now (1070, 6800K, 1080p @ 60Hz) and waiting for the 1080 Ti. Do you think one 1080 Ti will be able to run an Acer Predator X34?
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160
In answer to both your posts:

For the bridge, I've found dual old-style ribbons usually work just fine for 4K/60Hz and 2560x1440 144Hz. Nvidia's promo materials state that's pretty much the threshold before you really need the HB. I spent a lot of time trying to find data that really shows facts. Here are the 2 most useful reviews I found:

http://overclocking.guide/nvidia-hb-sli-bridge-technical-review/
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/

So if you do go to SLI, the old ribbons should do fine for 60Hz ultrawide 1440p. If you change your mind and go to 120-144Hz ultrawide, HB might help then. The reviews also state that, like most things, many programs/engines are not equally optimized for HB yet. Basically, most of the market has just now really gotten comfortable with DX11 and is only beginning to tinker beyond it.
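To put rough numbers on why those two resolutions sit near the same threshold, here's a crude back-of-the-envelope model. It just assumes one full 32-bit frame gets shipped per refresh, which is a simplification of what the bridge actually carries:

```python
# Crude estimate of frame data moved per second if one full 32-bit frame has
# to be shipped between cards every refresh (a simplification of AFR traffic).
def frame_rate_gbps(width, height, hz, bytes_per_px=4):
    return width * height * hz * bytes_per_px / 1e9

for name, w, h, hz in [("4K 60Hz",        3840, 2160, 60),
                       ("1440p 144Hz",    2560, 1440, 144),
                       ("UW 1440p 60Hz",  3440, 1440, 60),
                       ("UW 1440p 144Hz", 3440, 1440, 144)]:
    print(f"{name}: ~{frame_rate_gbps(w, h, hz):.2f} GB/s")
# 4K/60 and 1440p/144 both land around 2.0-2.1 GB/s, while ultrawide 1440p at
# 60Hz is only ~1.2 GB/s -- which lines up with the old ribbons being fine at
# 60Hz and HB only becoming interesting around 120-144Hz (~2.4-2.9 GB/s).
```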

As for the 1080 Ti... it's still a bit of a mythical beast. I was waiting for it and decided to take the plunge for the 1080's I have now (VRAM is an ongoing concern of mine). I read numerous posts, from fakes to educated guesses to pure speculation, on its possible specs. If it does come out (anywhere from Dec-March by some guesses), it should be able to rock that Acer no problem, but it would probably be a 1-2 month wait before you could actually get one. Even a single 1080 could mostly hold that, but would probably dip into the 40's on some occasions.

One of the more believable posts I read regarding it was that the new Titan was actually supposed to be the Ti, but there was some kind of production limitation on something, so they decided to hold off till the next generation. There are also many who believe that Nvidia will hold out until AMD releases their next high-end cards. Nvidia also has an established history of releasing x80 Ti's around 6 months after the first runs. In the present/nearest future, Nvidia seems to be flooding the lower tier with 1060's and now the rumored leaked specs of 1050's.

If you want to be done before the holidays and enjoy the time, then I recommend the 2nd 1070. That PSU should be just fine. I mostly agree with EthanH100 about them vs a single 1080 as well. If you want to be able to go beyond that Acer, you'll need the Ti / 1080 SLI / whatever comes next, but that can also have other caveats. With a quad core you're limited to x8/x8 for SLI, and as we approach 4K/120Hz or 5K, those buses are reaching some limits that the bridges can't necessarily help with. The only real way beyond those issues is a single x16 card or going with more cores/PCIe lanes.

BTW, I tinkered with 5K DSR on my setup and most things dropped into the 40's, and at 8K it went into the 20's. Again, everything at max minus AA, using the old ribbons.

 
Solution

ledhead11

Reputable
Oct 10, 2014
585
0
5,160


I'd forgotten that the 6800K was a hex core. I'm so used to seeing x30 or x60 that I didn't catch that in your previous post. That chip was somewhat intended to be an economical way of giving some of the benefits of hex core at a lower price point. The main idea was one strong card @ x16 (for >4K/60Hz, like a Titan/Ti), with the remaining lanes used for SSD(s) and other peripherals, or, as you're considering, 2 x8 (1070/1080's) for ultrawide 1440p/60Hz. Dual x8 PCIe 3.0 is still very viable, but in the next 1-3 years it will see its limitations.

You'll be fine for now. The most accurate reports I'm reading so far state the need for 2 x16 when dealing with 5K 60Hz / 4K 120-144Hz, neither of which is readily available for most people. The bridges mostly help with micro-stutter, but the majority of the info is still carried on the bus. It's mentioned in those reviews I posted above. By the time 5K/8K happens, the next PCIe standard should also be available, and we'll all need to upgrade by then. I do believe that 2 x16 will give a few extra FPS for SLI, but even ultrawide 60Hz isn't going to need it unless you have two such displays, and then even a single Ti may have limits.
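For a rough sense of the bus side of that, here's the same simplified frame model as above against PCIe 3.0 link speeds; real SLI traffic is much more than just the finished frame, so treat it as illustrative only:

```python
# PCIe 3.0 moves roughly 0.985 GB/s per lane after 128b/130b encoding.
PCIE3_PER_LANE = 0.985  # GB/s

def frame_rate_gbps(width, height, hz, bytes_per_px=4):
    return width * height * hz * bytes_per_px / 1e9

print(f"x8  link: ~{8 * PCIE3_PER_LANE:.1f} GB/s")
print(f"x16 link: ~{16 * PCIE3_PER_LANE:.1f} GB/s")
print(f"5K/60Hz frames:  ~{frame_rate_gbps(5120, 2880, 60):.1f} GB/s")
print(f"4K/144Hz frames: ~{frame_rate_gbps(3840, 2160, 144):.1f} GB/s")
# Finished frames alone fit easily in an x8 link (~7.9 GB/s), but once texture
# streaming, geometry and inter-GPU traffic share that same link, the very
# high resolution/refresh combinations are where x8/x8 starts to look tight.
```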

Most guesstimates for the Ti price are ~$850-1000, but no one really knows for sure. I totally understand why you're leaning towards a Ti with the 6800K now. The real question for the Ti is if and when, and that's still heavily speculated. It would be a perfect match for you though.

In either case, if any of the guesses are to be believed, the most significant issue with 2 1070's vs the mythical Ti would be VRAM. Rendering-wise they'll likely be close (that goes to what EthanH100 was talking about: fewer cores but faster clocks), but the Ti should have more RAM, and I read some technical stuff saying they couldn't do 10GB and would have to do 12 or more. IDK, but it's worth noting.

 

Toxic_Cobra

Honorable
Jan 9, 2016
243
0
10,710
Sorry if this doesn't make sense; I haven't done too much research on PCIe lanes. If I have 28 lanes, couldn't I run two 1070s, one at x16 and one at x8? 16 + 8 = 24, so would the 4 remaining lanes be enough for my SSD/peripherals? I have a 2TB WD Black HDD and a 996GB SanDisk Extreme Pro SSD. I'm guessing peripherals means keyboard/mouse, and I have a Corsair K95 RGB, a Razer Naga Chroma, and M50x's.
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160


As far as the cards go... it really depends on the mobo. They will usually assign the lanes as needed automatically, but some allow manual settings in the BIOS. In the Nvidia control panel you can leave it on automatic/optimal as well. For SLI it's normally recommended to have both the same, but it's not required (it might cause some micro-stutter issues, but I'm not positive).

The peripherals only really matter if they're plugged into a PCI or PCIe slot, not SATA/USB/FireWire/PS/2. Those are usually handled through a separate bus.
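Quick illustration of the lane budget you asked about (28 lanes is the 6800K figure from earlier; whether the board actually runs x16/x8 or x8/x8 is up to the motherboard):

```python
# Lane budget for a 28-lane CPU with two cards at x16 + x8, as asked above.
cpu_lanes = 28
gpu_lanes = 16 + 8
print(f"{cpu_lanes - gpu_lanes} CPU lanes left over")   # -> 4
# Those 4 lanes would cover a PCIe/NVMe SSD if one were added later. SATA
# drives (the WD Black and the SanDisk SSD) and USB peripherals hang off the
# chipset instead, so they don't consume CPU PCIe lanes at all.
```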
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160
Honestly, though, it shouldn't matter for 2 x8 with UW 1440p, even if you go to 120-144Hz. The Futuremark site shows a lot of people with impressive scores using 2 x8 setups.

From what many have stated the ceiling starts around 4-5k/120+hz.
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160
Thanks for the vote. I think if you get the 2nd 1070, you'll be happy. If you want to keep the costs down, just use the 2 old-style ribbons. That setup should take care of the X34 and then some. If you should change your mind on the monitor, I still recommend staying with G-Sync for Nvidia cards; just read the Nvidia site for the right settings. Their last post is a little old but has worked for me (in-game v-sync off, and in the control panel G-Sync & v-sync on). I'm a big fan of Asus/Acer. I just wanted to see what 1ms 144Hz G-Sync was like, which is why I have my current one, and it's awesome (Tom's has a great review of it).

If you order an HB (I recommend waiting, since the reviews aren't showing much in terms of performance gains right now; they emphasize it helping more with consistent FPS and less micro-stutter than with raw gains), make sure you have the spacing right. Nvidia has a different definition of it than others. The first thing is to measure the distance in mm from each SLI port to the matching port on the other card, to compare with the bridge specs. I checked on Amazon for mine and found a few different ones. I got the EVGA and ended up with some of the issues others reported (cards weren't always recognized, and I sometimes had to shift it slightly to get it to work). I think it's hit-or-miss QC, since some reported no problems. The spacing was right, but it seemed like the connectors weren't making full contact. For the $1400 I've got into those cards, I really didn't like that.

My personal tests for 4K/60Hz and 1440p/144Hz were within 3 FPS, just like the reviews, so I went back to the ribbons. I do have an Nvidia one coming by Friday and I'll let you know if that's any better (in terms of functionality/quality there were much better reviews for it, but FPS was about the same). It did seem like the frames were steadier too, but I want to see the Nvidia one in action before my final judgment on HB for displays at this level.

If you decide to wait for the Ti or whatever comes next, your system will be happy with it too; it's just a big unknown in terms of the true release/availability/specs/cost.