PCI-e 3.0 and Crossfire/dual-GPU

Status
Not open for further replies.

Murderotica926

Honorable
Dec 25, 2012
Working on a build that I'd like to be at least somewhat future-proof. The only piece I already have is the HD7970, which I'm essentially building the whole system around. I already know that PCI-e 3.0 will only give me about a 1% increase in performance over PCI-e 2.0 as far as my single HD7970 goes. But down the road when my 7970 is starting to get a little outdated, I plan on CrossFiring it. So what I'm wondering is whether PCI-e 3.0 has a more detectable advantage over 2.0 when running dual graphics cards. Not interested in 3 or 4 cards, just 2. I could save a good deal of money getting a MB and CPU that use PCI-e 2.0, I just don't wanna be kicking myself down the road.
 
Solution
Your motherboard looks fine for running CrossFire with two cards down the road; it's about as future-proof as you need, since it offers two x8 lanes in CrossFire.

Remember that the first PCI-e slot is Gen 3, so even at x8 it has the same bandwidth as a PCI-e Gen 2 x16 slot.

You won't get any bottleneck with two HD 7970s.
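That "Gen 3 x8 ≈ Gen 2 x16" claim is easy to sanity-check with the published link rates and encoding overheads (PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding; PCIe 3.0 runs 8 GT/s with 128b/130b). A quick back-of-the-envelope sketch; the helper function is mine, not from any tool in this thread:

```python
# Rough per-direction PCIe bandwidth math.
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding  ->  500 MB/s per lane
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~985 MB/s per lane

def lane_bandwidth_mb_s(gt_per_s, payload_bits, total_bits):
    """Usable bandwidth per lane in MB/s (1 MB = 1e6 bytes)."""
    return gt_per_s * 1e9 * payload_bits / total_bits / 8 / 1e6

gen2 = lane_bandwidth_mb_s(5, 8, 10)       # 500.0 MB/s per lane
gen3 = lane_bandwidth_mb_s(8, 128, 130)    # ~984.6 MB/s per lane

print(f"PCIe 2.0 x16: {gen2 * 16:.0f} MB/s")   # 8000 MB/s
print(f"PCIe 3.0 x8 : {gen3 * 8:.0f} MB/s")    # ~7877 MB/s
```

So a Gen 3 x8 link really does land within about 2% of a Gen 2 x16 link, which is why the slot split matters less than it sounds.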

COLGeek

Cybernaut
Moderator
What motherboard are you using? The number of lanes your x16 slots provide will determine your future multi-GPU capability.

Regardless, a CFed HD7970 config is pretty impressive in terms of graphics horsepower, even if you don't have 2 full x16 lanes (many mobos will do x16 + x8 or even x8 + x8).
 
Even two Radeon 7990s instead of two Radeon 7970s wouldn't be bottlenecked by too much at PCIe 2.0 x8/x8 in most games. Two 7970s shouldn't have any trouble with PCIe 3.0 x8. I'd be surprised if there's more than a 5% drop even in the most PCIe-sensitive games, such as the DiRT series.
 

delellod123

Honorable
Jun 5, 2012
I just wanted to chime in here. I rarely respond on forums, but felt my input may help people.
I run an X79 system. Here are my specs before I continue:

Mobo: Sabertooth X79
CPU: i7-3820 @ 4.5GHz
GPU: (2x) GTX 660 Ti 2GB (in SLI)
RAM: 16GB (quad channel)
HD: Crucial M4 128GB SSD (primary) | 1.5TB WD Black | 750GB Seagate Barracuda
PSU: Corsair 850W 80 Plus Platinum


I recently started playing Sleeping Dogs. I was getting lower frame rates than I had expected with the settings all maxed out. The only way for me to get a smooth 60 fps was to lower the AA from EXTREME to HIGH. I figured this might be due to memory bandwidth or possibly a CPU bottleneck, though I was not sure.

Yesterday, however, I ran nVidia's PCIe 3.0 reg hack via DOS. Honestly, I was not aware nVidia had dropped X79 support for PCIe 3.0. After a quick power cycle, GPU-Z was registering PCIe 3.0. Happy to see that, I ran Sleeping Dogs to see if there was any improvement to FPS on EXTREME. Well, to my astonishment, there was a HUGE improvement!

The benchmark, which on PCIe 2.0 would display an average of 42 FPS and a minimum of 22 FPS, now improved dramatically. On PCIe 3.0, with the same settings (vsync ENABLED), I got an average of 58 FPS, a high of 64, and a low of 53.
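For anyone skimming, the size of that jump is easy to put in relative terms. A throwaway sketch using the numbers from the post above (the `pct_gain` helper is my own, not from any benchmark tool):

```python
# Percentage-change check on the reported Sleeping Dogs benchmark numbers.

def pct_gain(before, after):
    """Percent improvement going from `before` FPS to `after` FPS."""
    return (after - before) / before * 100

avg_gain = pct_gain(42, 58)   # average FPS: 42 -> 58
min_gain = pct_gain(22, 53)   # minimum FPS: 22 -> 53

print(f"average: +{avg_gain:.0f}%")   # +38%
print(f"minimum: +{min_gain:.0f}%")   # +141%
```

A 38% average gain, and well over double the minimum, is far beyond the 1-5% usually measured between PCIe 2.0 and 3.0, which is what makes this report so surprising.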

Now, I won't pretend to know exactly what caused this, nor have I done additional tests yet, but the result seems to be caused by enabling PCIe 3.0. I did nothing else here other than the PCIe 3.0 reg hack. I am stunned by the result, especially since everything else I have read suggests there is very little FPS increase from 2.0 to 3.0. Possibly it is just this shader-intensive game and the high level of AA? Any thoughts?
 
The change in PCIe bandwidth may have interacted with V-Sync in some way, or it could be any of dozens of other factors. There are many possible reasons why changing PCIe helped so much when it shouldn't have, and without a lot of testing I can't say for sure why it did.

Do you play any DiRT games? I've noticed them to be particularly sensitive to PCIe bandwidth, and if they're similarly affected, that might be evidence that Sleeping Dogs is very PCIe-sensitive too.

Some compute-oriented workloads are also very PCIe sensitive and Sleeping Dogs is one of the first few games to have support for some intensive compute-oriented features. I doubt that this is the situation here, but it is possible.
 

delellod123

Honorable
Jun 5, 2012
After work today I will try running a few tests. Needless to say, I am limited in what I can use to test with. My first thought would be to run an MSI Kombustor test at 1080p, especially the "combined test," since that's where I noticed the SLI setup struggle the most. Other than that, I have a few games at my disposal where I personally noticed struggles at high AA settings, including:

Hitman: Absolution (bright areas at 4x MSAA bogged down FPS, especially in the Epilogue)
Far Cry 3 (4x AA @ 1080p FPS dip)
Crysis 2 (4x AA @ 1080p FPS dip in intensive scenes)
Metro 2033 (in the benchmark, I believe it was the high MSAA setting that caused massive dips, but I need to check when I am home again)
Batman: Arkham City (poor optimization, thumbs down)

I have many others, but these seem to stress the GPUs the most. I will try DiRT; I have it, though I've never played it.
Also, I guess it's obvious I should try Sleeping Dogs with V-Sync disabled. Once I have some results, I will post them.

Just thought this was a very interesting result. I was very surprised/happy, since I really like to max these games out to get the full visual effect.

BTW, in case anyone was wondering, I am on the 310.70 WHQL driver at 1920x1080 resolution. The GPUs are at factory settings (MSI Power Edition factory OC).
 