Confused about PCIe lanes on my motherboard

localstarlight

Distinguished
Feb 12, 2016
13
0
18,510
I'm putting together a new build, and confused about how the PCIe lanes work on the motherboard I've chosen.

I'm going to be using the Asus Deluxe II, with a 40 lane CPU (i7 6850K), and two EVGA 1080 GPUs.

The GPUs are 16 lanes each, and I need Thunderbolt so I'm getting the Asus ThunderboltEX PCIe card which takes up 4 lanes, leaving 4 lanes remaining.

Since the mobo has an M.2 socket, I opted for my main drive to be a Samsung 950 PRO NVMe M.2 512 GB SSD, which would take up those last 4 lanes.

However, I was just looking at the installation guide for the mobo (here) and it seems to say that on a dual GPU setup on 40 lane CPU, to have the GPUs at 16 lanes each means the M.2 socket is disabled. Why is that?! Is that normal?
That seems strange when two GPUs should only take 32 lanes, leaving 8 for other things.

Does this mean I'm going to have to have all my drives being SATA and can't take advantage of M.2?

(First time building a PC so I'm a bit clueless – thanks!)
 

localstarlight

Distinguished
Feb 12, 2016
13
0
18,510
OK, so I've done some more digging, and it seems like the M.2 socket shares bandwidth with PCIe slot 3, making the 2nd GPU run at 8x instead of 16x.
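If I'm reading the manual right, the sharing rule works out to something like this (just my understanding sketched out; the function name is my own, not anything official):

```python
# My reading of the manual's lane-sharing rule (an assumption, not an
# official spec): the M.2 socket borrows lanes from PCIe slot 3, so
# populating it drops the second GPU's link from x16 to x8.
def slot3_width(m2_in_use: bool) -> int:
    return 8 if m2_in_use else 16

print(slot3_width(False))  # -> 16 (M.2 empty, second GPU at full width)
print(slot3_width(True))   # -> 8  (M.2 populated, GPU drops to x8)
```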

I'm not entirely clear how much of a practical difference that makes though... how crippled will a 1080 GPU be when stepped down to 8x instead of 16x?

But suppose I went a different route, and had the following setup:

PCIe 1: GTX 1080 (16x)
PCIe 2: –
PCIe 3: GTX 1080 (16x)
PCIe 4: Asus ThunderboltEX 3 (4x)
PCIe 5: Intel 750 Series NVMe PCIe SSD/Solid State Drive 400GB (4x)

Adds up to 40 lanes, so is that ok?
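Just to double-check my own maths, here's a quick sketch of that lane budget (the widths are the ones I listed above; the slot labels are just my own names for them):

```python
# Sanity-check the PCIe lane budget on a 40-lane CPU.
# Widths are the ones from the slot layout above.
cpu_lanes = 40

slots = {
    "PCIe 1: GTX 1080": 16,
    "PCIe 3: GTX 1080": 16,
    "PCIe 4: ThunderboltEX 3": 4,
    "PCIe 5: Intel 750 SSD": 4,
}

used = sum(slots.values())
print(f"{used} of {cpu_lanes} lanes used, {cpu_lanes - used} spare")
# -> 40 of 40 lanes used, 0 spare
```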

It seems like the ThunderboltEX 3 card will block half of one of the fans on the second GPU (these are aftermarket EVGA cards). Is that an issue? If so, could I plug the Thunderbolt card into PCIe slot 5 (leaving a gap) and connect a PCIe SSD to slot 4 using some kind of cable/adapter, with the drive mounted in a different location?
 
x8 vs x16 has negligible impact on GPU performance.


You'd have to read the motherboard manual (and I detest ASUS and their slow website, so I'm not doing it) to see how the PCIe lane allocation works. Apparently you already did, so I suspect your analysis is correct.

Just a terminology nitpick. 4x means "four of" and x4 refers to PCIe lane width. I kind of did a double take reading that you were planning to install 4 Intel 750s, until the penny dropped. I'm kind of slow. :)

Blocking fans on GPU cards is usually not a great idea. I can't be very specific, but GPU cooling is generally a major consideration on its own.
 
Solution

localstarlight

Distinguished
Feb 12, 2016
13
0
18,510
Hey, thanks for your response.

So what I realised is that the U.2 connector seems to be unaffected by the second GPU, so I could get a drive like the NVMe 2.5" U.2 PCIe SSD 400GB, put the Thunderbolt card in the lowest PCIe slot so there's a gap for the fan, and all should be good (I think).

Thanks for pointing out my terminology mistake – good to know!
 

__SID__

Commendable
Jun 1, 2016
40
0
1,560
I ran my Nvidia 1070 at both 8x and 16x, and there was NO difference in gaming or rendering. My mobo was an X99 Deluxe U3.1 with a 5820K, which only supports 28 lanes.
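For what it's worth, the raw numbers back that up. A rough one-direction PCIe 3.0 bandwidth estimate per link width (8 GT/s per lane with 128b/130b encoding) looks like this, and real-world game traffic rarely comes close to even the x8 figure:

```python
# Approximate one-direction PCIe 3.0 throughput per link width.
# 8 GT/s per lane with 128b/130b encoding -> ~0.985 GB/s usable per lane.
per_lane_gb_s = 8 * (128 / 130) / 8  # GB/s per lane

for width in (8, 16):
    print(f"x{width}: ~{per_lane_gb_s * width:.1f} GB/s")
# -> x8: ~7.9 GB/s
# -> x16: ~15.8 GB/s
```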
 

jlottm3

Commendable
Sep 12, 2016
1
0
1,510
I would be curious to know whether you ever got the Thunderbolt card to work with your setup, as mine is identical to yours except that I'm using (2) U.2 drives with (2) GTX 1080s.

When I enable the tbolt card, the machine will not POST, and it takes a CMOS clear to get it to boot again.