Solved

Populating shared PCI-E x16/x8 slots?

Tags:
  • Hard Drives
  • Bandwidth
  • Graphics Cards
  • PCI Express
  • Storage
  • Product
December 3, 2011 10:01:12 AM

Basically, I'm looking at getting a RevoDrive 3 X2 and putting it on my Gigabyte GA-Z68XP-UD4 board in the second PCI-E x16 slot.

The manufacturer's website says the following, but I need further clarification.

1. 1 x PCI Express x16 slot, running at x16 (PCIEX16)
* For optimum performance, if only one PCI Express graphics card is to be installed, be sure to install it in the PCIEX16 slot.

2. 1 x PCI Express x16 slot, running at x8 (PCIEX8)
* The PCIEX8 slot shares bandwidth with the PCIEX16 slot. When the PCIEX8 slot is populated, the PCIEX16 slot will operate at up to x8 mode.

(All PCI Express x16 slots conform to PCI Express 3.0 standard.)
* To support PCI Express 3.0, you must install an Intel 22nm CPU.


I have a GeForce GTX 560 Ti in the PCIEX16 slot, so it's currently running at PCI-E 2.0 with 16 lanes.

Considering the RevoDrive is a PCI-E 2.0 x4 card and would sit in the PCIEX8 slot, what is the situation with the system's bandwidth?

1: Does it split the lanes evenly between the two x16 slots, so x8 each, with each using up to the maximum bandwidth of PCI-E 2.0 x8?

2: Does it split the lanes to the CPU depending on the cards, i.e. x12 for the GPU and x4 for the SSD?

3: Or do the cards simply share the maximum bandwidth of the 16 PCI-E lanes, so when one card is not sending/receiving the other can utilise the bandwidth, with the exception that the second card cannot exceed x8 (half the bandwidth)?

I will be upgrading down the line to Ivy Bridge, so PCI-E 3.0 should allow my current card to take advantage of the increased bandwidth per lane (correct?), but will this pose a significant bottleneck for the card in the meantime?

Also, how much bandwidth to/from the graphics card actually gets utilised? For example, if my graphics card, being x16 PCI-E 2.0, is only using say 40% of the bandwidth available across the PCIe 2.0 interface (PCI-E 2.0 = 500 MB/s per lane in each direction, so say 3.2 GB/s for 16 lanes at 40%), then I don't see it being a problem even in the worst case (a dedicated x8/x8 lane split) and I'm going to go for it. Otherwise, if it uses the lot, 100% of the bandwidth at high to peak load, then I might have to rethink my situation.
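The back-of-the-envelope arithmetic above can be sketched out like this (assuming the commonly quoted effective rate of ~500 MB/s per lane per direction for PCI-E 2.0; real-world throughput will be somewhat lower):

```python
# Back-of-the-envelope PCI-E 2.0 bandwidth figures for the scenarios above.
# Assumes ~500 MB/s per lane per direction (after 8b/10b encoding overhead).

PCIE2_MB_PER_LANE = 500  # MB/s, per lane, each direction

def bandwidth(lanes, utilisation=1.0):
    """Effective one-direction bandwidth in MB/s for a given lane count."""
    return lanes * PCIE2_MB_PER_LANE * utilisation

print(bandwidth(16))       # full x16 slot: 8000 MB/s
print(bandwidth(8))        # x8/x8 split:   4000 MB/s
print(bandwidth(16, 0.4))  # GPU at 40% of x16: 3200 MB/s, under the x8 cap
```

So even a GPU that only ever uses 40% of an x16 link would still fit inside an x8 allocation.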

I know many people would think the RevoDrive 3 X2 overkill, but if I can do it without upgrading any other components of my system at the moment, that would be advantageous.


Given the above scenario (I'm not going to lie, this is a yet-to-be-built build; I just don't want to argue over the validity of the RevoDrive choice (price/effect compared to SATA3 RAID configs) and would rather focus on whether the above is a VIABLE choice), is it possible?

Now that that's said, here are the other specs for those inclined to look.

2 x 1080p 24-inch monitors + 36" Panasonic Viera plasma (HDMI to mobo)
Gigabyte GA-Z68XP-UD4
Gskill 16G 2133 C11 XM
Core i5-2600K (aiming for a max 4.8 GHz OC, at 4.4 GHz + 1/2/3/4-core turbo; will settle for 4.2)
--- RevoDrive 3 X2 240GB ---
Western Digital 2TB SATA3 Green
Gigabyte GTX 560 Ti, GV-N560OC-1GI
LG BH12LS38 + LG CH12LS28 BR/DVD/CD drives
Kuhler H2O 620 CPU cooler

Including, but specific parts yet to be confirmed (depends on final hardware config):
!UPS!
600w? PSU
case?

Usage: graphic design (Photoshop mainly, but some video + encoding, some Flash), some basic 3D modelling, database-driven programming, CUDA and OpenCL, gaming, Folding@home, media centre for Blu-ray movies, audio playback (I have a 5.1 surround system, plus my old Audigy card, which I'm ditching in favour of onboard sound).

Not entirely sure which category to post this in; if it's out of place and I can move it somewhere better, just give me a heads up.

Can somebody help me out here? I'm hoping I don't have to go back to the drawing board and get an X58 or something to run a x4 PCI-E...

Oh, this is my first ever post on Tom's Hardware after reading this site on and off over the last 10 years. You guys are super helpful, thanks!


Best solution

December 3, 2011 10:35:42 AM

I'm not entirely sure what you're asking, but here is what I know. On your motherboard, if only one PCI-E slot is populated (the first one), it will run at x16. That means if you just plug in your graphics card, it will be running at x16. If two slots are populated (the first and second), then both slots will run at x8. It doesn't matter whether the cards are actually using the bandwidth; they will always run at x8.

The real question is whether this matters. I don't think a GTX 560 needs x16 lanes anyway.

Have you made sure that your motherboard is supported by the Revodrive? Not all boards support booting from a PCI-e hard drive.
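A minimal sketch of the slot behaviour described above, assuming the board allocates the CPU's 16 lanes as 16/0 or 8/8 and each card then links at the narrower of its own width and the slot's width (my reading of the behaviour, not Gigabyte's documentation):

```python
# Hypothetical model of link-width negotiation on this board: slots get
# 16/0 when only PCIEX16 is populated, 8/8 when both are; each card then
# trains at min(card width, slot width).

def link_width(card_lanes, slot_lanes):
    """Negotiated link width: the narrower of card and slot."""
    return min(card_lanes, slot_lanes)

# GPU alone in PCIEX16: full x16
print(link_width(16, 16))  # 16
# Both slots populated -> slots run x8/x8
print(link_width(16, 8))   # GTX 560 Ti drops to x8
print(link_width(4, 8))    # RevoDrive (x4 card) still links at x4
```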
December 3, 2011 10:43:46 AM

Thanks for the reply,
that's exactly what I was after.

I'm just not sure how the technology works. I can picture splitting one x16 link into two x8 links when you've got two x16 graphics cards installed, you know, 50/50.

But when one is a x4 card and the other x16, I'm really just guessing, and guessing always causes problems.

I haven't checked compatibility with the mobo, however; thanks for that heads up, I'm going to check it out now.

cheers.
December 3, 2011 3:35:54 PM

OK, so the Gigabyte board is not on the official support list (damn), so I'm going to switch to a supported one, the ASUS P8Z68-V PRO GEN3, and grab an ASUS ENGTX560 Ti DCII TOP/2DI/1GD5 for the graphics card instead of the Gigabyte one.

Bonus: the board has a separate x4 PCI-E slot, so I don't have to compromise on the shared-bandwidth issue, and I can now do SLI if I want (future-proofing). I also found out about the whole Gigabyte PCIe Gen3 compatibility deal, so that's a headache I won't have to deal with in the future.
Problem solved.

Thanks jsrudd for the heads up; I never thought to check the RevoDrive's support list.

Cheers.
December 3, 2011 3:36:09 PM

Best answer selected by kAzure.