Can PCIe 2.0 run at x16?

Boss Big

Prominent
Apr 12, 2017
14
0
510
Hi Guys!

I have an i7 2600K and a GTX 1070 Gaming X. I noticed in GPU-Z that the GPU is running at Gen 2 x8. As far as I know, the 2600K only supports Gen 2.0, not 3.0, unfortunately.

But my Mobo has 3.0 x16 slots, so my question is:

Did I set something wrong? Or should PCIe 2.0 run at x16 instead of x8?
Or is x16 a Gen3 feature as well?
 

Barty1884

Retired Moderator
No, x16 is not a Gen3 feature.

Some older boards downclock the bus to x8, x4, and sometimes even x1 when full bandwidth isn't needed, and increase it as required (i.e. while gaming).

Which board do you have, specifically? Some are wired with an x16 slot (that drops to x8 when used in SLI) and a dedicated x8 (for a secondary card in SLI). You may simply have your GPU in the x8 slot rather than the x16, assuming you're running a single card.

Ultimately, neither Gen2 nor x8 is going to limit you in any noticeable way.
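
For illustration, here's a minimal sketch (Python, Linux-only, relying on the standard current_link_speed / current_link_width sysfs attributes; GPU-Z shows the same thing on Windows) that prints each device's negotiated link, so you can compare idle vs. under load:

import glob

# Print the currently negotiated PCIe link for every device that exposes one.
# On boards that downclock at idle, current_link_width rises under load.
for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    try:
        with open(f"{dev}/current_link_speed") as f:
            speed = f.read().strip()
        with open(f"{dev}/current_link_width") as f:
            width = f.read().strip()
    except OSError:
        continue  # device has no PCIe link attributes
    print(f"{dev.split('/')[-1]}: {speed}, x{width}")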
 

Boss Big

Prominent
Apr 12, 2017
14
0
510


I have the Biostar TZ77XE4 board.
 

Boss Big

Prominent
Apr 12, 2017
14
0
510
Ok guys!

I read the manual, and it seems the 1st PCIe slot is 3.0/2.0 at x16 or x8,
and the 2nd is 3.0/2.0 at x8 only.

And unfortunately I can't install the card in the 1st PCIe slot, because I have a Noctua NH-D15 cooler, which is way too big.

I am thinking about changing the CPU cooler. Is it worth it? I mean, how much difference would I gain from 2.0 x16 instead of x8?
 

iamacow

Admirable
PCIe 2.0 at x16 has the same bandwidth as PCIe 3.0 at x8.

Your motherboard fully supports PCIe 2.0 x16, which is just fine for everything but 4K gaming. Your second slot is most likely only x8 electrically.

You will see a slight frame rate increase with a 1070 using x16 @ 1440p, but really your CPU is holding you back already.
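
To put rough numbers on that equivalence, here's a minimal sketch (Python, using the commonly published per-lane figures, not measurements from this thread):

# Theoretical one-direction per-lane throughput in MB/s, after encoding overhead
PER_LANE_MBPS = {"1.0": 250, "2.0": 500, "3.0": 984.6}

def link_gbps(gen, lanes):
    """Total link throughput in GB/s for a given PCIe generation and width."""
    return PER_LANE_MBPS[gen] * lanes / 1000

print(link_gbps("2.0", 16))  # 8.0   GB/s
print(link_gbps("3.0", 8))   # ~7.88 GB/s -> effectively the same link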
 

Barty1884

Retired Moderator


False (or maybe poorly worded?). x16 is not a Gen3 feature.



That board supports SLI, as per: http://www.biostar.com.tw/app/en/mb/introduction.php?S_ID=582#specification
So there is (at least) a primary x16 (that'll operate at x8 in SLI) and a secondary x8.

As per the manual (which can be downloaded from that link), your primary x16 slot (16_1) is the one closest to the CPU. If your card is installed there, run the stress test as suggested by clutchc.



No, it's not worth it (it's not likely to be even noticeable, let alone "worth it").
 
Solution

Boss Big

Prominent
Apr 12, 2017
14
0
510


Thanks. But as I mentioned, I can't: my CPU cooler is way too big, so the card doesn't fit.
Is it worth changing the CPU cooler to something smaller or different?
I really like this cooler, so I don't know how much I would gain by swapping it.

Edit: OK, thanks. Then I will stay with Gen2 x8. :)
I am satisfied with the benchmarks so far.
 

iamacow

Admirable
PCI Express throughput by version and link width:

Version                 Line code   Transfer rate   ×1           ×4          ×8          ×16
1.0                     8b/10b      2.5 GT/s        250 MB/s     1 GB/s      2 GB/s      4 GB/s
2.0                     8b/10b      5 GT/s          500 MB/s     2 GB/s      4 GB/s      8 GB/s
3.0                     128b/130b   8 GT/s          984.6 MB/s   3.94 GB/s   7.9 GB/s    15.8 GB/s
4.0 (expected in 2017)  128b/130b   16 GT/s         1969 MB/s    7.9 GB/s    15.8 GB/s   31.5 GB/s

https://en.wikipedia.org/wiki/PCI_Express

Basically, PCIe 2.0 at x16 is 8 GB/s, whereas PCIe 3.0 at x8 is 7.9 GB/s.
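
If you're curious where those per-lane figures come from, here's a back-of-the-envelope sketch (Python, written for this post rather than taken from anywhere official) deriving them from the transfer rate and line-code overhead in the table above:

# Usable per-lane throughput = transfer rate x line-code efficiency / 8 bits per byte
def lane_mbps(gt_per_s, payload_bits, coded_bits):
    """Per-lane throughput in MB/s after 8b/10b or 128b/130b encoding overhead."""
    return gt_per_s * 1e9 * (payload_bits / coded_bits) / 8 / 1e6

print(lane_mbps(2.5, 8, 10))     # 250.0 MB/s  -> PCIe 1.0, 8b/10b
print(lane_mbps(5.0, 8, 10))     # 500.0 MB/s  -> PCIe 2.0, 8b/10b
print(lane_mbps(8.0, 128, 130))  # ~984.6 MB/s -> PCIe 3.0, 128b/130b

Multiply by the lane count and you land on the table's link totals, e.g. 16 × 500 MB/s = 8 GB/s for PCIe 2.0 x16.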
 

Barty1884

Retired Moderator
You replied while I was replying, so I edited that post - you'll see my last comment there:

"No, it's not worth it (it's not likely to be even noticeable, let alone 'worth it')".

As iamacow mentioned, it might result in a frame or two of difference... but I personally wouldn't call that 'worth it', especially considering you'd be out the cost of a new cooler.
 

maxalge

Champion
Ambassador


Use a PCIe riser cable.

 

iamacow

Admirable
A PCIe riser is asking for problems! Those miners barely use x1, which is how they get away with such long cables. A good PCIe riser can only do 6 inches and costs $50.

I think the OP is perfectly fine with x8 PCIe 2.0 on that CPU. The biggest boost will come from a new setup rather than switching to a x16 slot. I've run my 1070 on both x16 PCIe 2.0 and x16 PCIe 3.0 and saw zero frame rate difference @ 1080p and 1440p. The GTX 1070 isn't very good for 4K anyway, so the framerate was already 35-45 fps in most games; I couldn't get a good reading on that.

You can literally gain 15-20 fps switching from that 2600K to something of this generation. Of course it depends on the game, but a lot of them are still limited by clock speed rather than thread count.