As some of you may know, the chipsets that go with Intel socket 1156 have an inherent limitation of 16 PCIe lanes. This is fine for a single graphics card.
When attempting to integrate USB 3.0, motherboard manufacturers have run into a problem: the USB 3.0 controller really needs a PCIe x1 lane, but there are none to spare.
What many of the new Gigabyte motherboards have done is cut the graphics card down to PCIe x8 (even though it's in the x16 slot), freeing up spare PCIe lanes for the USB 3.0 controller.
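For context, here's my back-of-envelope take on the raw bandwidth involved (a Python sketch using the nominal PCIe 2.0 and USB 3.0 signalling rates, so these are theoretical ceilings, not measurements):

```python
# Nominal PCIe 2.0 throughput: 5 GT/s per lane with 8b/10b encoding,
# i.e. roughly 500 MB/s per lane in each direction.
PCIE2_LANE_MBPS = 500

for lanes in (16, 8, 1):
    print(f"PCIe 2.0 x{lanes}: ~{lanes * PCIE2_LANE_MBPS / 1000:.1f} GB/s per direction")

# USB 3.0 SuperSpeed signals at 5 Gbit/s, also 8b/10b encoded,
# so its raw ceiling is about 500 MB/s -- roughly one PCIe 2.0 lane,
# which is presumably why the controller wants a dedicated x1 link.
usb3_mbps = 5e9 * (8 / 10) / 8 / 1e6
print(f"USB 3.0 raw ceiling: ~{usb3_mbps:.0f} MB/s")
```

So even at x8 the graphics card still has around 4 GB/s each way; the question is how often a real card actually needs more than that.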
There was a review on Tom's Hardware of Gigabyte's implementation (along with Asus's); sadly, though, no attempt was made in that review to analyse the graphics cost of cutting a single card down to PCIe x8 (or if there was, I missed it).
So can anyone help me? If we cut, say, an ATI HD 5770 down to 8 lanes of PCIe, what's the performance hit? Or any other card, for that matter? Does the hit depend on the card, or on the graphics it's displaying?
Any evidence or references with your answer would be a nice extra.
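My rough intuition, which may well be wrong and is exactly why I'm asking, is that the hit should depend on how much data actually crosses the bus each frame. Here's a sketch of that reasoning, with per-frame figures invented purely for illustration:

```python
# How much frame time does bus traffic cost at x16 vs x8?
# The MB-per-frame numbers below are made up for illustration only;
# they are not measurements from any real game.
X16_GBPS = 8.0  # PCIe 2.0 x16, GB/s per direction
X8_GBPS = 4.0   # PCIe 2.0 x8

def transfer_ms(megabytes: float, gbps: float) -> float:
    """Milliseconds needed to move `megabytes` over a `gbps` GB/s link."""
    return megabytes / gbps  # 1 GB/s == 1 MB/ms, so MB / (MB/ms) == ms

for mb_per_frame in (5, 20, 80):  # hypothetical data streamed per frame
    t16 = transfer_ms(mb_per_frame, X16_GBPS)
    t8 = transfer_ms(mb_per_frame, X8_GBPS)
    print(f"{mb_per_frame:3d} MB/frame: x16 {t16:5.2f} ms, x8 {t8:5.2f} ms "
          f"(+{t8 - t16:.2f} ms)")
```

If that's right, a card that keeps its working set in onboard memory should barely notice x8, while one that streams heavily over the bus would suffer. But I'd much rather see real benchmark evidence than trust my own arithmetic.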
Oh, I have another question. Let's suppose I get a Gigabyte USB 3.0 board and put an HD 5770 in it. I am happy. Fast forward.
Imagine it's September 2011, and I'd like to get a new graphics card (a modest upgrade, perhaps). What aspect of a prospective new card's specification should I look at to determine how badly it would be affected by the PCIe x8 limitation?