
PCIe x16 or PCIe2.0 card on PCIe slot

Last response: in Graphics & Displays
June 27, 2008 7:45:03 PM

How much would the performance of an 8800GT PCIe 2.0 card decrease if I run it in a PCIe x16 (1.0) slot? Would an 8800GT PCIe 2.0 card give the same performance as an 8800GT PCIe x16 card in an x16 slot? :pfff:

Cheers
June 27, 2008 7:46:51 PM

Unless you're gaming at over 1920x1200, I don't think it'll decrease at all.
June 27, 2008 7:54:11 PM

Why only over 1920x1200? I heard the basic difference between the two slot types (PCIe x16 and PCIe 2.0) is an increase in voltage from 1.0 to 2.0, so shouldn't that uniformly decrease the performance of a PCIe 2.0 card, making it equal to that of a similar PCIe x16 card?
June 27, 2008 8:05:00 PM

There's no change in voltage. 2.0 has double the data rate of 1.0, so a x8 2.0 slot will be the same as a x16 1.0 slot. x16 1.0 has a data rate of 4 GB/s, which is more than enough for any game running at resolutions below 1920x1200. At higher resolutions the system needs to feed more textures to the card more often, and that's where the extra bandwidth starts to make a difference.

But that's only if the card can make use of the extra textures fast enough in the first place, and I don't think the 8800GT could. The difference between 1.0 and 2.0 will get more noticeable in the future, when cards can burn through more texture data at a faster pace.

The 9800 GX2 doesn't start losing performance until you reduce the PCI-E 2.0 link to x4, which is the same as a x8 1.0 link. So an x16 1.0 link won't deliver any less performance than an x16 2.0 link.
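The bandwidth arithmetic above is easy to check yourself. Here's a minimal Python sketch (the function name and structure are just for illustration); the 250 and 500 MB/s per-lane figures follow from the 2.5 and 5 GT/s signaling rates of PCIe 1.0 and 2.0 after 8b/10b encoding overhead:

```python
# Per-lane payload throughput in MB/s after 8b/10b encoding:
# PCIe 1.0 signals at 2.5 GT/s, PCIe 2.0 at 5 GT/s, and 8 of
# every 10 bits are payload -> 250 and 500 MB/s per lane.
PER_LANE_MBPS = {"1.0": 250, "2.0": 500}

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Peak one-way bandwidth of a PCIe link in GB/s."""
    return PER_LANE_MBPS[gen] * lanes / 1000

print(link_bandwidth_gbps("1.0", 16))  # 4.0 GB/s, as quoted above
print(link_bandwidth_gbps("2.0", 8))   # 4.0 GB/s: a x8 2.0 link matches a x16 1.0 link
print(link_bandwidth_gbps("2.0", 4))   # 2.0 GB/s, same as a x8 1.0 link
```

So the x4 2.0 link where the 9800 GX2 starts to choke carries the same 2 GB/s as a x8 1.0 link, and a full x16 1.0 slot still has double that.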

Check out this review:

http://www.tomshardware.com/reviews/pci-express-2-0,191...
June 27, 2008 8:13:36 PM

Oh, btw, I'd install any of the above three in a PCIe x16 slot...
June 27, 2008 8:17:03 PM

I'd probably go with the MSI card. But if you're looking in that price range anyway you should check out the HD 4850 too.
June 27, 2008 8:29:26 PM

Hmmm, I think the MSI one would be just fine. I'm more of an Nvidia guy than ATI.