PCI Express And SLI Scaling: How Many Lanes Do You Need?
- Page 1: The PCIe Bottleneck?
- Page 2: Test Setup And Benchmarks
- Page 3: PCIe Scaling: 3DMark Vantage
- Page 4: PCIe Scaling: Alien Vs. Predator
- Page 5: PCIe Scaling: Call Of Duty: Modern Warfare 2
- Page 6: PCIe Scaling: Crysis
- Page 7: PCIe Scaling: DiRT 2
- Page 8: PCIe Scaling: S.T.A.L.K.E.R.: Call Of Pripyat
- Page 9: PCIe Scaling Summary
- Page 10: SLI Scaling: 3DMark Vantage
- Page 11: SLI Scaling: Alien Vs. Predator
- Page 12: SLI Scaling: Call Of Duty: Modern Warfare 2
- Page 13: SLI Scaling: Crysis
- Page 14: SLI Scaling: DiRT 2
- Page 15: SLI Scaling: S.T.A.L.K.E.R.: Call Of Pripyat
- Page 16: SLI Scaling Summary
- Page 17: Conclusion
PCIe Scaling: Crysis
Experience tells us that Crysis is usually GPU-limited, and bandwidth limits become far less of a problem as resolution increases.
The x4 slot suffers a 9% performance handicap at 1680x1050, while the x8 slot allows the GPU to reach 98% of its performance potential. That is to say, the mid-sized slot looks like an acceptable option for Crysis.
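To make those percentages concrete, here is a minimal sketch of how this kind of relative-scaling figure is calculated. The frame rates below are hypothetical placeholders chosen only to reproduce the 98% and 9% ratios quoted above; they are not the article's measured results.

```python
# Hypothetical single-card frame rates at 1680x1050 (illustrative only,
# not measured data from the article).
fps_by_link = {"x16": 40.0, "x8": 39.2, "x4": 36.4}

baseline = fps_by_link["x16"]  # the full-bandwidth x16 result is the 100% reference

for link, fps in fps_by_link.items():
    relative = fps / baseline * 100   # percent of x16 performance
    handicap = 100 - relative         # percent lost to the narrower link
    print(f"{link}: {relative:.1f}% of x16 ({handicap:.1f}% handicap)")
```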
You spend unnecessary $$$ on an X58 platform, while I save money that I can put toward a GPU upgrade with my P55 platform.
Anyone know if 4850s are going to be unavailable any time soon? You could still get the 3000 series for quite a while after the 4000s released, so I'm crossing my fingers until I can afford a CPU upgrade and another 4850.
My CPU is currently a 7750 BE, so I'm pretty sure it would bottleneck the 4850s. I think it does with just one, actually.
It's not the game's fault. The GPU can only go as fast as it was made to go. So in simple terms you could say that GPUs these days aren't "fast" enough to use all the bandwidth PCI Express offers.
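For a sense of scale, here is a quick back-of-the-envelope sketch of the theoretical bandwidth at each link width tested in the article, using PCIe 2.0 figures (5 GT/s per lane with 8b/10b encoding, which works out to roughly 500 MB/s of usable bandwidth per lane in each direction):

```python
# Rough theoretical PCIe 2.0 bandwidth per link width, one direction.
# Assumes ~500 MB/s usable per lane after 8b/10b encoding overhead.
MB_PER_LANE = 500

for lanes in (4, 8, 16):
    print(f"x{lanes}: ~{lanes * MB_PER_LANE / 1000:.0f} GB/s per direction")
# x4:  ~2 GB/s
# x8:  ~4 GB/s
# x16: ~8 GB/s
```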
On a card-for-card basis, they are still quite a bit more powerful than the GTX 480 and should require the most bandwidth of any current card for maximum performance.
The first article tested CrossFire scaling with three 5870s:
http://www.tomshardware.com/reviews/p55-pci-express-scaling,2517-2.html
1. It appears that the GTX 480 runs into a CPU limit faster than the HD 5870 does.
2. It also appears that the biggest difference between games is how hard they hammer the GPU, based on detail settings, lighting effects, etc.
3. The result is that you're seeing an FPS cap from either the board or the CPU as the load shifts away from the GPU to other components. The good news is that this "cap" is higher than the "minimum playable" frame rate most people can tolerate, in most games.
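Restating that reasoning as a toy model (my own simplification, with invented numbers rather than measurements): the frame rate you actually see is whichever of the GPU, CPU, or platform limits is lowest, which is why the cap only shows up when the GPU load is light.

```python
# Toy model of the "cap" described above: delivered frame rate is limited by
# whichever component is slowest. All numbers are made up for illustration.
def delivered_fps(gpu_fps: float, cpu_fps: float, platform_fps: float) -> float:
    """Frame rate is capped by the slowest of GPU, CPU, and board/PCIe overhead."""
    return min(gpu_fps, cpu_fps, platform_fps)

# Light GPU load (low resolution/details): the GPU could go faster, so the
# CPU/platform cap is what you actually see.
print(delivered_fps(gpu_fps=120, cpu_fps=90, platform_fps=100))  # -> 90

# Heavy GPU load (high resolution/details): the GPU itself becomes the limit.
print(delivered_fps(gpu_fps=55, cpu_fps=90, platform_fps=100))   # -> 55
```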
So, what does this have to do with your question? The HD 5970 uses a PLX Bridge: http://www.tomshardware.com/reviews/radeon-hd-5970,2474-2.html
Both GPUs get the same data, and the PLX bridge simply doubles it from one set of lanes to two GPUs. So, an x16 slot turns into two identical x16 sets, or an x8 slot turns into two identical x8 pathways. The PCIe "bottlenecking" data you get for two 5870s should therefore be identical to the PCIe data you get from one HD 5870 X2, such as the Asus ARES, which is actually a faster card than the HD 5970.
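Written out as a tiny sketch, that simplification looks like this (it follows the description above; the real behavior of a PCIe switch is more nuanced):

```python
# Simplified PLX-bridge model from the explanation above: each GPU behind the
# on-card bridge effectively sees a link the same width as the upstream slot.
def per_gpu_links(slot_lanes: int, gpus_on_card: int = 2) -> list[int]:
    return [slot_lanes] * gpus_on_card

print(per_gpu_links(16))  # x16 slot -> [16, 16]: two identical x16 pathways
print(per_gpu_links(8))   # x8 slot  -> [8, 8]:   two identical x8 pathways
```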
Thank you for the explanation, Crashman.
I'm considering using the x4 slot on my mainboard for airflow reasons in my case, and as of now nothing seems to speak against it. I'm only using a lowly GTX 460 anyway.
The slower your CPU, the more the limit shifts from other components to the CPU. That means the maximum FPS will get dragged down even farther, making the 1680x1050 results look closer to the 1920x1200 results.
Is it a PCIe 2.0 slot? Please read the CrossFire article to see how bad PCIe 1.1 x4 is, and don't use it.