Crossfire makes my PC performance really poor.

GoldenI

Distinguished
Nov 11, 2010
445
0
18,810
Whenever I turn Crossfire on, my FPS drops significantly. I have one Radeon 6850 in the x16 lane and the other in the x4 lane, and I play Battlefield 3 at 1600x900. What's going on? o_O
 

drewnashty

Distinguished
Dec 15, 2011
12
0
18,510
What driver are you using? I've been reading recently about Crossfire issues with Battlefield 3. It seems most people are getting good results with the Catalyst™ 11.10 Version 3 Preview Driver. Link below:

http://support.amd.com/us/kbarticles/Pages/GPU124AMDCat1110PreDriverV3.aspx

CatalystCreator has said that the 11.9 Catalyst Application Profile (CAP) version 3 is supposed to work well but I've read that others had better results without using any CAP.

I would recommend downloading the driver above, uninstalling your current driver, rebooting, and then installing the 11.10 Version 3 Preview driver. I prefer to download the driver first so I don't have to deal with slow scrolling and page tearing while browsing for it with no driver installed. It's just really annoying to me.

I hope this helps.
 
It may be a driver problem; update your drivers.

If not:

Why are you running a 6850 in an x4 slot? I think that PCIe slot is bottlenecking your GPU. Consider upgrading your motherboard. What are your full system specs? If you are upgrading, what's your budget? I might be able to offer some suggestions.
 

drewnashty

Distinguished
Dec 15, 2011
12
0
18,510
He should still see a boost in performance over a single card. Even going from x8 to x16, people typically see only about a 2 percent gain in frames. He could get a new motherboard, but honestly I don't think it's worth the money to buy a whole new board just to squeeze out a few more frames, while simply running a second 6850 should already help his minimum framerate.

PS - GoldenI, I'm a big fan of Bruce, but my avatar has not yet been approved D: I just signed up today, even though I've been visiting Tom's Hardware for nearly a decade now.
 

GoldenI

Distinguished
Nov 11, 2010
445
0
18,810

Lol, thanks.

And it's my understanding that frame rate should only become an issue when I'm playing at a much higher resolution. That's when stuttering starts to kick in.

Also, I just installed the new drivers yesterday (11.12).

I'll try 11.10.
 
-_- I wouldn't be surprised if it's simply the fact that you're using the x4 slot. It is not ideal for Crossfire because the x4 slot runs off the P55 chip rather than directly from the CPU, which introduces latency on top of sharing bandwidth with all those other pesky chipset devices ^_^.
 

drewnashty

Distinguished
Dec 15, 2011
12
0
18,510
Megaman, I can see what you're getting at if he's also using other PCIe devices such as a sound card or RAID controller, but here are a few documented tests showing that the difference in graphics performance is basically negligible.

http://www.hardocp.com/article/2010/08/25/gtx_480_sli_pcie_bandwidth_perf_x16x16_vs_x4x4/1

At resolutions above 2560x1600 (higher bandwidth requirements) there is more impact, and in games where your frame rates are already high there is a bigger increase in FPS. But obviously, if you are running at 110 fps you really won't miss running at 120-130. It's below 60 fps where the extra boost matters for a more enjoyable gaming experience.

http://www.overclock.net/t/967533/crossfire-x16-x4-vs-x8-x8-comparison

It shows barely a full frame's increase in minimum FPS between x16 + x4 and x8 + x8, and the average went up about 4.5 fps. I would imagine a slight further increase going to full x16 + x16, but again, are those few extra frames worth however much it costs to get a legitimate x16 + x16 board? Personally, I'm always a budget-minded PC builder, so I can't justify spending over $100 on a new board for that little an increase in performance.

http://www.tomshardware.com/reviews/pci-express-scaling-p67-chipset-gaming-performance,2887-10.html

http://www.techpowerup.com/reviews/AMD/HD_5870_PCI-Express_Scaling/7.html

The more I search on this topic, the more posts I find from people saying they noticed very little difference.
 
No no no, this case is different. In this case it's like he only has an x2 slot. On P55 boards like this, all 16 of the CPU's lanes are devoted to the graphics slot, providing 16GB/s of bandwidth, but the other four lanes hang off the P55 chipset, which has only a 2GB/s connection to the CPU. That link is used not just by the x4 PCI-E 1.0 slot but by everything else the chipset provides, like audio and networking. Thus it's like he only has an x2 PCI-E 2.0 slot IN THE BEST CASE!!!

http://www.anandtech.com/show/3574

In many of the tests you list above, x4 and below are simulated by blocking off PCI-E lanes, not by using the slower x4 slot hanging off the P55 chipset. As for that bonus page showing x16/x4 performance, it uses the P67 chipset, which has a faster 20Gb/s DMI link to the CPU, and its x4 slot is PCI-E 2.0 rather than the PCI-E 1.1 offered by the P55 board's x4 slot.
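As a rough sanity check on those numbers, here's a quick back-of-the-envelope calculation, assuming the commonly cited per-lane figures of 250 MB/s for PCIe 1.x and 500 MB/s for PCIe 2.0 (per direction). The slot/link labels are just illustrative:

```python
# Assumed per-lane, per-direction PCIe throughput after encoding overhead.
PER_LANE_MBPS = {"1.x": 250, "2.0": 500}

def bandwidth_gbps(lanes, gen):
    """Per-direction bandwidth in GB/s for a slot with the given lane count."""
    return lanes * PER_LANE_MBPS[gen] / 1000

primary_x16 = bandwidth_gbps(16, "2.0")  # CPU-attached x16 slot: 8.0 GB/s each way
chipset_x4 = bandwidth_gbps(4, "1.x")    # P55's x4 slot: 1.0 GB/s each way
dmi_uplink = bandwidth_gbps(4, "1.x")    # DMI uplink: ~1.0 GB/s each way, shared
                                         # with SATA, USB, NIC, audio, etc.

print(primary_x16, chipset_x4, dmi_uplink)  # 8.0 1.0 1.0
```

So even before the other chipset devices take their cut, the second card gets at most an eighth of what the primary slot offers, which lines up with the "x2 in the best case" point above.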
 
Yeah, my Asus mobo has two x16 2.0 slots that run at x8/x8 in CFX, but it also has that x4 slot, which runs off the southbridge and is therefore only a PCIe 1.0 slot. My GT 240 can't even get enough bandwidth there to max out, let alone a 6850.
 

drewnashty

Distinguished
Dec 15, 2011
12
0
18,510


I see. Is this mainly an issue with certain Intel chipsets? My first multi-GPU motherboard was the DFI NF4 LANParty UT Ultra modded to SLI. Considering how old that motherboard is, I would think Intel wouldn't have skimped out so badly.

I bought my first Intel CPU two years back; the motherboard has a P45 northbridge with an ICH10R south. Will the onboard NIC and sound card noticeably eat into bandwidth if I run Crossfire on this board? Or do you have a link explaining the bandwidth allocation for the PCIe lanes? I'll be running a 4870X2 in the first slot and a 4870 in the second, and they should run at x8 + x8. I remember that with every new release of a wider-bandwidth graphics port, there are always tests showing that cards still can't make use of the data rate it provides. Like the jump from AGP 4x to 8x: there really wasn't much of a difference back in the days of the GeForce 4 Ti. I remember upgrading to an AGP 8x GF4 Ti and seeing no difference at all.