Seriously -- high-res 3D CPU-limited with OC'ed D805??

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
Something about my new CrossFire setup seems fishy, and I was wondering if anyone out here knows if it's CPU related.

My D805 is OC'ed up to 3.625GHz. I tested my rig with 3DMark05, and with XFire disabled, I got a score of 9075 (CPU score: 5800). When I enabled XFire, the score only went up to 9469 (and for some reason the CPU score went up to 6274). I didn't expect such a small increase in the overall score, but I figured that since 3DMark runs at a lower resolution of 1024x768, my score could be CPU-bound.

Curious, I loaded up CS:Source and ran its Video Stress Test at a high resolution (1920x1200) to see if that would get me past the CPU bottleneck. With XFire enabled, my rig only managed 103.17fps. With XFire disabled, the fps actually went up to 107.25.

What's going on here? Shouldn't high-res 3D apps be GPU bound? Is a 3.6GHz D805 really that worthless, or do I have a video card problem? Why would disabling XFire raise the fps? All the drivers are clean, soooo... What do you guys think?

Thanks in advance,
Rev
 

Multiplectic

Distinguished
Apr 17, 2006
1,029
0
19,280
Some applications are designed to take advantage of a multi-GPU platform; some aren't. As you can see, 3DMark is optimized for CrossFire/SLI, while CS:S isn't (I guess).

If you want to see CPU-bound performance, you have to drop the resolution to 1024x768 or even lower; 640x480 would be a good choice.
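
To put rough numbers on it, here's a toy frame-time model (plain Python; the millisecond figures are invented for illustration, not measurements from any real card): the CPU cost per frame stays roughly constant, the GPU cost scales with the number of pixels, and whichever is slower sets the fps.

[code]
# Toy model: fps is limited by whichever of CPU or GPU takes longer per frame.
# All numbers below are invented for illustration only.

CPU_MS = 9.0             # CPU time per frame (roughly constant across resolutions)
GPU_MS_PER_MPIXEL = 6.0  # GPU time per million pixels rendered (assumed)

def fps(width, height):
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1e6
    frame_ms = max(CPU_MS, gpu_ms)  # the slower stage sets the pace
    limited_by = "CPU" if CPU_MS >= gpu_ms else "GPU"
    return 1000.0 / frame_ms, limited_by

for w, h in [(640, 480), (1024, 768), (1600, 1200), (1920, 1200)]:
    rate, limit = fps(w, h)
    print(f"{w}x{h}: {rate:5.1f} fps ({limit}-bound)")
[/code]

At 640x480 and 1024x768 the CPU term dominates, so the fps mostly tells you about the CPU; by 1600x1200 and up the GPU takes over.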
 

BaronMatrix

Splendid
Dec 14, 2005
6,655
0
25,790
Something about my new CrossFire setup seems fishy, and I was wondering if anyone out here knows if it's CPU related.

My D805 is OC'ed up to 3.625GHz. I tested my rig with 3DMark05, and with XFire disabled, I got a score of 9075 (CPU score: 5800). When I enabled XFire, the score only went up to 9469 (and for some reason the CPU score went up to 6274). I didn't expect such a small increase in the overall score, but I figured that since 3DMark runs at a lower resolution of 1024x768, my score could be CPU-bound.

Curious, I loaded up CS:Source and ran its Video Stress Test at a high resolution (1920x1200) to see if that would get me past the CPU bottleneck. With XFire enabled, my rig only managed 103.17fps. With XFire disabled, the fps actually went up to 107.25.

What's going on here? Shouldn't high-res 3D apps be GPU bound? Is a 3.6GHz D805 really that worthless, or do I have a video card problem? Why would disabling XFire raise the fps? All the drivers are clean, soooo... What do you guys think?

Thanks in advance,
Rev

The CrossFire driver adds overhead to the CPU. You're at the EXTREME limit of what a dual-core P4 can feed in CrossFire at 1920, while a single X1900 will love running at 1920 with that clock speed.
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
I see... Though, just to be clear, I'd rather not see CPU-bound performance... So in your opinion, my CPU is fast enough that CPU bottlenecking shouldn't be an issue at higher resolutions? Can anyone else out there confirm that the CS:S stress test doesn't take advantage of dual video cards?
 

Multiplectic

Distinguished
Apr 17, 2006
1,029
0
19,280
Even though I can't confirm anything about CS:S, I can tell you this: the higher the resolution, the less CPU-bound it gets.
Or: the higher the resolution, the more GPU-bound it gets.
It's the same thing, from two different points of view. :wink:
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
The CrossFire driver adds overhead to the CPU. You're at the EXTREME limit of what a dual-core P4 can feed in CrossFire at 1920, while a single X1900 will love running at 1920 with that clock speed.

I'm not sure I follow... In a sense, the CPU is causing the 2 video cards to trip over one another?
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
Even though I can't confirm anything about CS:S, I can tell you this: the higher the resolution, the less CPU-bound it gets.
Or: the higher the resolution, the more GPU-bound it gets.
It's the same thing, from two different points of view. :wink:

Exactly -- Which is why I find this situation so puzzling...
 

BaronMatrix

Splendid
Dec 14, 2005
6,655
0
25,790
The CrossFire driver adds overhead to the CPU. You're at the EXTREME limit of what a dual-core P4 can feed in CrossFire at 1920, while a single X1900 will love running at 1920 with that clock speed.

I'm not sure I follow... In a sense, the CPU is causing the 2 video cards to trip over one another?

Yes, that's exactly what I mean. I assume you have an X1900 XT, and benches show it doing very well in a single-card scenario. When you add CrossFire, you do add load to the OS and CPU.

Because the CPU has to feed both cards, the fps decreases slightly. If you got to 4GHz, you would see that difference disappear until 2048, where the CPU is overly strained again. The CPU is responsible for a lot in a game, even though the GPUs do more work.
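
Here it is as napkin math (toy Python with invented per-frame times, not benchmarks of your actual cards): CrossFire roughly splits the GPU work per frame, but the driver adds a bit of CPU work, so if the CPU was already the wall, the fps dips a little.

[code]
# Toy comparison: single card vs. CrossFire when the CPU is already the bottleneck.
# All per-frame times below are invented for illustration, not measurements.

CPU_MS = 9.5                 # CPU time per frame at this resolution (assumed)
XFIRE_CPU_OVERHEAD_MS = 0.5  # extra CPU work per frame from the CrossFire driver (assumed)
GPU_MS_ONE_CARD = 9.0        # one card's render time per frame at 1920x1200 (assumed)

def fps(cpu_ms, gpu_ms):
    # Whichever stage is slower sets the frame rate.
    return 1000.0 / max(cpu_ms, gpu_ms)

single = fps(CPU_MS, GPU_MS_ONE_CARD)
xfire = fps(CPU_MS + XFIRE_CPU_OVERHEAD_MS, GPU_MS_ONE_CARD / 2)  # two cards split the GPU work

print(f"single card: {single:.1f} fps")  # ~105 fps, CPU-bound
print(f"CrossFire:   {xfire:.1f} fps")   # ~100 fps, still CPU-bound and slightly slower
[/code]

Shrink the CPU time per frame (higher clock) and the CrossFire number pulls ahead again, which is the point about 4GHz above.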
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
The CrossFire driver adds overhead to the CPU. You're at the EXTREME limit of what a dual-core P4 can feed in CrossFire at 1920, while a single X1900 will love running at 1920 with that clock speed.

I'm not sure I follow... In a sense, the CPU is causing the 2 video cards to trip over one another?

Yes, that's exactly what I mean. I assume you have an X1900 XT, and benches show it doing very well in a single-card scenario. When you add CrossFire, you do add load to the OS and CPU.

Because the CPU has to feed both cards, the fps decreases slightly. If you got to 4GHz, you would see that difference disappear until 2048, where the CPU is overly strained again. The CPU is responsible for a lot in a game, even though the GPUs do more work.

Interesting... I could potentially push it to 3.8GHz, but I'd be hesitant to try 4.0 on air. Do you think 175MHz would make much of a difference? Would that suddenly free up my GPUs to stretch their legs at high res?
 

BaronMatrix

Splendid
Dec 14, 2005
6,655
0
25,790
The CrossFire driver adds overhead to the CPU. You're at the EXTREME limit of what a dual-core P4 can feed in CrossFire at 1920, while a single X1900 will love running at 1920 with that clock speed.

I'm not sure I follow... In a sense, the CPU is causing the 2 video cards to trip over one another?

Yes, that's exactly what I mean. I assume you have an X1900 XT, and benches show it doing very well in a single-card scenario. When you add CrossFire, you do add load to the OS and CPU.

Because the CPU has to feed both cards, the fps decreases slightly. If you got to 4GHz, you would see that difference disappear until 2048, where the CPU is overly strained again. The CPU is responsible for a lot in a game, even though the GPUs do more work.

Interesting... I could potentially push it to 3.8GHz, but I'd be hesitant to try 4.0 on air. Do you think 175MHz would make much of a difference? Would that suddenly free up my GPUs to stretch their legs at high res?


You wouldn't get a huge increase, but you MAY no longer see a difference between CrossFire and a single card.
1600x1200 would probably behave a little differently. You basically need to see if you can swing a Core 2 if you want real increases, since CrossFire is SERIOUS GPU power. Even though people often say you can get away with a cheap CPU and a great GPU, with a better CPU you'll still see, for example, higher minimums in FEAR and more headroom for multitasking while gaming.
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
Yup, Core 2 is definitely part of the plans for the near future. Thanks for the pointers -- I guess I can relax and not worry about having a dysfunctional CrossFire card.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
I've noticed that Source-based games with HDR disabled don't perform as well with multiple GPUs. In Half-Life 2 my framerate goes down with SLI enabled, but in Half-Life 2: Episode One my framerate almost doubles.
 

Nitro350Z

Distinguished
Apr 19, 2006
416
0
18,780
From what I remember from other threads, Source games don't get a performance increase from CrossFire or SLI the way most other games do; they can actually take a performance hit compared to running a single card.

It seems strange, but I think that's right.
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
Update:

I got CoH up and running on my computer. At 1920x1200 with all the graphical bells and whistles enabled, FRAPS shows I'm getting a little over 20fps regardless of whether CrossFire is enabled or disabled.

Is CoH another game that isn't Xfire-friendly? I guess, once again, my CPU is limiting what the GPUs can process?
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
OK, here's what I don't understand:

According to the Tom's article about overclocking the 805, my current overclock of 3.625GHz should put me about on par with a Pentium EE 955 @ 3.46GHz or an A64 X2 4600+ @ 2.4GHz when it comes to 3D performance.

Here's the link -- check it out.

Is there any chance those processors would limit performance of 2 X1900XTs in XFire at a resolution of 1920x1200??

I guess what I'm saying is that I find it hard to believe this is a CPU issue and not a GPU issue, particularly at that high of a resolution.

Or am I way off base?
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
Alright, here's what MikeMK has to say about CPU-limited XFire over in the 3D Guru forum:

"Just like any multi GPU setup, crossfire is CPU limited to a point, so the faster CPU u have the better. This isnt to say you wont get amazing performance with a 3700+, you will, but of course you would get even better performance with an FX. As i said above, to avoid the bottleneck the best thing to do is to run high res, loads of AA and AF. Thats wot crossfire is designed for. If you are running at a res lower than 1600x1200, there isnt much point in going for a dual GPU setup. A single XTX can handle every game at that resolution."

I'm doing everything he says to avoid the CPU bottleneck.

Should I post this problem over in the video cards section?

Thanks, guys,
Rev
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
Well, I don't have a solution to your problems, but I thought I'd say that with one 1900XT I get 104/5FPS in the CS:S stress test and about 55-60FPS in the HL2:LC stress test at 1920x1200. In 3DMark06 I get around 5400.

I would check your driver installations and also try to compare scores online at Futuremark's site, using the ORB facility I think it's called.

Thanks for the reply.

Looks like we're getting about the same CS:S score, even with my XFire enabled, which is probably not a good sign.

I don't have 3DMark06, but I compared my 05 score on ORB, and a guy running a Pentium D @ 3750MHz (or 125MHz faster than mine) with a 975X mobo and the same graphics setup scored 11197, or roughly 1700 above mine. His CPU score was 6920, or about 650 above mine.

I guess that seems reasonable if 3DMark05 is CPU-limited, but the fact that I'm not seeing improved performance at higher resolutions really troubles me.
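
Napkin math on those numbers (quick Python, using only the figures quoted above):

[code]
# Comparing my 3DMark05 run (XFire enabled) against the ORB result mentioned above.
mine   = {"clock_mhz": 3625, "overall": 9469,  "cpu": 6274}
theirs = {"clock_mhz": 3750, "overall": 11197, "cpu": 6920}

for key in ("clock_mhz", "overall", "cpu"):
    delta = theirs[key] - mine[key]
    pct = 100.0 * delta / mine[key]
    print(f"{key:9}: +{delta} ({pct:.1f}% higher)")
[/code]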

Drivers have been reinstalled several times -- I even went back to the 6.5 version to see if that would help. It didn't.

Maybe I have a defective XFire dongle?

Thanks again.
 

Nitro350Z

Distinguished
Apr 19, 2006
416
0
18,780
I don't mean to insult you, but by any chance did you buy two X1900 XTs and put them in CrossFire, or did you buy one CrossFire master card with a regular X1900 XT as the second card?
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
Could the problem be that my graphics cards are sharing the same IRQ?

I pulled this out of my system info:

IRQ 0 System timer OK
IRQ 4 Communications Port (COM1) OK
IRQ 6 Standard floppy disk controller OK
IRQ 8 System CMOS/real time clock OK
IRQ 9 Microsoft ACPI-Compliant System OK
IRQ 11 Intel(R) 82801G (ICH7 Family) SMBus Controller - 27DA OK
IRQ 13 Numeric data processor OK
IRQ 14 Primary IDE Channel OK
IRQ 16 Intel(R) 975X PCI Express Root Port - 277D OK
IRQ 16 Radeon X1900 CrossFire Edition OK
IRQ 16 Intel(R) 975X PCI Express Root Port - 277A OK
IRQ 16 Radeon X1900 Series OK
IRQ 16 Intel(R) 82801GR/GH/GHM (ICH7 Family) PCI Express Root Port - 27E2 OK
IRQ 16 Intel(R) 82801G (ICH7 Family) USB Universal Host Controller - 27CB OK
IRQ 17 Intel(R) 82801G (ICH7 Family) PCI Express Root Port - 27D0 OK
IRQ 17 Intel(R) 82801GR/GH/GHM (ICH7 Family) PCI Express Root Port - 27E0 OK
IRQ 17 Intel(R) PRO/1000 PL Network Connection OK
IRQ 17 Silicon Image SiI 3114 SoftRaid 5 Controller OK
IRQ 18 Intel(R) 82801G (ICH7 Family) USB Universal Host Controller - 27CA OK
IRQ 18 Creative SB X-Fi OK
IRQ 18 Texas Instruments OHCI Compliant IEEE 1394 Host Controller OK
IRQ 19 Intel(R) 82801G (ICH7 Family) USB Universal Host Controller - 27C9 OK
IRQ 19 Intel(R) 82801GB/GR/GH (ICH7 Family) Serial ATA Storage Controller - 27C0 OK
IRQ 23 Intel(R) 82801G (ICH7 Family) USB Universal Host Controller - 27C8 OK
IRQ 23 Intel(R) 82801G (ICH7 Family) USB2 Enhanced Host Controller - 27CC OK

I really don't know much about IRQs, but I've seen people who have problems with X-Fi cards sharing IRQs with the slave card...

Just throwing it out there...
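
In case it helps, here's a quick throwaway script that groups an msinfo dump like the one above by IRQ so the shared ones stand out (the lines in the string are just a few re-pasted from my list; it's nothing official, just for eyeballing):

[code]
from collections import defaultdict

# A few lines re-pasted from the msinfo IRQ list above ("IRQ <n> <device> OK").
irq_dump = """\
IRQ 16 Intel(R) 975X PCI Express Root Port - 277D OK
IRQ 16 Radeon X1900 CrossFire Edition OK
IRQ 16 Radeon X1900 Series OK
IRQ 18 Creative SB X-Fi OK
IRQ 18 Texas Instruments OHCI Compliant IEEE 1394 Host Controller OK
"""

devices_by_irq = defaultdict(list)
for line in irq_dump.splitlines():
    parts = line.split()
    if len(parts) >= 3 and parts[0] == "IRQ":
        irq = int(parts[1])
        name = " ".join(parts[2:-1])  # drop the trailing "OK"
        devices_by_irq[irq].append(name)

for irq, devices in sorted(devices_by_irq.items()):
    if len(devices) > 1:
        print(f"IRQ {irq} shared by: {', '.join(devices)}")
[/code]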
 

Nitro350Z

Distinguished
Apr 19, 2006
416
0
18,780
You could try disabling some of the onboard USB ports if you're not using them and assigning one of the cards to that IRQ.

It might work, but it might not.

I don't have experience with IRQs, though.
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
No, I doubt that would be your problem, as a lot of CrossFire and SLI boards only have 16 lanes for the graphics cards to share, so they each get the equivalent of 8x PCI-E. Newer ones have 32, so they get the full 16x bandwidth. AFAIK they do not suffer from using 8x PCI-E instead of 16x PCI-E.

I could also be wrong, but I think IRQ conflicts don't happen on a PCI-E bus, only on PCI buses.

Yeah, I figured that was probably the case... I'm just sort of grasping at straws at this point.

It doesn't help that the ATi support guys are jerking me around... One actually hung up on me.

This is frustrating...
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
One quick question: why did you get the Pentium D 805 when for only another $50 you could have had the Pentium D 930? :? The rest of your rig is nice, so don't tell me "I didn't have enough money." :wink:
 

The_Rev

Distinguished
Apr 30, 2006
222
0
18,680
MCG: I'm downloading 3DMark06 as we speak. It should be done in about 4 hours... Yes, my internet sucks.

SS: I haven't tried switching the cards around, since Intel is pretty specific about installing the master card in the top slot... I did, however, take the master out and put the slave in its place to see if the slave works by itself. It does. Thanks for the suggestions, though.

HY27: Two reasons:
1.) I plan on upgrading to Core 2 as soon as I can afford it -- the more $$$ I spend now, the longer I have to wait for it.
2.) THG (and others) said a properly OC'ed 805 would outperform pretty much any chip on the market, so why would I spend more than I had to?

Thanks for the responses, guys. Keep 'em coming.