Seriously -- high-res 3D CPU-limited with OC'ed D805??

July 16, 2006 2:11:04 AM

Something about my new CrossFire setup seems fishy, and I was wondering if anyone out here knows if it's CPU related.

My D805 is OC'ed up to 3.625GHz. I tested my rig with 3DMark05, and with XFire disabled, I got a score of 9075 (CPU score: 5800). When I enabled XFire, the score went up to 9469 (and for some reason the CPU score went up to 6274). I really didn't expect such a small increase in overall score, but I figured that since 3DMark runs at a relatively low resolution of 1024x768, my score could be CPU-bound.

Curious, I loaded up CS:Source and ran its Video Stress Test at a high resolution (1920x1200) to see if that fixed my CPU-bound problems. With XFire enabled, my rig only managed 103.17fps. With XFire disabled, the fps actually went up to 107.25.

What's going on here? Shouldn't high-res 3D apps be GPU bound? Is a 3.6GHz D805 really that worthless, or do I have a video card problem? Why would disabling XFire raise the fps? All the drivers are clean, soooo... What do you guys think?

Thanks in advance,
Rev
July 16, 2006 3:44:16 AM

Some applications are designed to take advantage of a multi-GPU platform; some aren't. As you can see, 3DMark is optimized for CrossFire/SLI, while CS:S isn't (I guess).

If you want to see CPU-bound performance, you have to drop the resolution to 1024x768 or even less; 640x480 would be a good choice.
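To make that test concrete, here's a minimal sketch of how to read a resolution sweep (the fps numbers below are hypothetical, not from anyone's rig): where the framerate stops climbing as you lower the resolution, the CPU has become the ceiling.

# Hypothetical resolution sweep (made-up fps, not anyone's real results):
# read it from high res to low; where fps stops climbing as resolution
# drops, the CPU has become the ceiling.
sweep = [
    ("1920x1200", 103.0),
    ("1600x1200", 121.0),
    ("1024x768", 163.0),
    ("800x600", 167.0),
    ("640x480", 168.0),
]

prev_fps = None
for res, fps in sweep:
    if prev_fps is None:
        label = "baseline"
    elif fps <= prev_fps * 1.03:  # <3% gain from dropping res: CPU-bound
        label = "CPU-bound (fps stopped climbing)"
    else:
        label = "GPU-bound (fps still scales with resolution)"
    print(f"{res}: {fps:.0f} fps -> {label}")
    prev_fps = fps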
July 16, 2006 3:58:59 AM

Quote:
Something about my new CrossFire setup seems fishy, and I was wondering if anyone out here knows if it's CPU related. [...]


The CrossFire driver adds overhead on the CPU. You are at the EXTREME limit of what a dual-core P4 can feed CrossFire at 1920, while a single X1900 will love being at 1920 with that clock speed.
July 16, 2006 4:02:00 AM

I see... Though, just to be clear, I'd rather not see CPU-bound performance... So in your opinion, my CPU is fast enough so that CPU bottlenecking shouldn't be an issue at higher resolutions? Can anyone else out there confirm that the CS:S stress test doesn't take advantage of dual video cards?
July 16, 2006 4:04:55 AM

Even though I can't confirm anything about CS:S, I can tell you this: the higher the resolution, the less CPU-bound it gets.
Or: the higher the resolution, the more GPU-bound it gets.
It's the same thing, from two different points of view. :wink:
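A toy way to picture it (illustrative numbers only, not measurements from this rig): the CPU cost per frame is roughly fixed, the GPU cost grows with the pixel count, and whichever side is slower sets the framerate.

# Toy frame-time model (illustrative numbers, not measurements): the CPU
# cost per frame is roughly fixed, the GPU cost grows with pixel count,
# and whichever side is slower sets the framerate.
def fps(cpu_ms_per_frame, gpu_ms_per_megapixel, width, height):
    gpu_ms = gpu_ms_per_megapixel * (width * height) / 1e6
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

print(fps(9.0, 5.0, 1024, 768))   # ~111 fps: CPU 9.0 ms > GPU ~3.9 ms -> CPU-bound
print(fps(9.0, 5.0, 1920, 1200))  # ~87 fps:  GPU ~11.5 ms > CPU 9.0 ms -> GPU-bound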
July 16, 2006 4:06:47 AM

Quote:
The CrossFire driver adds overhead on the CPU. You are at the EXTREME limit of what a dual-core P4 can feed CrossFire at 1920, while a single X1900 will love being at 1920 with that clock speed.


I'm not sure I follow... In a sense, the CPU is causing the 2 video cards to trip over one another?
July 16, 2006 4:08:00 AM

Quote:
Even though I can't confirm anything about CS:S, I can tell you this: the higher the resolution, the less CPU-bound it gets.
Or: the higher the resolution, the more GPU-bound it gets.
It's the same thing, from two different points of view. :wink:


Exactly -- which is why I find this situation so puzzling...
July 16, 2006 4:12:15 AM

Quote:
The CrossFire driver adds overhead on the CPU. You are at the EXTREME limit of what a dual-core P4 can feed CrossFire at 1920, while a single X1900 will love being at 1920 with that clock speed.


I'm not sure I follow... In a sense, the CPU is causing the 2 video cards to trip over one another?

Yes, that's exactly what I mean. I assume you have an X1900XT, and benchmarks show it doing very well in a single-card scenario. When you add CrossFire, you do add load to the OS and CPU.

Because the CPU needs to do both, the fps decreases slightly. If you were to get to 4GHz, you would see that difference disappear until 2048, when the CPU is overly strained again. The CPU is responsible for a lot in a game, even though the GPUs do more work.
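To sketch what that overhead does (again a toy model with assumed numbers, not ATI's actual driver behavior): give the second card an ideal 2x on the GPU side, but charge the CPU a bit extra per frame for the CrossFire driver. If the CPU was already the bottleneck, fps goes down slightly, which is the same pattern as the CS:S numbers above.

# Same toy model, extended: assume CrossFire ideally doubles GPU
# throughput but adds a per-frame CPU cost for the driver. All numbers
# are assumptions for illustration, not ATI driver internals.
def fps(cpu_ms, gpu_ms_per_mp, width, height, crossfire=False):
    driver_ms = 0.7 if crossfire else 0.0  # assumed CrossFire driver overhead
    gpu_scale = 2.0 if crossfire else 1.0  # assumed ideal 2x GPU scaling
    gpu_ms = gpu_ms_per_mp * (width * height) / 1e6 / gpu_scale
    return 1000.0 / max(cpu_ms + driver_ms, gpu_ms)

# At 1920x1200 with the CPU already near its limit, the second card
# halves the GPU time, yet the extra driver cost lowers the fps:
print(fps(9.3, 4.0, 1920, 1200))                  # ~107 fps, single card
print(fps(9.3, 4.0, 1920, 1200, crossfire=True))  # ~100 fps, CrossFire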
July 16, 2006 4:15:47 AM

Quote:
Because the CPU needs to do both, the fps decreases slightly. If you were to get to 4GHz, you would see that difference disappear until 2048, when the CPU is overly strained again. The CPU is responsible for a lot in a game, even though the GPUs do more work.

Interesting... I could potentially push it to 3.8GHz, but I'd be hesitant to try 4.0 on air. Do you think 175MHz would make much of a difference? Would that suddenly free up my GPUs to stretch their legs at high res?
July 16, 2006 4:37:45 AM

Quote:
Interesting... I could potentially push it to 3.8GHz, but I'd be hesitant to try 4.0 on air. Do you think 175MHz would make much of a difference? Would that suddenly free up my GPUs to stretch their legs at high res?


You wouldn't get a huge increase, but you MAY no longer see the difference between CrossFire and single.
1600x1200 would probably behave a little differently. You basically need to see if you can swing Core 2 to see real increases, since CrossFire is SERIOUS GPU power. Even though people often say you can pair a cheap CPU with a great GPU, with a better CPU you will still see, for example, higher minimum framerates in FEAR, or more headroom for multitasking with games.
July 16, 2006 4:42:40 AM

Yup, Core 2 is definitely part of the plans for the near future. Thanks for the pointers -- I guess I can relax and not worry about having a dysfunctional CrossFire card.
July 16, 2006 5:19:19 AM

I've noticed that Source-based games with HDR disabled don't perform as well with multiple GPUs. In Half-Life 2 my framerate goes down with SLI enabled, but in Half-Life 2: Episode One my framerate almost doubles.
July 16, 2006 5:34:21 AM

From what I remember from other threads, Source games don't get a performance increase with CrossFire or SLI the way most other games do; they can actually show a performance decrease compared to running a single card.

It seems strange, but I think that's right.
July 16, 2006 8:44:37 PM

Update:

I got CoH up and running on my computer. At 1920x1200 with all the graphical bells and whistles enabled, FRAPS shows that regardless of whether CrossFire is enabled or disabled, I'm getting a little over 20 fps.

Is CoH another game that isn't Xfire-friendly? I guess, once again, my CPU is limiting what the GPUs can process?
July 17, 2006 6:12:45 PM

OK, here's what I don't understand:

According to the Tom's article about overclocking the 805, my current overclock of 3.625GHz should put me about on par with a Pentium EE 955 @ 3.46GHz or an A64 X2 4600+ @ 2.4GHz when it comes to 3D performance.

Here's the link -- check it out.

Is there any chance those processors would limit performance of 2 X1900XTs in XFire at a resolution of 1920x1200??

I guess what I'm saying is that I find it hard to believe this is a CPU issue and not a GPU issue, particularly at that high of a resolution.

Or am I way off base?
July 17, 2006 6:56:45 PM

Alright, here's what MikeMK has to say about CPU-limited XFire over in the 3D Guru forum:

"Just like any multi GPU setup, crossfire is CPU limited to a point, so the faster CPU u have the better. This isnt to say you wont get amazing performance with a 3700+, you will, but of course you would get even better performance with an FX. As i said above, to avoid the bottleneck the best thing to do is to run high res, loads of AA and AF. Thats wot crossfire is designed for. If you are running at a res lower than 1600x1200, there isnt much point in going for a dual GPU setup. A single XTX can handle every game at that resolution."

I'm doing everything he says to avoid the CPU bottleneck.

Should I post this problem over in the video cards section?

Thanks, guys,
Rev
July 17, 2006 7:54:59 PM

Quote:
Well, I don't have a solution to your problems, but thought I'd say that with one X1900XT I get 104-105 fps in the CS:S stress test and about 55-60 fps in the HL2:LC stress test at 1920x1200. In 3DMark06 I get around 5400.

I would check driver installations, and also try to compare scores online at Futuremark's site using the ORB facility, I think it's called.


Thanks for the reply.

Looks like we're getting about the same CS:S score, even with my XFire enabled, which is probably not a good sign.

I don't have 3DMark06, but I compared my 05 score at ORB, and a guy running a Pentium D @ 3750MHz (or 125MHz faster than mine) with a 975X mobo and the same graphics setup scored 11197, or roughly 1700 above mine. His CPU score was 6920, or about 650 above mine.

I guess that seems reasonable if 3DMark05 is CPU-limited, but the fact that I'm not seeing improved performance at higher resolutions really troubles me.

Drivers have been reinstalled several times -- I even went back to the 6.5 version to see if that would help. It didn't.

Maybe I have a defective XFire dongle?

Thanks again.
July 17, 2006 9:37:55 PM

I don't mean to insult you, but by any chance did you buy two X1900XTs and put them in CrossFire, or did you buy one CrossFire master card with a regular X1900XT as the second card?
July 17, 2006 9:53:27 PM

Could the problem be that my graphics cards are sharing the same IRQ?

I pulled this out of my system info:

IRQ 0 System timer OK
IRQ 4 Communications Port (COM1) OK
IRQ 6 Standard floppy disk controller OK
IRQ 8 System CMOS/real time clock OK
IRQ 9 Microsoft ACPI-Compliant System OK
IRQ 11 Intel(R) 82801G (ICH7 Family) SMBus Controller - 27DA OK
IRQ 13 Numeric data processor OK
IRQ 14 Primary IDE Channel OK
IRQ 16 Intel(R) 975X PCI Express Root Port - 277D OK
IRQ 16 Radeon X1900 CrossFire Edition OK
IRQ 16 Intel(R) 975X PCI Express Root Port - 277A OK
IRQ 16 Radeon X1900 Series OK
IRQ 16 Intel(R) 82801GR/GH/GHM (ICH7 Family) PCI Express Root Port - 27E2 OK
IRQ 16 Intel(R) 82801G (ICH7 Family) USB Universal Host Controller - 27CB OK
IRQ 17 Intel(R) 82801G (ICH7 Family) PCI Express Root Port - 27D0 OK
IRQ 17 Intel(R) 82801GR/GH/GHM (ICH7 Family) PCI Express Root Port - 27E0 OK
IRQ 17 Intel(R) PRO/1000 PL Network Connection OK
IRQ 17 Silicon Image SiI 3114 SoftRaid 5 Controller OK
IRQ 18 Intel(R) 82801G (ICH7 Family) USB Universal Host Controller - 27CA OK
IRQ 18 Creative SB X-Fi OK
IRQ 18 Texas Instruments OHCI Compliant IEEE 1394 Host Controller OK
IRQ 19 Intel(R) 82801G (ICH7 Family) USB Universal Host Controller - 27C9 OK
IRQ 19 Intel(R) 82801GB/GR/GH (ICH7 Family) Serial ATA Storage Controller - 27C0 OK
IRQ 23 Intel(R) 82801G (ICH7 Family) USB Universal Host Controller - 27C8 OK
IRQ 23 Intel(R) 82801G (ICH7 Family) USB2 Enhanced Host Controller - 27CC OK

I really don't know much about IRQs, but I've seen people who have problems with X-Fi cards sharing IRQs with the slave card...

Just throwing it out there...
July 17, 2006 9:54:22 PM

Quote:
he wouldn't have a dongle if he didn't buy a master card.


Right.
July 17, 2006 9:55:49 PM

You could try disabling some of the onboard USB ports if you're not using them and assigning one of the cards to that IRQ.

It might work, but it might not.

I don't have experience with IRQs, though.
July 17, 2006 10:14:02 PM

Quote:
No, I doubt that would be your problem, as a lot of CrossFire and SLI boards only have 16 lanes for the graphics cards to share, so each gets the equivalent of x8 PCI-e. Newer ones have 32 lanes, so each gets full x16 bandwidth. AFAIK the cards do not suffer from using x8 PCI-e instead of x16.

I could also be wrong, but IRQ conflicts don't happen on a PCI-e bus. That might be wrong, though; I think they only happen on PCI buses.


Yeah, I figured that was probably the case... I'm just sort of grasping at straws at this point.

It doesn't help that the ATi support guys are jerking me around... One actually hung up on me.

This is frustrating...
July 17, 2006 10:19:35 PM

Quote:
I don't have 3DMark06

I'd download it and then report back with your numbers.

-mcg
July 17, 2006 11:11:02 PM

One quick question: why did you get the Pentium D 805 when for only another $50 you could have had the Pentium D 930? :? Looking at the rest of your rig, it's nice, so don't tell me "I didn't have enough money." :wink:
July 18, 2006 12:09:42 AM

MCG: I'm downloading 3DMark06 as we speak. It should be done in about 4 hours... Yes, my internet sucks.

SS: I haven't tried switching the cards around, since Intel is pretty specific about installing the master card in the top slot... I did, however, take the master out and put the slave in its place to see if the slave works by itself. It does. Thanks for the suggestions, though.

HY27: 2 reasons
1.) I plan on upgrading to Core 2 as soon as I can afford it -- if I spend more $$$ now, the longer I have to wait for it.
2.) THG (and others) said a properly OC'ed 805 would outperform pretty much any chip on the market, so why would I want to spend more than I had to?

Thanks for the responses, guys. Keep 'em coming.
July 18, 2006 2:50:37 AM

Quote:
MCG: I'm downloading 3DMark06 as we speak. It should be done in about 4 hours... Yes, my internet sucks.


1 hour to go... 74% done.
July 18, 2006 6:19:43 AM

Ok, well after running tests in 3DMark06, I'm feeling a bit more optimistic about my situation.

With XFire disabled, I scored 5546 (1742 CPU). When I enabled XFire and ran the program, I got a strange video problem: the screen flickered and showed a double image, one on top of the other, with one of the images perpetually crawling up the screen. I let the program run its course, and at the end it came out with a score of 7696 (1719 CPU).

The score looked good, but the image did not, so I rebooted and tried again. This time I got a 7715 (1723 CPU) without any video problems.

Checking ORB, both the disabled and enabled scores seem to fit with what other people are getting with similar rigs...

Now I just wonder why it's not translating into real-world gaming performance. I suppose I should do more testing?

What would you guys recommend?
July 18, 2006 6:55:09 AM

Well, I don't know what happened... Maybe running 3DMark06 gave my video cards the kick in the butt they needed... But for some reason, the CS:S benchmark that yielded 103 fps before now gives me over 152 fps with XFire enabled.

Stunned, I ran 3DMark05 once again and got a 10366 -- significantly higher than the 9400ish I was getting earlier...

What the heck?? I guess I'm happy... we'll see how real games do...

Problem solved?
July 18, 2006 3:15:09 PM

Yeah, I don't know what it could have been... I haven't reseated the cards since early in the troubleshooting stages, so it couldn't have been that... But anyways, I'll run some real-world in-game tests today using FRAPS and let you guys know if this really is some sort of computer miracle.

Thanks for all the help,
Rev
July 18, 2006 3:46:37 PM

I just noticed you have an Intel mobo. Maybe it does have some problems with CrossFire, even if it officially supports it.
July 18, 2006 4:40:35 PM

Quote:
I just noticed you have an Intel mobo. Maybe it does have some problems with CrossFire, even if it officially supports it.


Why would that be?
July 18, 2006 5:20:07 PM

Ummm, I don't know. I said "maybe". :wink:

The logical thing to do is: buy a Crossfire mobo to do Crossfire (or an SLI mobo to do SLI). :tongue:

If you can, get a Crossfire mobo (ATI chipset) to do some testing. :idea:
July 18, 2006 5:27:45 PM

I guess I don't follow you. The 975X Intel chipset is a CrossFire chipset.
July 18, 2006 5:34:42 PM

No, it's an Intel chipset that supports Crossfire.
It's not an ATI chipset that natively supports Crossfire and is designed to do Crossfire.
This is a proper Crossfire mobo, because it has an ATI chipset.
July 18, 2006 5:43:17 PM

I'm sorry, but I think I have a "proper" XFire mobo. Look at the picture on this page, or read up on the 975x chipset.

You'll have to explain to me the nuanced difference between a board supporting XFire and natively supporting XFire... I guess I still don't get it.
July 18, 2006 6:14:09 PM

To illustrate my point even further, look at this page.

ATi says I have a CrossFire-certified motherboard.

What difference does it make in regard to who makes the chipset?
July 18, 2006 6:21:06 PM

I found this on another thread that could be useful:

Quote:
Heck start at the beginning of the xFire portion and read through them all:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=279...
HL2, BF2, Q4 -- all show a dominant position for C2D @ 1600x1200 with full anti-aliasing. How can this be explained? Simple: xFire puts the burden back on the CPU... and since C2D demolishes the FX-62, is it surprising that it now outperforms it? Say you buy the next-gen GPU around Xmas: do you want a CPU-throttled rig to run it? That is what you would evidently get with an AMD-powered system. Did HardOCP explain this??? Nope.


About the matter of a "proper" mobo... Mmm, let me see...
I'll put it like this:
Let's say you buy a Ford Focus RS to compete in rally championships. That car can handle it, it has the necessary power and technology, but it's not designed for it.
What you should buy is a Ford Focus WRC, completely designed and structured to compete in rally championships.
Got it?
It's the same thing if you use a Victorinox to slice your bread. :wink:
July 18, 2006 6:41:43 PM

Thanks for the link.

Also, I guess I see your point about the Intel v. ATi chipsets, but at the same time, I don't think CrossFire support is something that was simply slapped onto 975x as an afterthought. And, hey, when I was traveling through Europe last month, I used my Swiss Army knife to slice bread all the time. Worked great. :wink:
July 18, 2006 7:04:19 PM

Quote:
And, hey, when I was traveling through Europe last month, I used my Swiss Army knife to slice bread all the time. Worked great. :wink:


:lol:  :lol:  :lol:  :lol: 
July 18, 2006 10:21:23 PM

Well, after running several tests, I don't think we can call this a computer miracle yet. I'm getting several bizarre results.

I ran tests in 3DMark06, 3DMark05, 3DMark03, PCMark05, the CS:S Video Stress Test, the HL2:LC Video Stress Test, Ultimate Spider-Man, FIFA '06, and City of Heroes. The Futuremark tests were all the standard free versions; CS:S and HL2:LC were run at 1920x1200. FPS for USM, FIFA, and CoH were all measured with FRAPS. USM and CoH were run at 1920x1200, and FIFA was run at 1600x1200. All 3D settings in Catalyst Control Center were set to "Let application decide."

Here are the results:

Test         XFire enabled                 XFire disabled
3DMark06     7815 (1782 CPU)               5528 (1714 CPU)
3DMark05     10365 (6095 CPU)              9488 (6140 CPU)
3DMark03     29759 (1035 CPU)              17889 (1027 CPU)
PCMark05     6048                          6099
CS:S VST     151.23 fps                    106.78 fps
HL2:LC VST   98.06 fps                     97.31 fps
USM          28.997 fps (min 8, max 31)    29.129 fps (min 8, max 32)
FIFA         59.961 fps (min 58, max 61)   59.879 fps (min 58, max 61)
CoH          24.872 fps (min 13, max 40)   23.863 fps (min 12, max 39)

First of all, the things that look ok:

There were large differences between the XFire-enabled and disabled scores in 3DMark06, 3DMark03, and CS:S. That's what I would expect.

Now the things that don't look ok:

There isn't a very large difference between the 3DMark05 scores (< 1000 marks). I would say this is probably because the low resolution of the test is making it CPU-limited, but 3DMark03, run at the same resolution, shows a vast difference between XFire enabled and disabled.

The HL2:LC test scores are nearly identical -- weird.

In the real games, I think USM and FIFA may be capping framerates at 30 and 60, respectively, but the CoH result, much like the HL2:LC result, makes no sense. Why wouldn't a game benefit from a second GPU with the resolution cranked up that high?
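For what it's worth, the posted numbers boil down to the following speedups (a quick calculation over the results above; USM and FIFA are left out since they look framerate-capped):

# XFire speedup per test, computed straight from the results above.
results = {
    "3DMark06": (7815, 5528),
    "3DMark05": (10365, 9488),
    "3DMark03": (29759, 17889),
    "CS:S VST": (151.23, 106.78),
    "HL2:LC VST": (98.06, 97.31),
    "CoH": (24.872, 23.863),
}

for test, (enabled, disabled) in results.items():
    print(f"{test}: {enabled / disabled:.2f}x")
# 3DMark06 (1.41x), 3DMark03 (1.66x) and CS:S (1.42x) clearly scale;
# 3DMark05 (1.09x), HL2:LC (1.01x) and CoH (1.04x) barely move.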

I feel like we're getting closer, but I'm still not sure what's going on here. What do you guys think?

Thanks,
Rev
July 18, 2006 10:29:17 PM

Try disabling V-Sync if you haven't already. Sometimes it's worth a try.
July 18, 2006 10:35:38 PM

Quote:
I think there is something not right with your CrossFire, whether it is on or off.

I mean, your system isn't that much different from mine, at least when C/F is off, but you got 40 fps more in the HL2:LC stress test; CrossFire would have to be enabled to get that. I am assuming you have HDR on. For FIFA and USM, go into the CCC and make sure v-sync is set to off completely. Also, what is your mipmap quality set to? Do you use HQAF or adaptive AA?


Let's see... v-sync is set to "off, unless app specifies." I can change that... High quality mipmap... HQAF and adaptive AA boxes both checked...

Is it time to try re-downloading CCC and drivers to see what happens?
July 18, 2006 10:39:02 PM

OK, V-Sync is off completely, but I still can't break 30fps in USM, even at lower resolutions.
July 18, 2006 10:44:26 PM

Are you referring to HDR in the HL2:LC test?
July 18, 2006 10:48:26 PM

The difference between 103 and 107 fps is so small it's only normal to see... that's what, about a 4% change? Nothing to be overly concerned about.

Although, yes, it is true that the higher you scale the resolution, the more GPU-bound you become. This also works the other way around: as you scale back down toward lower res, you'll notice a CPU-bound trend, regardless of the setup.

Where you'll notice the most significant improvement with an XFire setup is at higher res with AA/AF. Turn it all on, run some benchmarks, and compare them to runs without those settings enabled. But also make sure V-Sync is disabled; it'll cap your max frames at the monitor's current refresh rate. I know it sounds redundant, but you'd be surprised how many people wonder why they're capped at 60 or 85 fps, or even 100... only to see v-sync = enabled.
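A rough sketch of why the v-sync cap bites harder than people expect (assuming double buffering with no triple buffering): a frame that just misses a refresh has to wait for the next one, so the visible framerate snaps to the refresh rate, then half of it, then a third, and so on.

# Rough sketch of the v-sync cap (assumes double buffering, no triple
# buffering): a frame is held until the next refresh boundary, so the
# visible fps quantizes to refresh, refresh/2, refresh/3, ...
import math

def vsynced_fps(raw_fps, refresh_hz=60.0):
    frame_ms = 1000.0 / raw_fps
    refresh_ms = 1000.0 / refresh_hz
    intervals = math.ceil(frame_ms / refresh_ms)  # whole refreshes per frame
    return 1000.0 / (intervals * refresh_ms)

print(vsynced_fps(152.0))  # 60.0 -> capped at the refresh rate
print(vsynced_fps(55.0))   # 30.0 -> just missing 60 Hz halves you to 30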
July 18, 2006 10:52:15 PM

Quote:
Hmmm, is that game related to C&C Generals? The exact same thing happens in that: I think the game itself puts a 30fps cap on it. Of course, with Generals it also has a habit of slowing to a crawl no matter what computer I play on.

EDIT: yes, the HDR in the Lost Coast demo.


OK, HDR was disabled when I tested. I set it to full with XFire disabled and ran it again. Got about the same result -- 96.17 fps. I'll try enabling XFire and running it again.
July 18, 2006 10:55:38 PM

Quote:
OK, HDR was disabled when I tested. I set it to full with XFire disabled and ran it again. Got about the same result -- 96.17 fps. I'll try enabling XFire and running it again.

In all honesty, sorry -- I replied at the start of the thread and came back later to post, and my earlier reply isn't much use now, but I'd check for driver-related issues.

If you're running CCC for ATi together with ATi Tray Tools, you'll run into issues as well. Best bet is to make sure everything's up to date and work forward slowly toward a solution. The biggest thing is this: if you can run a game at max settings (or whatever you choose, or whatever current tech allows... obviously no one's running Oblivion maxed) at respectable frames, then all's well.

If you're looking for sheer numbers, which '05 and '06 are great for, I'd check the BIOS first. Specific settings can really hamper overall system performance. If you notice booting is a little slower with tighter memory latencies, give it a small voltage boost; that can also help stability.

Also, make sure the correct profile is set up for each individual game. I know ATi is a bit more finicky about which apps can run CrossFired.
July 18, 2006 11:03:08 PM

With XFire and HDR enabled, my score on the HL2:LC test drops to 93.86 fps. Seems strange.

hellcatjr: Are you saying I should tighten my memory timings? Also, what's this about XFire profiles for individual games?
July 18, 2006 11:18:13 PM

Try something that is

a) heavily GPU-bound

b) proven to get a substantial benefit from CrossFire

It's called Oblivion :p
July 18, 2006 11:22:00 PM

Quote:
Try something that is

a) heavily GPU-bound

b) proven to get a substantial benefit from CrossFire

It's called Oblivion :p


Can you loan me 50 bucks? But seriously, any game released in the last year or two should be fairly GPU-bound when played at a resolution as high as 1920x1200, right?? 24 fps in CoH, which is a fairly graphically intense game in its own right, just seems totally unacceptable.