
Quad GPU - Strike two?

Last response: in Graphics & Displays
March 7, 2008 11:29:17 PM

http://www.tweaktown.com/articles/1319/radeon_hd_3870_x...

Pretty much nothing changed besides the 3DMark06 scores. Is it just ATI's drivers, or will quad-GPU configurations in general fail yet again?


March 7, 2008 11:36:15 PM

Well, seeing as that's an old driver, who knows? Maybe the article was a waste of time: "Drivers: Catalyst 8.2, 2-22-08 dated CrossfireX driver"
March 7, 2008 11:46:13 PM

Screw 3DMark; real games are what matter. And so far, I believe it's just the drivers.
March 7, 2008 11:51:53 PM

If you haven't noticed, two cards hardly give any boost over one either; in fact, in Crysis it's lower. They've cocked something up, but yeah, most likely drivers.


Just shows that people are so eager to cut down quad GPUs that they miss the obvious. Catalyst 8.3 is out now anyway.
March 8, 2008 1:02:27 AM

OK, so they proved they could test with out-of-date drivers. That's probably down to the lag between when the testing was done and when the article was published. To do it right, the test should be repeated with the latest drivers.

More interesting to me was this statement in the "Final Thoughts": "We're going to start running into CPU limitations again with this kind of setup. We can see this with the small differences being seen between 1280 x 1024 and 1920 x 1200." Put another way, they downed the card as giving only a small performance gain when the real problem was elsewhere: the CPU. That hints at the need for a better CPU, as well as better drivers, before a retest.
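The reviewers' reasoning can be put as a simple heuristic (the frame rates and the 10% threshold below are illustrative assumptions, not numbers from the article): raising resolution mostly loads the GPU, so if the frame rate barely drops at a higher resolution, the bottleneck is probably elsewhere, usually the CPU.

```python
# Rough sketch of the "CPU limitations" argument: if going from a low to a
# high resolution costs almost no fps, the GPU isn't the limiting factor.
# The 10% threshold is an illustrative assumption, not a standard figure.
def looks_cpu_limited(fps_low_res: float, fps_high_res: float,
                      threshold: float = 0.10) -> bool:
    """True if the fps drop from low to high resolution is under ~10%."""
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return drop < threshold

# Hypothetical numbers: nearly identical fps at 1280 x 1024 and 1920 x 1200.
print(looks_cpu_limited(62.0, 59.5))  # -> True (looks CPU-limited)
print(looks_cpu_limited(62.0, 38.0))  # -> False (the GPU is the limiter)
```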
March 8, 2008 2:43:02 AM

Maybe it's their way of pushing for better test gear, heheh.
March 8, 2008 2:49:32 AM

Could be. I can just see them saying, "Hey, Intel, we need something newer than this year-and-a-half-old CPU. Got any new Yorkies you'd like to toss our way?"
March 8, 2008 3:05:11 AM

And they bashed the competition at the same time.... Hmmmm
March 8, 2008 3:14:37 AM

If I were someone willing to drop a crazy amount of cash on that sort of thing, I'd cross my fingers for a 4870 X2, because by then the drivers should be worked out, and I think you'd be at the top of the performance pile until at least November, maybe into '09.
March 8, 2008 3:23:07 AM

JAYDEEJOHN said:
And they bashed the competition at the same time.... Hmmmm


Yes, suspicious that. :whistle: 
March 8, 2008 4:15:15 AM

There are certainly a lot of mixed signals though.

You read reviews like this with old drivers and think the thing is great. http://firingsquad.com/hardware/amd_radeon_hd_3870_x2_p...

Then you see the new reviews.
Here it does okay: http://firingsquad.com/hardware/amd_crossfire_x_hybrid_...

Yet here it's really given a bad vibe: http://enthusiast.hardocp.com/article.html?art=MTQ3MCw0...

What's a potential buyer to think? Considering that different rigs will give different results it's not like everyone can blow $450 and hope for the best.

Having to put this much faith on "the drivers just need to be polished" seems like a risky angle.

I've been an ATI buyer for a long time, but it's hard to really cheer them on when it looks like Nvidia has all the bases covered.

I'm still half-tempted to just get the 3870 X2 and eventually add a 3870 in CFX (3-way) and hope for the best. However, my gut keeps telling me to just get an 8800GTS and be done with it. It will be cheaper even if it's not the "potentially" more powerful X2.

(Looking at X38 Maximus Formula for eventual 1920x1200 DX10 gaming :p  / Currently 1280x1024)
March 8, 2008 4:28:53 AM

But when releasing benches on new tech, shouldn't you have the best/correct setup? It'd be like Intel letting out a CPU before any of the errata were dealt with, and someone posting a review about how it doesn't perform right. This IS new tech, especially for ATI. If you honestly don't expect improvements from this, then what would it take? I understand being wary of dumping a lot of money on something you're not sure about, but for these reviewers to make such broad statements about a newer tech that hasn't even had a chance to mature, without even a standardized driver, is, to me, absurd. At [H] they make a halfway fair assessment of the card(s), but in the end they surprise us with "it's a day late and a dollar short," when the competition doesn't even have anything comparable out. Makes me wonder....

This new tech is for the pioneers with a fat wallet, which most have, or who spend most of their free money on our hobby. Something new is always around the corner, no guarantees, etc. I for one will let them explore this area; as for everyone else who knocks it or makes general negative statements, I'll either ignore them or lambast them if they're a writer/reviewer or spreading FUD to others here. Anyway, I know what you mean, but it's new; we need more time.
March 8, 2008 4:46:34 AM

Which would be fine if it were only the 3870 X2, but I've seen the same problems with just the single 3870 by itself. Is that also still considered new tech? These cards don't seem to do well in many DX10 games, Lost Planet being the one that stands out the most. That's kinda funny, as I read that they are built more for DX10 and are "ahead" of the game with 10.1. There seems to be a disconnect between the marketing department and the actual performance.

I think this clearly shows it is the drivers. It seems each game has to be optimized on AMD's end. I used to think it was the game developers that did that part. I guess (if I bought an ATI card) I would just have to hope that AMD eventually gets around to optimizing whatever game I'm playing at the time. If they don't, then *shrug* oh well!

I'll admit that I've been out of the loop and my last ATI card was an X800, which was a great card at the time to me.

Unfortunately, I need to build a new machine now due to certain personal circumstances. So I only have what is currently available to choose from. Of course I don't want to just throw any old part in the box. I don't seem to remember there being this much of a split in performance from one game to another back with my older card. Like you say, though, this is new technology.

I'm sure I'm not the only one that is a little put off (confused) by these mixed reviews.

I'm sort of playing devil's advocate here, as I really want to like ATI's current offering. But with the AMD takeover, all the games with the Nvidia logo, etc., it's a battle between my brain and my gut, it seems. Haha, sounds silly, I know.
March 8, 2008 5:43:00 AM

Morbius said:

What's a potential buyer to think? Considering that different rigs will give different results it's not like everyone can blow $450 and hope for the best.

Having to put this much faith on "the drivers just need to be polished" seems like a risky angle.

I've been an ATI buyer for a long time, but it's hard to really cheer them on when it looks like Nvidia has all the bases covered.

I'm still half-tempted to just get the 3870 X2 and eventually add a 3870 in CFX (3-way) and hope for the best. However, my gut keeps telling me to just get an 8800GTS and be done with it. It will be cheaper even if it's not the "potentially" more powerful X2.



We'll see if Nvidia has all the bases covered when the 9800gx2 arrives. From what I've read, triple SLI has more issues than CrossfireX. Both companies have had driver issues and both have gotten their act together down the line. It's a myth that Nvidia never needs driver improvements or that driver improvements don't help ATI.

The thing that I want to do later on is get a 790 board, a Phenom 9750 and a 4850 (that's also clocked at 850) for CrossfireX alongside my 3870x2.

The core clock in CCC Overdrive for both GPUs is 850 (and I bumped it up to 860 safely). The memory clock is 901 (but MSI markets it as 1800 :lol:  see the link: http://www.msicomputer.com/product/p_spec.asp?model=R38... ), and I bumped it up to 931. Yet the best I can get in 3DMark06 is 9547, and the core clock is reported as 421! Are they dividing the 850 in half?

The card is also a "deactivated item" at Newegg: http://www.newegg.com/product/product.aspx?Item=N82E168... I ordered it on February 1st; it went up to $499 a few days later, so I thought I'd gotten a good deal; then it disappeared. That doesn't sound good. At least mine hasn't toasted with heavy gameplay this past month. Once I got the right PSU, it runs at 67C and has never gone above 76C while testing under CCC. Still, there was that 421 reported core clock in 3DMark06.

So, I downloaded GPU-Z and took a look. It shows that Crossfire is enabled (I'm using Catalyst 8.3), but both GPUs are listed as clocked at 421, with the default for the first as 431 and the default for the second as 850. That makes me think I'm not just CPU-limited at my 1024 x 768 CRT resolution, but that one GPU clocks too low.

If the 850 GPU slows to 431 to match the second, then it's like I paid $449 for Crossfire performance that really only equals one 3870. Do I have a bad 3870x2? Perhaps someone who knows GPU-Z and has a 3870x2 can help me identify the problem.

Someone said that GPU-Z might be reading 2D clocks, but CCC shows those as 300 for each core, so I don't think that's it. Add in 3DMark's reading of the clocks, and I want to RMA it to MSI. I'll e-mail their tech support about it tomorrow. Another thing: Auto-Tune always freezes, though I can manually set clocks that pass.
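The guesswork here (is 421 the 2D clock, or roughly half of the 850 set in CCC?) boils down to asking which known clock the reported value sits closest to. A tiny sketch of that comparison, purely illustrative, with no real GPU-Z or CCC API assumed:

```python
# Which known clock is a monitoring tool's reported value closest to?
# This only formalizes the comparison being made in the post; the numbers
# come from the post, and no real GPU-Z/CCC interface is assumed.
def classify_reported_clock(reported_mhz, set_3d_mhz, idle_2d_mhz):
    candidates = {
        "3d": set_3d_mhz,           # the clock set in CCC Overdrive
        "half-3d": set_3d_mhz / 2,  # what a divide-by-two readout would show
        "2d-idle": idle_2d_mhz,     # the desktop/idle clock
    }
    # Pick the candidate with the smallest distance to the reported value.
    return min(candidates, key=lambda k: abs(candidates[k] - reported_mhz))

# Numbers from the post: CCC says 850, the 2D clock is 300, tools report 421.
print(classify_reported_clock(421, 850, 300))  # -> "half-3d"
```

On these numbers, 421 is far closer to 850 / 2 = 425 than to the 300 MHz 2D clock, which fits the "are they dividing the 850 in half?" suspicion better than the 2D-clock theory.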

I'm getting decent performance in The Witcher and Oblivion, but low performance in LOTR Online. Once I get a 20" LCD next week, I'll be able to compare my FRAPS benchies with those on numerous review sites a bit more directly. I'm actually inclined not to RMA it until June, when I can get that 4850. Then I won't be stuck with just the 690V IGP for a few weeks.

Anyone have any similar experiences? Tomorrow, when I'm home, I can post the actual screenshots of GPU-Z and 3DMark06, but right now I'm at work and Photobucket's blocked.
March 8, 2008 6:32:38 AM

I'd say that's a very low score. I've seen that card hit 13k on a lot of sites. The 3870 X2 seems to do really well with 3Dmark. Not always so much in real games depending on where you read.
March 8, 2008 8:07:26 AM

Morbius said:
I'd say that's a very low score. I've seen that card hit 13k on a lot of sites. The 3870 X2 seems to do really well with 3Dmark. Not always so much in real games depending on where you read.


I know it's a low score. I thought I was CPU-limited, but what if both GPUs are in Crossfire mode with clocks at 421 instead of 850? I don't feel happy paying $449 for the Crossfire performance of a couple of 3650s. Is one GPU defective, or are GPU-Z and 3DMark06 misreading the GPU clock? CCC reports both as 860.

Has anyone seen this kind of reporting with GPU-Z and 3DMark before? One GPU on the card at 421 actual clock with 421 as the default, and the other at 421 actual clock with 850 as the default. That seems very strange to me.
March 8, 2008 4:12:48 PM

Morbius said:
Which would be fine if it were only the 3870 X2, but I've seen the same problems with just the single 3870 by itself. Is that also still considered new tech? These cards don't seem to do well in many DX10 games, Lost Planet being the one that stands out the most. That's kinda funny, as I read that they are built more for DX10 and are "ahead" of the game with 10.1. There seems to be a disconnect between the marketing department and the actual performance.

I think this clearly shows it is the drivers. It seems each game has to be optimized on AMD's end. I used to think it was the game developers that did that part. I guess (if I bought an ATI card) I would just have to hope that AMD eventually gets around to optimizing whatever game I'm playing at the time. If they don't, then *shrug* oh well!

I'll admit that I've been out of the loop and my last ATI card was an X800, which was a great card at the time to me.

Unfortunately, I need to build a new machine now due to certain personal circumstances. So I only have what is currently available to choose from. Of course I don't want to just throw any old part in the box. I don't seem to remember there being this much of a split in performance from one game to another back with my older card. Like you say, though, this is new technology.

I'm sure I'm not the only one that is a little put off (confused) by these mixed reviews.

I'm sort of playing devil's advocate here, as I really want to like ATI's current offering. But with the AMD takeover, all the games with the Nvidia logo, etc., it's a battle between my brain and my gut, it seems. Haha, sounds silly, I know.


You're forgetting that Lost Planet is utter sh#t.

On a more sensible note, the first games with DX10 support were Call of Juarez and Lost Planet, both made with one manufacturer's cards in mind: the first for ATI, the second for Nvidia. Neither runs great on both makers' cards, only on the one it was 'intended' for. Seeing as they're now quite old, a little irrelevant, and OK to play at best, is it any wonder driver profiling for them hasn't been a major priority for ATI and Nvidia?

ATI's DX10 drivers are coming along slowly. DX10 performance with AA off is actually on a par with Nvidia's, I would say; putting AA on really kills them. But then again, that's me thinking of World in Conflict as an example, and that's firmly in 'The Way It's Meant to Be Played' program. DX10 is really so young and so much a work in progress, as is Vista, that it shouldn't be regarded as that important until we've even seen the first DX10-only game. And that's an Alan Wake (an eternity, lol) away.

Quad CrossFire just came out this week, and you expect it to be firing on all cylinders ALREADY!?!? Has experience not taught you anything about new technology?
March 8, 2008 9:14:11 PM

Well, that's fine, and I wish AMD/ATI a lot of luck. It seems, for me right now, I can't take the chance on new tech like this. Looks like I'll be getting two 8800GTSs.

Wait, I think I hear my wallet crying in the corner. ;P
March 8, 2008 9:25:07 PM

I'm not saying the R600 doesn't have its issues; it's more the X2 that I'm talking about. I'm thinking the 4xxx series will show much greater promise, both single and X2, and hopefully quad CrossFireX will work out. Looking at Anand's review, it shows that scaling in Oblivion works quite well, so maybe it'll all work eventually. My fear is that, with the R600 architecture being somewhat poorly distributed, the drivers have to be tweaked for each game. My hope is that the 4xxx series will not only eliminate that, but provide the performance that was supposed to be had on the R600s.
March 8, 2008 10:45:32 PM

spoonboy said:

Quad crossfire has just come out this week, and you expect it to be firing on all cylinders ALREADY!?!? has experience not taught you anything about new technology.....



You mean like NV's last quad SLI launch? [:mousemonkey:2]
March 9, 2008 6:13:49 AM

SpinachEater said:
http://www.sapphiretech.com/en/forums/showthread.php?t=...

That sounds like a GPU-Z problem, according to others ^ Did you look in RivaTuner?


Thanks, that's the answer I needed. I unlocked Overdrive, and it shows two GPUs that I can set clocks for manually. When I tried "All" in Auto-Tune, it showed two GPUs but froze. Both are stable at 860/931.

The odd thing is 3DMark06 does the same thing, and even does it for the high end system on the same page:

Note the 425/425 clock for mine and the 432/1021 for the other guy's. In his description he overclocked his 2900XTs in Crossfire, so his core clock is no more 432 than mine is 425.

I'll check out RivaTuner; I haven't used it before. At least now I know it's just CPU-limited (but I'll be getting a 20" LCD next week). Too bad the 9750s won't be out until May!

Morbius said:
Well, that's fine, and I wish AMD/ATI a lot of luck. It seems, for me right now, I can't take the chance on new tech like this. Looks like I'll be getting two 8800GTSs.

Wait, I think I hear my wallet crying in the corner. ;P


Your wallet's crying because the word on the street is that bigger is better. Take a chance on that 9800gx2 with two PCBs. It's got to be better than a competitor's card with only one PCB :lol:  In all fairness, I think the 9800gx2 will be faster in most benchies, but that double PCB is just a bad idea for a card in this generation. Nvidia could have done better.

Really, if you have room on your motherboard, you could actually give a 9800gx2 and a 9800gtx a try in Triple SLI. That's what I want to see compared to two 3870x2's in CrossfireX.

Okay, I just checked things out in RivaTuner's Hardware Monitor, and the clocks are about where CCC says they are: idle mode for the GPU and regular for the RAM. Under what conditions would they show the 860/931 set under CCC? Do I run this in the background while gaming to generate a report?

March 10, 2008 2:30:01 AM

Yeah, leave that open and then run something to tax your GPUs. You can go back, right-click on the core clock graph, and select "mark max core clock," and it will tag it for you.

I think... not 100% sure, yet pretty sure... that 3DMark records your specs when you load it up, so since you're just at your desktop and most likely at GPU "idle" clocks, it will record the lower ones.
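The "leave the monitor open, run a load, then mark max core clock" workflow can be sketched as a polling loop. `read_core_clock_mhz` below is a hypothetical stand-in for the sensor query a tool like RivaTuner does internally; its values just mimic the idle (300), misread (421), and full 3D (850) clocks from this thread.

```python
import random
import time

# Hypothetical stand-in for a real driver/sensor query; the values mimic
# the idle clock (300), the misreported clock (421), and the 3D clock (850)
# discussed in this thread. No real monitoring API is assumed here.
def read_core_clock_mhz() -> int:
    return random.choice([300, 421, 850])

# Sketch of what "mark max core clock" captures: poll the clock repeatedly
# while a load runs and keep the highest value seen.
def monitor_max_clock(samples: int = 50, interval_s: float = 0.0) -> int:
    max_seen = 0
    for _ in range(samples):
        max_seen = max(max_seen, read_core_clock_mhz())
        time.sleep(interval_s)
    return max_seen
```

This is also why sampling at the wrong moment misleads: a single reading taken at the desktop catches the idle clock, while only polling through a gaming session catches the peak.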