ATI X850XT CrossFire vs X1800XT

pauldh

Illustrious
I wouldn't doubt that the X850XT CrossFire setup could at times match the X1800XT in performance, but IMO there is a clear winner. The X1800XT has a feature-set advantage that dual X850XTs just can't match. And while I think the "buy a 6600GT over an X800GTO, or a 6800 over an X800XL, because they have SM3.0" argument is just plain stupid (going with slower cards just for features they are too weak to use), by the time you're talking dual X850XTs vs. an X1800XT, you just plain want an SM3.0 card, as you have the power to get something out of it. Dual X850XTs could rock at Oblivion, but they will never let you turn HDR on, which looks great and is the way you want to play with $250+ worth of GPU(s).

Anyway, which is better... the X1800XT for sure, IMO, whether it takes the performance crown or not. I'd love to play around with CrossFire, seeing how X850XT master cards are so cheap from ATI right now, but it's not the better choice of the two. If you are in the USA looking to buy, get a $325 X1900XT, or if that's too much, a sub-$250 256MB X1800XT or 7900GT.
 

NightlySputnik

Distinguished
Mar 3, 2006
638
0
18,980
I'd go for the X1800XT. It will give you way better performance, and support for Pixel Shader 3.0 is gonna be more than helpful in newer games.

Now, if you meant the X1800XL, I think X850XT CrossFire might perform slightly better at high resolution, but I would avoid it because of its limitation to PS2.0, which might be a problem in the near future.

My 2 cents.
 

NightlySputnik

Distinguished
Mar 3, 2006
638
0
18,980
Dude, the Inq?? WTF are you smokin'? :lol: :tongue:

I would go for the X1800XT, also. Better feature set, and the card is just GREAT.

What have you all got against the Inquirer? :?: :) No offence, prozac26. :wink:

In this case, the link is perfect for the question above. Great link, pauldh!
 

NightlySputnik

Distinguished
Mar 3, 2006
638
0
18,980
Prozac26, it's a point of view I respect. I just don't know why you say they're a bunch of fanboys. I know I'm off topic, but could you answer?

I never felt they were rough with anybody who didn't deserve it, but I might be wrong.

And by the way, even though I'm a French speaker (I'm from Montreal, Canada), it's: "who can't write properly." :wink: :twisted:

No offence I hope! :)
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
They're stupid AMD fanboys; they make up stories like "AMD has this and it's much better than Intel." If you read the CPU section daily, you'd know.
 

Cody_7

Distinguished
Jun 12, 2004
172
0
18,680
Dual X850XTs could rock at Oblivion, but they will never let you turn HDR on, which looks great and is the way you want to play with $250+ worth of GPU(s).

Well, if Oblivion requires SM 3.0 to run HDR, then the people who designed that game are idiots. HDR runs fine on 2.0.

Although I concur. Get the X1800 over the dual X850s!
 

NightlySputnik

Distinguished
Mar 3, 2006
638
0
18,980
I'll leave it like this, so no need to answer.

But lately they were pretty much singing Intel's praises over the Core architecture, not the other way around, so it comes back to what I was saying: they blast those who deserve it.

In case you think I'm a fanboy myself, I presently use a P4 3.0GHz (800MHz FSB) Northwood, but I would have gone with an A64 if I had known I was gonna game so much.

Respect! :wink:
 

NightlySputnik

Distinguished
Mar 3, 2006
638
0
18,980
Dual X850XTs could rock at Oblivion, but they will never let you turn HDR on, which looks great and is the way you want to play with $250+ worth of GPU(s).

Well, if Oblivion requires SM 3.0 to run HDR, then the people who designed that game are idiots. HDR runs fine on 2.0.

Although I concur. Get the X1800 over the dual X850s!

Unless I'm wrong, he's right, because I remember using HDR in an HL2 demo six or so months ago. I know I turned it on in the display settings, so I think it was on. It did look amazing, at least.

Can anybody say?
 

Fagaru

Distinguished
Jun 15, 2006
238
0
18,680
Yeah, the X1800XT is better... and then down the road, when DX10 cards come out and it drops in price, you can CrossFire it and be pleased that you didn't CrossFire the X850XT. And I was wondering: how much does the X850XT CrossFire Edition cost? And are all X850s CrossFire compatible?
 

Cody_7

Distinguished
Jun 12, 2004
172
0
18,680
Unless I'm wrong, he's right, because I remember using HDR in an HL2 demo six or so months ago. I know I turned it on in the display settings, so I think it was on. It did look amazing, at least.

Can anybody say?
Actually, they released HL2: Episode One, which is designed completely in HDR... Just played it yesterday on my SM 2.0 X800XL card. I don't see what designing HDR for an SM 3.0 card accomplishes... We were actually having a discussion about HDR in Episode One Here

And are all X850s CrossFire compatible?
Yes. You can go down to the X800 in a CrossFire setup.
 

pauldh

Illustrious
Dual X850XTs could rock at Oblivion, but they will never let you turn HDR on, which looks great and is the way you want to play with $250+ worth of GPU(s).

Well, if Oblivion requires SM 3.0 to run HDR, then the people who designed that game are idiots. HDR runs fine on 2.0.

Although I concur. Get the X1800 over the dual X850s!

There are different ways of implementing HDR. The method used in HL2 Lost Coast and Episode One is doable on any DX9 card. The method used in Oblivion is OpenEXR HDR, and it is only possible on SM3.0 cards. Other games I believe are using this method of HDR: Far Cry, Serious Sam II, Age of Empires 3, and Splinter Cell: Chaos Theory. So in those games, you need an SM3.0 card to run HDR. And to be honest, you need a fairly beefy SM3.0 card, or the framerates/sacrifices just aren't good enough to bother.
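
If you're curious how a game can actually tell the two apart, here's a rough Direct3D 9 sketch (my own illustration, not actual code from any of these games) of the capability check an engine could run at startup. OpenEXR-style HDR leans on an FP16 (D3DFMT_A16B16G16R16F) render target the card can blend into, which the X800/X850 series can't do:

#include <d3d9.h>

// Can this card do OpenEXR-style (FP16) HDR? Illustrative sketch only.
bool SupportsFP16HDR(IDirect3D9* d3d, D3DFORMAT adapterFormat)
{
    // 1) Can we create a 16-bit-per-channel float render target at all?
    //    (The X800/X850 series actually passes this part.)
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, adapterFormat,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);
    if (FAILED(hr))
        return false;

    // 2) Can the hardware alpha-blend into that FP16 target?
    //    This is where the X800/X850 fails and NV40+/R520 passes.
    hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, adapterFormat,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);
    return SUCCEEDED(hr);
}

If that second check fails, the fallback is to do the tone mapping in a ps_2_0 shader on an integer render target, Valve-style, which is exactly why Lost Coast and Episode One HDR still run on the X800 series.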
 

Cody_7

Distinguished
Jun 12, 2004
172
0
18,680
Ah, I see. I actually own Splinter Cell: Chaos Theory, and I believe it's one of the most graphically jaw-dropping games around. And it actually does support HDR on SM 2.0. Although, in Chaos Theory, it doesn't seem to work very well... it's sort of glitchy.

Sometimes I'll walk into a well-lit room and see no eye adjustment or lighting change from HDR whatsoever. And I think it responds to which way your character (Sam) is looking... not to which way the camera is looking, which can be completely different at times. They should wait to develop games for SM 3.0 until cards can truly handle it at good framerates.

P.S. @ Heyyou27: In HL2 with HDR enabled, you can run anti-aliasing on any card. They shouldn't make something like that exclusive to Nvidia in any game...
 

pauldh

Illustrious
Ah, I see. I actually own Splinter Cell: Chaos Theory, and I believe it's one of the most graphically jaw-dropping games around. And it actually does support HDR on SM 2.0. Although, in Chaos Theory, it doesn't seem to work very well... it's sort of glitchy.
Yeah, you got me on that one. I forgot that there was supposedly HDR with the patch via an SM2.0 path. It's even right in the advanced video menu, not command-line activated like Far Cry's HDR. Right? Back when I played it, that wasn't the case. I found HDR hammered my 6800U way too much back when the SCCT demo first came out. But then again, that must be another method of HDR that ATI pushed Ubi into doing; it can't be OpenEXR HDR. Where's Grape when ya need an explanation? I know he explained the AOE3 HDR before.
 

pauldh

Illustrious
Another interesting sidenote: Age of Empires III can use antialiasing with Nvidia cards when HDR is enabled.

So I hear. I guess it's Far Cry, Serious Sam II, and Oblivion that are using the method of HDR that NV can't do AA along with, because it's not supported in their hardware. You can actually query that limitation straight from D3D9; quick sketch below.
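
Same idea as the sketch I posted above (again, my own illustration, not actual game code): you can ask D3D9 directly whether the FP16 surface can be multisampled. GeForce 6/7 cards say no here, which is why those games grey out AA when HDR is on with NV hardware, while the X1800/X1900 say yes:

#include <d3d9.h>

// Can the FP16 HDR surface be multisampled, i.e. AA + HDR together?
bool SupportsAAOnFP16HDR(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,
        FALSE,                     // full-screen, not windowed
        D3DMULTISAMPLE_4_SAMPLES,  // probe for 4x AA
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}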
 

ecosoft

Distinguished
Jun 23, 2004
137
0
18,680
So, SM 3.0 cards "do" HDR with "super trick tweak" codes, while SM 2.0 cards do "lesser" HDR, correct?

I have a mildly OC'd X800GTO with pipes unlocked, and I have HL2 Lost Coast and HL2 Episode One. I "like" the HDR effects... definitely. So, guys, here's my hot 'n burning question: has anyone, anywhere, published a review of SM 2.0 vs. 3.0 HDR? Is the stupidly expensive and inordinately GPU-cycle-gobbling SM 3.0 HDR "worth" the finite differences?