ATI X850XT Crossfire vs. X1800XT

July 8, 2006 5:10:06 AM

Hello, I was wondering which would be better: an X850XT in Crossfire, or an X1800XL?
July 8, 2006 1:37:28 PM

I wouldn't doubt that the X850XT Crossfire setup could at times match the X1800XT in performance, but IMO there is a clear winner. The X1800XT has a feature-set advantage that dual X850XTs just can't match. And while I think the "buy a 6600GT over an X800GTO, or a 6800 over an X800XL, because they have SM3.0" argument is just plain stupid (going with slower cards just for features they are too weak to use), by the time you're talking dual X850XTs vs. an X1800XT, you just plain want an SM3.0 card, as you have the power to get something out of it. Dual X850XTs could rock at Oblivion, but they will never let you turn HDR on in Oblivion, which looks great and is the way you want to play with $250+ worth of GPU(s).

Anyway, which is better? The X1800XT for sure, IMO, whether it takes the performance crown or not. I'd love to play around with Crossfire, seeing how X850XT master cards are so cheap from ATI right now, but it's not the better choice of the two. If you are in the USA looking to buy, get a $325 X1900XT, or if that's too much, a sub-$250 256MB X1800XT or a 7900GT.
July 8, 2006 2:16:18 PM

I'd go for the X1800XT. It will give you way better performance, and support for Pixel Shader 3.0 is going to be more than helpful in newer games.

Now, if you meant the X1800XL, I think X850XT Crossfire might perform slightly better at high resolutions, but I would avoid it because of its limitation to PS2.0, which might be a problem in the near future.

My 2 cents.
July 8, 2006 2:28:53 PM

Dude, the Inq?? WTF are you smokin'? :lol:  :tongue:

I would go for the X1800XT, also. Better feature set, and the card is just GREAT.
July 8, 2006 2:37:38 PM

:oops: 
July 8, 2006 4:21:01 PM

Quote:
Dude, the Inq?? WTF are you smokin'? :lol:  :tongue:

I would go for the X1800XT, also. Better feature set, and the card is just GREAT.


What have you all got against the Inquirer? :?: :)  No offence, prozac26. :wink:

In this case, the link is perfect for the question above. Great link, pauldh!
July 8, 2006 5:19:46 PM

Quote:
What have you all got against the Inquirer? :?: :) 

They suck, they're fanboy stupid idiots, how can't write properly. Although Paul's link was pretty good.
July 8, 2006 6:02:55 PM

Prozac26, it's a point of view I respect. I just don't know why you say they're a bunch of fanboys? I know I'm off topic, but could you answer?

I never felt they were rough with anybody who didn't deserve it, but I might be wrong.

And by the way, even though I'm French-speaking (I'm from Montreal, Canada), it's: who can't write properly. :wink: :twisted:

No offence I hope! :) 
July 8, 2006 6:07:38 PM

They're stupid AMD fanboys; they make up stupid stories like "AMD has this and it's much better than Intel." If you read the CPU section daily, you'd know.
July 8, 2006 6:28:44 PM

Quote:
Dual X850XTs could rock at Oblivion, but they will never let you turn HDR on in Oblivion, which looks great and is the way you want to play with $250+ worth of GPU(s).


Well, if Oblivion requires SM 3.0 to run HDR, then the people who designed that game are idiots. HDR runs fine on 2.0.

Although I concur. Get the X1800 over the dual X850s!
July 8, 2006 6:29:43 PM

I'll leave it like this, so there's no need to answer.

But lately they were pretty much singing Intel's praises over the Core architecture, not the opposite, so it comes back to what I was saying: they blast those who deserve it.

In case you think I'm a fanboy myself, I presently use a P4 3.0 GHz/800 Northwood, but I would have gone with an A64 if I had known I was going to game so much.

Respect! :wink:
July 8, 2006 6:32:01 PM

Quote:
Dual X850XTs could rock at Oblivion, but they will never let you turn HDR on in Oblivion, which looks great and is the way you want to play with $250+ worth of GPU(s).


Well, if Oblivion requires SM 3.0 to run HDR, then the people who designed that game are idiots. HDR runs fine on 2.0.

Although I concur. Get the X1800 over the dual X850s!

Unless I'm wrong, he's right, because I remember using HDR in the HL2 demo six or so months ago. I know I turned it on in the display settings, so I think it was on. It looked amazing, at least.

Can anybody confirm?
July 8, 2006 6:39:07 PM

Yeah, the X1800XT is better... and then down the road, when DX10 cards come out and it drops in price, you can Crossfire it and be pleased that you didn't Crossfire the X850XT. And I was wondering: how much does the X850XT Crossfire Edition cost? Or are all X850s Crossfire compatible?
July 8, 2006 6:54:17 PM

Quote:

Unless I'm wrong, he's right, because I remember using HDR in the HL2 demo six or so months ago. I know I turned it on in the display settings, so I think it was on. It looked amazing, at least.

Can anybody confirm?

Actually, they released HL2: Episode One, which is designed completely with HDR... Just played it yesterday on my SM 2.0 X800XL card. I don't see what designing HDR for an SM 3.0 card accomplishes... We were actually having a discussion about HDR in Episode One here.
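
For what it's worth, whichever path a game uses, the final step is the same: tone mapping, i.e. squeezing the over-bright scene values down into the 0..1 range a monitor can actually display. Here's a minimal sketch in C++ of a generic Reinhard-style operator with an exposure term; purely illustrative, not Valve's actual code:

Code:
// Tone mapping: compress HDR scene values (which can exceed 1.0)
// into the displayable 0..1 range. Generic Reinhard-style operator
// with an exposure term -- illustrative only, not Valve's code.
#include <cstdio>

static float tonemap(float hdr, float exposure)
{
    float v = hdr * exposure;   // eye/camera adjustment ("exposure")
    return v / (1.0f + v);      // Reinhard: maps [0, inf) into [0, 1)
}

int main()
{
    float bright = 8.0f, dim = 0.2f;  // e.g. sunlight vs. shadow
    std::printf("bright -> %.3f, dim -> %.3f\n",
                tonemap(bright, 1.0f), tonemap(dim, 1.0f));
    return 0;
}

The exposure term is what gives you the eye-adjustment effect people notice in Lost Coast and Episode One.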

Quote:
Or are all X850s Crossfire compatible?

Yes; you can use anything down to the X800 in a Crossfire setup.
July 8, 2006 7:06:39 PM

Also, that X1800XT had a lower core speed, so in reality it would perform even better.
July 8, 2006 7:17:22 PM

Quote:
Dual X850XTs could rock at Oblivion, but they will never let you turn HDR on in Oblivion, which looks great and is the way you want to play with $250+ worth of GPU(s).


Well, if Oblivion requires SM 3.0 to run HDR, then the people who designed that game are idiots. HDR runs fine on 2.0.

Although I concur. Get the X1800 over the dual X850s!

There are different ways of implementing HDR. The method used in HL2: Lost Coast and Episode One is doable on any DX9 card. The method used in Oblivion is OpenEXR (FP16) HDR, and it is only possible on SM3.0 cards. Other games I believe use this method of HDR are Far Cry, Serious Sam II, Age of Empires 3, and Splinter Cell: Chaos Theory. So in those games, you need an SM3.0 card to run HDR. And to be honest, you need a fairly beefy SM3.0 card, or the framerates/sacrifices just aren't good enough to bother.
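
To make that concrete: a DX9 game can simply ask the driver which path the hardware can handle. A hedged sketch in C++ against the DirectX 9 SDK; the function name chooseHdrPath and the decision logic are my own illustration, not any shipping game's code:

Code:
// Decide which HDR path to offer on this hardware. OpenEXR-style HDR
// needs FP16 (D3DFMT_A16B16G16R16F) render targets with post-pixel-shader
// blending, which in practice means an SM3.0-class card; the HL2-style
// path only needs plain SM2.0. Link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

const char* chooseHdrPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return "no HDR";

    // Can we render to an FP16 texture and blend into it after the shader?
    HRESULT fp16 = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    if (SUCCEEDED(fp16) && caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return "FP16 (OpenEXR-style) HDR";   // the Oblivion-style path
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "SM2.0 integer HDR";          // the HL2-style path
    return "no HDR";
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    std::printf("HDR path: %s\n", chooseHdrPath(d3d));
    d3d->Release();
    return 0;
}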
July 8, 2006 7:44:18 PM

Another interesting side note: Age of Empires III can use antialiasing with Nvidia cards when HDR is enabled.
July 8, 2006 7:51:40 PM

Ah, I see. I actually own Splinter Cell: Chaos Theory, and I believe it's one of the most graphically jaw-dropping games around. And it actually does support HDR on SM 2.0. Although, in Chaos Theory, it doesn't seem to work very well... it's sort of glitchy.

Sometimes I'll walk into a well-lit room and see no eye adjustment or lighting change from HDR whatsoever. And I think it responds to which way your character (Sam) is looking... not to which way the camera is looking, which can be completely different at times. They should wait to develop games for SM 3.0 until cards can truly handle it at good framerates.

P.S. @ Heyyou27: In HL2 with HDR enabled, you can run anti-aliasing on any card. They shouldn't make something like that exclusive to Nvidia in any game...
July 8, 2006 8:08:05 PM

Quote:
Ah, I see. I actually own Splinter Cell: Chaos Theory, and I believe it's one of the most graphically jaw-dropping games around. And it actually does support HDR on SM 2.0. Although, in Chaos Theory, it doesn't seem to work very well... it's sort of glitchy.

Yeah, you got me on that one. I forgot that there was supposedly HDR with the patch, via an SM2.0 path. It's even right in the advanced video menu, not command-line activated like Far Cry's HDR. Right? Back when I played it, that wasn't the case. I found HDR hammered my 6800U way too much back when the SCCT demo first came out. But then again, that must be another method of HDR that ATI pushed Ubi into doing; it can't be OpenEXR HDR. Where's Grape when you need an explanation? I know he explained the AOE3 HDR before.
July 8, 2006 8:22:55 PM

Quote:
Another interesting side note: Age of Empires III can use antialiasing with Nvidia cards when HDR is enabled.


So I hear. I guess it's Far Cry, Serious Sam II, and Oblivion that use the method of HDR that NV can't do AA alongside, because it's not supported in their hardware.
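
That hardware gap is something a DX9 app can query directly, for what it's worth. A rough C++ sketch (canMsaaFp16 is my own name for it): on GeForce 6/7-era hardware the FP16 multisample check reportedly fails, while X1000-series Radeons pass it.

Code:
// Ask the driver whether it can multisample an FP16 surface -- the
// combination "AA + OpenEXR-style HDR" needs exactly this. Illustrative
// only; link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

bool canMsaaFp16(IDirect3D9* d3d)
{
    DWORD quality = 0;
    // Can the HAL device do 4x MSAA on an FP16 render surface?
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &quality);
    return SUCCEEDED(hr);
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    std::printf("4x AA on an FP16 HDR target: %s\n",
                canMsaaFp16(d3d) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}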
July 8, 2006 9:10:03 PM

u stupid bitches go s lick ut momas pusssy
July 8, 2006 9:41:45 PM

So, SM 3.0 cards do the "super trick tweak" HDR, while SM 2.0 cards do "lesser" HDR, correct?

I have a mildly OC'd X800GTO with pipes unlocked, and I have HL2: Lost Coast and HL2: Episode One. I "like" the HDR effects... definitely. So, guys, here's my hot 'n' burning question: has anyone, anywhere, published a review of SM 2.0 vs. 3.0 HDR? Is the stupidly expensive and inordinately GPU-cycle-gobbling SM 3.0 HDR "worth" the fine differences?
July 8, 2006 10:47:48 PM

Quote:
u stupid bitches go s lick ut momas pusssy
I see we have another one with a bitch complex.
July 10, 2006 4:38:03 PM

Thanks, pauldh, for the explanations. Is there any difference in rendering quality between the PS2.0-compatible version of HDR and the PS3.0 version?

My Radeon 9800 Pro did a good job of rendering it at 1024x768 at least, and it looked great. I can't compare it to the one in Oblivion, since my VPU can't handle it.
July 10, 2006 4:43:21 PM

Quote:
u stupid bitches go s lick ut momas pusssy


Hey asshole, shut up if you don't have anything relevant to say.

It's not ali's problem if you like licking your mom's pussy, so shut up!

Respect is the beginning of understanding!