Crossfire question

Surray

Hi there!

I gotta upgrade my graphics card for the upcoming Elder Scrolls 4 & co, so I'm trying to find out what's the best thing to get right now (for someone without unlimited money).

What I was planning was a Crossfire mobo and one X1800 XT 512 now (which I found for $400), and then in a year or two get an X1800 XT Crossfire Master card and use those for a while.
But now it seems the X1800 XTs are getting phased out, and I'm afraid that in a year I won't be able to get an X1800 XT Crossfire card anymore, only X1900s.

So my question is: can I use an X1900 XT Crossfire Master with an X1800 XT? I read somewhere that your second card has to be as fast as or faster than the Crossfire Master :(
 

Vokofpolisiekar

Unfortunately, I have no knowledge of the new Elder Scrolls' performance requirements - but if the videos are anything to go by, you'll need quite a hefty graphics card.

I'd personally recommend getting an X1900 XT - it's made for very intensive graphics, and HDR can be added into that equation - all features of Elder Scrolls: Oblivion and of future games this year, bar DX10 games.

Furthermore, ATi plans to release the X1900 256MB, which will compete directly with the 7800 GTX 256MB. So prices will be falling, and if you wait a month or so you can get the X1900 256MB or an XT, and later the Crossfire card if one becomes available for the X1900 256MB - or, if you're going to purchase an X1900 XT, just get the X1900 Master card that's available now.

If you'd like to see the difference between the X1800 and X1900, just look at Tom's review of the new X1900 series - note games like Black and White 2 and AOE3. The increase in performance of the X1900 over the X1800 cards is quite impressive, and it's a clear indication of the power at hand on any X1900 card :twisted: :twisted:

Edit - OOPS, I never answered your main question. I really don't know whether the X1800 will run with the X1900 in Crossfire mode - my guess is no. The X1900 uses 48 pixel processors, whereas the X1800 uses 16 (I think) - so even if they could hypothetically run together, the X1800 would just slow down the X1900 - ONCE AGAIN, THIS IS MY GUESS. I'll probably be proven wrong by the next post... :oops:
 
I agree with Vokofpolisiekar, the X1900 will be a much better option. It's the best card out there at this time. In a year, the X1800 will become mid-range at best, so there won't be a reason to have two of them.

I think 'at best' is a little strong. I think it will easily be mid-range within only a year, but I don't expect it to be number one, and I expect there to be a second card in line from both companies to push the X1800/X1900 to the top of the middle, but not further.

And with unlimited money (as you said)...

Umm... that's "without"....
A little different; he's talking about bang/buck.

...you can get an X1900 XT that will beat dual X1800 XT cards. IMO, get an X1900 now, and don't Crossfire later. If you want dual cards, get them NOW; do not wait, or you'll just waste your money.

I disagree; for the budget-minded, the X1900 XT (like a Connect3D version) makes the most sense. Or perhaps an X1800 XT if you can't afford the difference, but the shader power will help games like Oblivion (ES IV), based on the features they are touting.
I would avoid the plain X1900 256MB if you are ever considering Crossfire, as it will make your Crossfire setup a 256MB-limited system. I wouldn't be surprised if, more often than not, two Crossfired X1800 XT 512s outperformed two Crossfired X1900 256s, since at the settings these cards excel at they are going to start taking up memory real estate rather quickly, especially for a game like Oblivion.

I do believe that you can pair an X1900 Master with an X1800.

I do not think you could Crossfire the two successfully. The most important part of ATi's implementation of Crossfire versus SLI is the number of pipelines, not the clock speeds, etc. While a faster card will Crossfire with a slower one, I doubt that two cards with such different architectures would Crossfire successfully without some work-around, or without the X1900 in some way mimicking the smaller number of pixel processors on the X1800.

Very unlikely IMO, but not unthinkable.