x1800xt or 2 sli 7800 gt?

February 8, 2006 1:53:41 PM

I'm currently in the process of upgrading my computer and am wondering what would be the best setup for the ol' eye candy. This is obviously going to be a gaming machine. Money isn't too much of an issue; I'm looking for performance rather than cost here, although I don't want to get too carried away. This needs to last me for at least 2 years. Should I go with an x1800xt or 2 SLI 7800gts? I noticed that SLI is actually slower in some games, but I'm mainly concerned with high quality settings and great FPS! I'll be running this on a 21" Silicon Graphics monitor, so I won't be running games at an insane 1920x1200 rez. Here's what the computer will consist of:

-AMD 64 mobo (socket 754, same as what I currently have; my current board is AGP and I'll upgrade to a PCI-e board)

-AMD 64 3200+ (I own this already)

-2x512mb (1gb) GeIL pc3200 ram; I also have another stick of Crucial 512mb pc2100 lying around I can add to make my ram 1.5gb (might upgrade to 2x1gb sticks, 2gb total)

-?video card?

-2x Seagate 80gb drives in RAID 0 config (own these already)

Any feedback would be appreciated! Thanks!


February 8, 2006 2:28:56 PM

I noticed the VGA review on Tom's... so it looks like HDR isn't too good on the SLI setup. Hmm. (I meant the x1900 vs. SLI, by the way.)
February 8, 2006 3:05:14 PM

If eye candy is your thing, I'd have to recommend the X1900, based on the fact that the 7800 series has a hardware limitation that prevents it from using antialiasing in games with OpenEXR HDR lighting.

The X1900 is all about eye candy.
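For the curious, that limitation is visible right at the API level: under Direct3D 9 a game can ask whether the adapter supports multisampling on an FP16 surface, the 64-bit half-float format that OpenEXR-style HDR renders to. A minimal sketch, assuming the DirectX 9 SDK; GeForce 6/7-era parts report no MSAA quality levels for this format, while the X1K series does:

```cpp
// Minimal Direct3D 9 probe: can this adapter multisample an FP16
// (A16B16G16R16F) render target? GeForce 6/7-era parts answer "no",
// which is exactly the OpenEXR-HDR + AA limitation discussed above.
// Assumes the DirectX 9 SDK; error handling trimmed for brevity.
#include <d3d9.h>

bool SupportsFp16Msaa(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // 64-bit half-float HDR target format
        TRUE,                      // windowed mode
        D3DMULTISAMPLE_4_SAMPLES,  // ask about 4x MSAA
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}
```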
February 8, 2006 4:07:17 PM

In addition to what everyone else mentions, think about the upgrade path too: 2x GTs is the end of the line, while one X1900 gives you the option to add another similar card later.

Of course it might still be cheaper and wiser to simply sell your old card(s) and buy a new one at that point.

Just two things to consider.

And another vote for the X1900XT, but really it's going to depend a lot on the games you play and the settings you use.

Take a look at a few reviews that pit the 2x GT against an X1900, and then see if it plays what you like, how you like.

But Cleeve's right: even beyond the OpenEXR-HDR+AA issue, ATi's efficient implementation of AA usually lets you run 6xAA for the same performance cost as 4xAA on the GF7, and even at 4x there's usually a performance advantage. That advantage still won't help enough in some games, though, where the 2x GT will beat both the X1900 and the GF7800GTX-512.
February 8, 2006 6:49:36 PM

What's your 3200+ clocked at?

If you're worried about performance, don't even consider running 3x512mb; sell your ram and replace it with a quality 2x1gb kit.

If you're not worried about spending money, the 7800gt is not for you. The x1900xtx/xt/whatever you want is better.
February 8, 2006 6:52:50 PM

Quote:
I'm currently in the process of upgrading my computer...

-AMD 64 mobo (socket 754, same as what I currently have; my current board is AGP and I'll upgrade to a PCI-e board)

[rest of the quoted post snipped]


AGP to PCI-e is a big upgrade. Unless you have a MB with both, you won't be keeping your MB, and if you replace your MB, you might not be keeping your socket 754 either. And if you are looking at CrossFire, you know you have to replace both MB and CPU. Besides, it's a good opportunity to switch to a dual-core on a 939 board. As far as I can tell, the few games I play seem to use both cores (I'm only assuming it's to my benefit, though; no numbers to prove it).
February 8, 2006 7:05:13 PM

That's not really true. Truthfully, it's all about what flavor you want. Considering the only game out right now that uses OpenEXR HDR is Serious Sam II, it's not really that big of a deal. And as far as I know, ATI cards cannot do hardware filtering on floating-point render targets (also not as big of a deal, but a trade-off all the same).
Although with an X1900 you do get the choice to add a second card later.
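That filtering point is queryable the same way, for what it's worth: Direct3D 9 lets an application ask whether a texture format supports hardware (bilinear/trilinear) filtering. A minimal sketch under the same DX9 SDK assumption; X1K-era Radeons generally fail this query for FP16 textures (so shaders have to filter manually), while GeForce 6/7 pass it:

```cpp
// Direct3D 9 probe: does this adapter support hardware filtering when
// sampling FP16 textures? Assumes the DirectX 9 SDK headers.
#include <d3d9.h>

bool SupportsFp16Filtering(IDirect3D9* d3d, D3DFORMAT displayFormat)
{
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        displayFormat,            // current display mode, e.g. D3DFMT_X8R8G8B8
        D3DUSAGE_QUERY_FILTER,    // "can this format be filtered in hardware?"
        D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);    // FP16 texture format
    return SUCCEEDED(hr);
}
```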
February 8, 2006 7:32:19 PM

The future is closer than one thinks when computer parts are changing every few months. I am sure the ATI will play any game in the near future with the 1900. I say this because of the benchmarks comparing the cards; so far the ATI shows better results. And I've been an Nvidia dude since 3dfx went down.
February 8, 2006 8:15:41 PM

It depends on what you mean by wanting this system to last two years. Chances are, no matter what you get today, it won't be able to run the highest-end games at decent settings in 2 years.
February 8, 2006 8:18:24 PM

Then again, if you can wait a few months (I know it's hard ;) ), you should perhaps get a DX10 card. If you buy a DX9 card, you won't be able to play games with the eye candy at max after a year, when the first DX10 games roll out. Think about Crysis, for example.

I've heard ATI and nVidia are both getting their next gen cards out soon. G80 and R600 or something like that. :) 
February 8, 2006 8:32:43 PM

Quote:
Just in case you're forgetting, he wants this card to last a couple of years, so it's future as well as current games that need to be taken into consideration. That alone should exclude the older 7800.

Too bad there isn't anything out right now for that.
February 8, 2006 8:35:53 PM

Quote:
the only game out right now that uses OpenEXR HDR is Serious Sam II, it's not really that big of a deal.


Quite the opposite: pretty much every game doing HDR out there is using OpenEXR nowadays... Far Cry, Age of Empires 3, Serious Sam 2, Splinter Cell. Upcoming games too, like the next Unreal engine, Oblivion, etc.

OpenEXR is becoming more important, quickly. And on a 7800-based card you simply can't use HDR & AA together in any of those new games.

The only notable exception is the Source engine (Half-Life 2, Counter-Strike, Day of Defeat), which uses a proprietary HDR method.
February 8, 2006 8:39:56 PM

I've got the 7800GT and I love it, but in your case you have got to go for the x1900 XT.

Lakedude = not a fanboy
February 8, 2006 8:43:06 PM

Incorrect about Age of Empires III. I use 4X AA and it works correctly with HDR enabled.
February 8, 2006 8:44:25 PM

Just to clarify, the 7800 GT is by far the best price/performance card out there. Great at stock speeds and an awesome overclocker, and for the money nothing will beat two 7800 GTs in SLI.

But if you have an unlimited budget, and eye candy/future-proofing is your thing, the X1900 is the clear choice. It's just that simple.
February 8, 2006 8:45:26 PM

I am correct, actually. AOE3 does use OpenEXR, but they have some kind of proprietary workaround... from what I understand, they revert to a non-EXR method of HDR when AA is used on a 6800/7800 card.

They're the only ones who do that so far though. :( 
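Nobody outside Ensemble has published exactly how AOE3 does it, but the general shape of such a workaround is simple: probe the device capabilities at startup and pick an HDR path accordingly. A hypothetical sketch; the names and the fallback format are illustrative (reusing the FP16 MSAA probe from the earlier snippet), not AOE3's actual code:

```cpp
// Hypothetical HDR path selection, in the spirit of what AOE3 reportedly
// does: use full FP16 (OpenEXR-style) HDR when the card can multisample it,
// otherwise fall back to a lower-precision encoding that MSAA can handle.
#include <d3d9.h>

bool SupportsFp16Msaa(IDirect3D9* d3d);  // the probe sketched earlier

enum HdrPath { HDR_FP16, HDR_INT10_FALLBACK };

HdrPath ChooseHdrPath(IDirect3D9* d3d, bool userWantsAA)
{
    if (!userWantsAA || SupportsFp16Msaa(d3d))
        return HDR_FP16;           // X1K path, or AA disabled: full FP16 HDR
    // e.g. pack HDR into a D3DFMT_A2R10G10B10 target, which a GF7 can multisample
    return HDR_INT10_FALLBACK;     // lesser bit depth, as described above
}
```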
February 8, 2006 8:50:09 PM

7800gts in SLI... might as well spring for an x1900 at that price. My opinion is that SLI is only useful with the highest-end cards; for lower-end cards, a single better card is often the better buy.
February 8, 2006 8:58:35 PM

If that is truly the case, then any game developer that chooses not to do this shouldn't be rewarded for their laziness. Not with my money.
February 8, 2006 9:05:59 PM

GeIL is pretty nice memory, but don't put that pc2100 stick in; that will basically ruin your performance. It'll slow the other sticks to pc2100 speeds, so the 1gb of pc3200 will be much faster than 1.5gb at pc2100. If you do anything, get another 1gb kit of pc3200 (2x512) if you have 4 DIMM slots; if not, another 1gb stick will do the same.
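Rough numbers behind that advice (peak theoretical rates on a single 64-bit channel, as on socket 754; real-world figures will be lower but scale the same way):

$$\text{PC3200 (DDR400)}: 400\ \text{MT/s} \times 8\ \text{B} = 3.2\ \text{GB/s} \qquad \text{PC2100 (DDR266)}: 266\ \text{MT/s} \times 8\ \text{B} \approx 2.1\ \text{GB/s}$$

So letting the PC2100 stick drag the whole pool down costs roughly a third of the peak bandwidth, which the extra 512mb is unlikely to buy back in games.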
February 8, 2006 9:10:01 PM

Well, it's not really their fault the 7800s have this limitation built in; they simply don't have the capability.

By the same token, you could say that no one should buy any games that require DirectX 9 graphics because you can still buy an ancient DirectX 8 card.

But that's silly. If we all adopted that attitude we'd still be using Voodoo 1s.

It's the card's limitation, not the developers'... you can choose to buy a card that will play all the new titles with AA & HDR, or buy a card and boycott all the titles that don't jibe with your hardware... I mean, if you did that, you'd miss out on pretty much every major title coming out from here on...

The funny part of this is, a lot of those titles are sponsored by Nvidia: "The way it's meant to be played"... :) 
February 8, 2006 9:32:31 PM

Well, I'd like you to tell me if you're able to tell the difference between the regular OpenEXR HDR and the AOE3 version. What's the purpose of using something simply because it will hurt people using certain hardware? If there is no perceptible difference, why use it?

I also find it funny, though, that on Nvidia's website they say "True HDR support".
February 8, 2006 9:54:28 PM

Hmm, well, I may not have as much money to spend on a computer as I thought. So I was looking at the x1600xt, and it looks like a very good card, and for the price not horrible at all. It doesn't look like I'd be sacrificing too much going this route.
February 8, 2006 10:19:39 PM

The X1600 isn't supposed to be a very good card from what I've heard, unless your budget is that much smaller.
February 8, 2006 10:42:30 PM

I've looked at screenies carefully and I can't see anything myself, but supposedly it has to be done at a lesser bit depth, since it isn't true OpenEXR.

Remember, Nvidia does have "true HDR support"; they just don't support antialiasing with it...

I find it as dubious as you do, because OpenEXR support is one of the main Shader Model 3 banners anti-ATi enthusiasts have been waving for a long time to diss the X800 series. But if they can make OpenEXR-like HDR without using the OpenEXR method (like they can in Half-Life 2, which looks gorgeous if you've seen the Lost Coast level), why don't they?

I've asked around a lot about this very question, but no one has any concrete answers, except that hardware reviews never cease to mention the 6800/7800's limitation of no AA & OpenEXR at the same time, and that on all the major upcoming titles you can't use them together if you have Nvidia hardware.

The whole issue is strange to me, but I've never dug to the bottom of it.
This conversation has reminded me that it bugs me... I think I'm going to dig a bit more.
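For context on what "OpenEXR-like HDR without the OpenEXR method" can mean in practice: the usual trick is to pack a high-range color into an ordinary 8-bit integer render target, which MSAA hardware handles fine. A hypothetical C++ sketch of one well-known packing (RGB plus a shared brightness scale in alpha); this is illustrative only, not Valve's or Ensemble's actual code:

```cpp
// Illustrative "RGBM"-style packing: store an HDR color in a plain 8-bit
// RGBA target by keeping a per-pixel brightness scale in the alpha channel.
// All four stored values stay in [0,1], like any integer render target.
// Colors brighter than maxRange clip; that loss is the price of the trick.
#include <algorithm>

struct Rgbm { float r, g, b, m; };   // values as an 8-bit target would hold them

Rgbm EncodeRgbm(float r, float g, float b, float maxRange = 6.0f)
{
    float m = std::max(std::max(r, g), b) / maxRange;   // shared scale
    m = std::min(std::max(m, 1.0f / 255.0f), 1.0f);     // clamp to storable range
    float s = 1.0f / (m * maxRange);
    return { r * s, g * s, b * s, m };
}

void DecodeRgbm(const Rgbm& p, float maxRange, float out[3])
{
    out[0] = p.r * p.m * maxRange;   // undo the shared scale
    out[1] = p.g * p.m * maxRange;
    out[2] = p.b * p.m * maxRange;
}
```

The cost is precision and maximum range, which is presumably the "lesser bit depth" difference people are hunting for in screenshots.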
February 8, 2006 10:54:08 PM

Quote:
I'm not too sure about the GT beating the 1900; the benchmarks show ATI the better. I'm an NV guy, but I was very impressed with the results.
http://www.tomshardware.com/2006/01/24/ati_radeon_x1900_heats_up_with_48_pixel_pipelines/


LOL!

I am sure. Why? Because that's the way it is. Re-read what I said: I didn't say the GT would win more, just check the settings and the games/apps, since BOTH have their strengths, and there are cases where the SLI GT does have strength, like I said;

Some clear GT wins....

As a SINGLE GT (no need even to enable SLI):
http://www.firingsquad.com/hardware/ati_radeon_x1900_xt_preview/page12.asp

Then, depending on settings, some are better than others (other reviews show different results as well):
http://techreport.com/reviews/2006q1/radeon-x1900/index.x?pg=6

I've only been able to find single-card (GT versus XT) Riddick reviews (Digit-Life still doesn't have their new Digest yet), but they aren't very flattering of the X1900 either, even on a 1:1 basis, until you crank things all the way up, where, like I said, the X1900 does have an advantage.
February 8, 2006 10:59:59 PM

Quote:

I've heard ATI and nVidia are both getting their next gen cards out soon. G80 and R600 or something like that. :) 


You heard wrong. Right cards, but nowhere near 'soon'. :roll:
February 8, 2006 11:09:03 PM

Quote:
Well, I'd like you to tell me if you're able to tell the difference between the regular OpenEXR HDR and the AOE3 version. What's the purpose of using something simply because it will hurt people using certain hardware? If there is no perceptible difference, why use it?


Because it takes A LOT of time/money to create these workarounds.
Valve did it with HL2, and Bethesda has supposedly also done it for Oblivion (we won't know for sure 'til it ships in March).

While you say there is no difference between the two, is that not also the case with HL2, which supports HDR on ATi's PS2.0+ cards? Ubisoft finally added HDR and specular lighting support to their SM2.0 path long after launch (when it was SM3.0/1.1 only), but that's because there was a 'demand' for it, or the time/effort was felt to be justified. Adding OpenEXR + AA support to Far Cry came later too, just like so many other features. The problem is that most publishers will not go to the lengths of a Crytek or a Valve in order to code for these exceptions. Some things are easier (like adding geometric instancing support for the VS2.0+ on the Radeons), and others are harder (adding separate HDR support for different vendors). It's just a question of worth: while they'll do the easy workarounds for quite some time, the harder ones will likely be ignored once there are enough new cards that support the normal method to replace the old ones. It's all about economics, son. :wink:

Quote:
I also find it funny, though, that on Nvidia's website they say "True HDR support".


Well, that replaced their "Only card to support HDR" banner, which was equally hilarious. :roll:
February 9, 2006 1:46:43 PM

Quote:

I've heard ATI and nVidia are both getting their next gen cards out soon. G80 and R600 or something like that. :) 


You heard wrong. Right cards, but nowhere near 'soon'. :roll:

Soon, as in a few months. Before summer, right?

Then again, there's always something new coming 'just around the corner', so you might just as well buy a good card now and not worry about it. :D 
February 9, 2006 9:47:27 PM

I understand that it would be more expensive to create a workaround; however, if it's already been done and these different methods are public knowledge, then how much more expensive could it honestly be?
February 10, 2006 12:27:46 AM

Quote:

Soon, as in a few months. Before summer, right?


Nope; maybe if we're very, VERY good, Santa will bring them for Xmas. ATi and nV still need to leverage and profit off of their current lineup. They can't release new generations as quickly as they released the X1900 and GF7900; they'd never recoup their R&D costs.

Quote:
Then again, there's always something new coming 'just around the corner' so you might just as well buy a good card now and not worry about it. :D 


Exactly. If you're smart like Cleeve, you can then sell your card for almost as much as the new one and upgrade to a much better card/hardware for $50-100 (or less, if you're Cleeve 8) ).
February 10, 2006 12:33:30 AM

Quote:
I understand that it would be more expensive to create a work around, however if it's already been done and these different methods are public knowledge,


Is it public knowledge? I was wracking my brain last night trying to figure out the limitation, and I still can't narrow it down past two possible methods. If you can show the published method that AOE3 uses, that would be great. Because the thing that doesn't make sense is: why does SSAA work but not MSAA, if SSAA is more memory-intensive? What's the limitation? The algorithm?

Quote:
then how much more expensive could it honestly be?


Time to code; everything is an added expense. The two ways I figure AOE3 could make this work involve either a lot of extra work (specialized AA calls done on system resources) or just some extra work (changing the output results prior to AA processing). Both require additional coding and thus additional man-hours. That something is known doesn't mean it's something everyone wants to optimize for, if it requires a lot more man-hours to add.

If you have the answer to exactly how they are doing it, that would be an interesting read, because even nV hasn't fully come out and exposed their weakness by explaining why it doesn't work on the G7x but does on the X1xxx.
February 10, 2006 12:43:25 AM

SSAA renders the entire image at a higher resolution and then scales it back down for display (unlike MSAA, which only takes extra samples at polygon edges). So say you run a game at 1280x1024 with 4xSSAA: your game will be rendered at 2560x2048 and resized back down to your 1280x1024 resolution.
Edit: Now that I've thought about that, supersampling would make perfect sense as the reason why Age of Empires III can use HDR with antialiasing.
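To make that concrete: supersampling never touches the MSAA resolve hardware at all; the game simply renders to a larger, plain FP16 surface and averages it down with ordinary texture reads, which is why it sidesteps the FP16-MSAA restriction. A simplified CPU-side sketch of the 4x (2x2) box-filter resolve; a real game would do this in a pixel shader, but the arithmetic is the same:

```cpp
// Simplified 2x2 box-filter downsample: the "resolve" step of 4x
// supersampling. src is (2*dstW) x (2*dstH); dst is dstW x dstH.
// Plain floats stand in for the FP16 values of an HDR surface.
#include <cstddef>

void ResolveSupersampled(const float* src, float* dst,
                         std::size_t dstW, std::size_t dstH, std::size_t ch)
{
    const std::size_t srcW = dstW * 2;
    for (std::size_t y = 0; y < dstH; ++y)
        for (std::size_t x = 0; x < dstW; ++x)
            for (std::size_t c = 0; c < ch; ++c) {
                // the 2x2 block of source samples behind output pixel (x, y)
                const float* p = src + ((2 * y) * srcW + 2 * x) * ch + c;
                dst[(y * dstW + x) * ch + c] =
                    0.25f * (p[0]                      // top-left
                           + p[ch]                     // top-right
                           + p[srcW * ch]              // bottom-left
                           + p[(srcW + 1) * ch]);      // bottom-right
            }
}
```

So a 1280x1024 frame at 4xSSAA really is rendered at 2560x2048 and collapsed back down, four shaded samples per output pixel, with no multisampled FP16 buffer ever created.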
February 10, 2006 12:49:44 AM

Well, I ended up splurging on this computer. I bought:

Asus A8N5X NF4 939 socket board
AMD 64 3700+ San Diego Core (1mb L2 cache)
2x1gb Corsair (2 gb kit) XMS pc 3200 ram
x1900xt 512 mb video card
Aspire 500w PS
Cooler Master Centurion Case

I can't wait to get this! Thanks for all the feedback.
February 10, 2006 12:50:32 AM

Good pick. A friend of mine has two X1900 XTXs and he loves them.
February 10, 2006 12:57:42 AM

Two? Wow, that is spendy! I thought about doing that, but that's way too hardcore for me. I'm just beside myself that I'll be ready for Oblivion now!
February 10, 2006 1:24:56 AM

Quote:
SSAA renders the entire image at a higher resolution and then scales it back down for display (unlike MSAA, which only takes extra samples at polygon edges). So say you run a game at 1280x1024 with 4xSSAA: your game will be rendered at 2560x2048 and resized back down to your 1280x1024 resolution.
Edit: Now that I've thought about that, supersampling would make perfect sense as the reason why Age of Empires III can use HDR with antialiasing.


That doesn't make sense as to why it would work. If the limitation is FP16 x AA = buffer issues, then how would SSAA solve that by increasing the size? Remember that you're not dealing with an integer value in the pipeline itself; it's still trying to either apply AA to the FP value or add HDR FP calculations to the increased AA load. So why SSAA and not MSAA? Since the ATi can do MSAA, it has to be the algorithm itself. ATi does have some memory-management and buffer-sharing advantages, but it still doesn't add up outside of that methodology.

So still I ask: glitch or management? Because outside of whatever AOE3 does to assist, it doesn't work. Remember, SSAA isn't a problem for NORMAL OpenEXR + AA; it's the MSAA that is broken, and somehow fixed in AOE3.
February 10, 2006 1:26:38 AM

Quote:
2? wow that is spendy! I thought about doing that but thats way too hardcore for me! I'm just beside myself that i'll be ready for oblivion now!


No one's ready for Oblivion, IMO. I think it's going to be beyond current hardware for a while, kinda like it was with Morrowind. Sure, you can turn most stuff on, but not up.

We'll see; I could be completely wrong, but not if you read everything the devs are saying.
February 10, 2006 1:27:51 AM

I believe you that it will be heavy on the hardware side for good graphics. I wouldn't be surprised at all.
February 10, 2006 1:31:27 AM

Yeah, and I'm also wondering about all the AI, physics, and other things that are going to hammer the CPU. It should be truly CRUEL to watch this game 'enable all features on high' on even the most OC'ed rig. There won't be the benefit of a PhysX card either.

That is why I'm waiting 'til around March 20th to pick my parts for the next rig.

Just hope it's not too ridiculous for enabling most of the features.
February 10, 2006 8:10:01 AM

Quote:

Soon, as in a few months. Before summer, right?


Nope; maybe if we're very, VERY good, Santa will bring them for Xmas. ATi and nV still need to leverage and profit off of their current lineup. They can't release new generations as quickly as they released the X1900 and GF7900; they'd never recoup their R&D costs.

What you're saying is right, but last time I checked (as in, 5 minutes ago :p ), nVidia might be releasing them in early Q3 or even in April! Then again, these are rumors, so you never know ;) 

But wouldn't it be quite profitable for nVidia to release their DX10 cards HALF a YEAR before ATI? The new cards would be the choice for anyone who wants a top-of-the-line card. It could be a smart move 8)
February 10, 2006 8:34:43 AM

Quote:
GeIL is pretty nice memory, but don't put that pc2100 stick in; that will basically ruin your performance. It'll slow the other sticks to pc2100 speeds, so the 1gb of pc3200 will be much faster than 1.5gb at pc2100. If you do anything, get another 1gb kit of pc3200 (2x512) if you have 4 DIMM slots; if not, another 1gb stick will do the same.


I'm glad you said that, because it seems like no one read that far down in the first post. Also, using 4x512 you might have to use a 2T command rate, which will drop performance a little bit, though not much. If money is an issue at some point in your upgrade, then use the 512 sticks you have and add to them, the ideal option being 2x1024 for 2GB, though.
February 10, 2006 9:38:53 AM

If he uses 4x512MB, he will likely have to run at DDR333 speeds too, because of the on-die memory controller's limitations.
February 10, 2006 4:32:21 PM

Quote:
What you're saying is right, but last time I checked (as in, 5 minutes ago :p ), nVidia might be releasing them in early Q3 or even in April! Then again, these are rumors, so you never know ;) 


I'd like to see where those rumours are coming from, because I doubt them very much.

Quote:
But wouldn't it be quite profitable for nVidia to release their DX10 cards HALF a YEAR before ATI? The new cards would be the choice for anyone, who wants a top of the line card. It could be a smart move 8)


Only if it's for the mid-range, because bringing something high-end out for summer would be ridiculous and would cut into sales of the G71, which would only get 2-3 months to recoup its R&D costs (pretty much $100 million per high-end card). This being the last generation of the G7, they'd likely want to keep it in the market for quite some time.

Probably what you heard were people who don't know what they're talking about assuming that the G71 will be a DX10-capable card. It's scheduled to release in March/April, but it won't be SM4.0/DX10 featured.
February 10, 2006 5:14:24 PM

Quote:
I'd like to see where those rumours are coming from, because I doubt them very much.

Oh sure, I'll link you to a few sites ;) 

http://www.vr-zone.com/?i=3177 (Fri, Feb 10, 2006)
Quote:
We heard that G80 will be in time for launch in June --

Quote:
As for ATi, the next generation R600 is slated for launch end of this year according to the roadmap we have seen --


http://www.techspot.com/news/20331-details-on-nvidia-g8... (Wed 01 Feb 2006)
Quote:
According to some rumors, though mostly word of mouth, nVidia's newer line of GPUs in the G80 series are set to be around this April.


http://www.mikeshardware.co.uk/RoadmapQ306.htm
Quote:
nVidia G80 GPU is expected to be released in H2 (possibly as early as June)


Now, I do realize these are all rumors. We can't know for sure when the G80 will be released until we hear from nVidia. And even when we do get a 'release date' from nVidia, we STILL can't be sure the cards will be available then. :p 

As for the reliability of these sites... well, it's stated on a lot of sites, but they're usually just copying each other. We'll see when one of us actually gets one of these cards ;) 

But you do agree that having a card that's faster and more future-proof (DX10) than any ATI card could really win nVidia some customers, since it seems ATI won't be releasing the R600 for a long time. Maybe they'll make it this year, but who knows :p 

And yeah, I AM an optimist :) 
February 10, 2006 5:44:51 PM

We still know nothing. :( 
February 10, 2006 5:48:08 PM

If you feel like messing with ATI drivers, go with ATI. Nvidia is much better driver- and performance-wise.
February 10, 2006 6:46:24 PM

All of my experience with ATI drivers has been decent, and recently they have gotten a lot better.
February 10, 2006 6:46:40 PM

I've had 3 ATI cards and none have had any problems with drivers. The 9600 overheated because it was a pile of trash, but that wasn't driver-related. If you don't want to "mess" with ATI drivers, then don't download Catalyst and you won't have to do anything.
February 10, 2006 7:16:02 PM

Quote:
If you feel like messing with ATI drivers, go with ATI. Nvidia is much better driver- and performance-wise.


This is pure FUD. You post to complain and offer absolutely no examples of your problems.

The X1900 benches I've seen from every review site show your claims of ATI performance inferiority to be BS.

I've owned enough Nvidia and ATI cards over the past 10 years to know the drivers are equally good.

So please, justify your fanboy claim. Or are you just spreading $hite?