New system, need card for 1600x1200

trancemitr

Distinguished
Mar 26, 2006
10
0
18,510
I plan on building a new system to replace my aging one (AMD 2800+ with ATI 9600 and 1GB RAM). A little while back I bought a Samsung 204b to be used with the upcoming system. I love the monitor, but I now realize it's going to be much more difficult to run all the goodies at 1600x1200. I plan on getting a X2 4800 (which I don't plan on overclocking), 2 GB RAM (not sure which yet), and the case and hard drives will come from the old system. That leaves motherboard and video card. I generally keep my systems for quite some time, probably a couple of years. Hopefully I wouldn't have to upgrade the video too soon.

So now that I'll be running 1600x1200 I'm looking at the high end cards. I'll be playing HL2, Chronicles of Riddick (both of which I got for Christmas but haven't played yet) and then probably FEAR and Oblivion. What exactly do I need?

I've been thinking of a couple of options. One would be to save some money on the motherboard and get an NF4 Ultra or similar and a 1900xt(x) or 7900GTX. If need be, I could just upgrade the one card later. The other would be to go for an SLI/CF board and one video card, with the upgrade path of just buying another card later. Of course, if I wait too long it may be difficult to find the right matching card; I'm especially thinking of the CF edition 1900, which looks pretty spendy. I'm guessing the CF 1900 would cost more than a second 7900GTX down the road, but I may be mistaken.

I found a combo deal on the X2 4800 at Newegg. You get $50 off an OEM PowerColor 1900xt or a retail ATI 1900xtx. Even though the ATI also has a $30 MIR, it probably isn't as good a deal, since you can get a retail PowerColor xtx (with lifetime warranty) for $510 after MIR.

What are your thoughts? If I go for what would currently be the cheap (relatively speaking) route, would the 1900xt do alright for me? Would the 7900GTX be a better performer for the games I'm looking at, and possibly a cheaper upgrade if I went the SLI route down the road?

Sorry for the long post. Any thoughts on these issues would be greatly appreciated.
 

raven_87

Distinguished
Dec 29, 2005
1,756
0
19,780
My suggestion: don't cheap out. X1900XT if you're looking for 1600x1200. Don't be afraid to drop some $$, because it's definitely gonna cost you.

And unless you have an insane amount of cash, don't bother with a multi-GPU setup. Pocket the coin you would have spent on the second card and put it towards a next-generation GPU.
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
I know you're going to be playing some high-end games, but if you plan on doing anything more with your PC, I highly suggest a dual-core CPU. At least get one with a 2.4ghz clock (assuming that you're sticking with AMD). The multi-core CPU won't help much with your games (though Quake 4 has a multi-core option if you update to the new beta patch), but everything else will. I'm a solid gamer (I can't even multitask IRL!), so single-core CPU's are best for me.

If you can, wait about 3-4 months when the new GPU's are released. Right now, ATi's X1900 cards seem to perform better (though slightly -- depending on the reports you read) than nVidia's 7900 cards in the SM3 tests. Everything else might favor nVidia, but barely. But the X1900 cards run very hot and loud, from what I hear.

As a comparison, I can run all my games at 1920x1200 with very high (40+) framerates on my 7800GTX 256MB card (AMD 3500+ @ stock 2.2GHz and 2GB Corsair XMS w/ a Creative X-Fi sound card) -- except for F.E.A.R. and Oblivion. I have to run those games at 1680x1050, though I can run them at 1920x1200 with a lot of effects/draws turned down. FSAA does cripple some games like Riddick, but HL2 sings even with 4xFSAA and 16xAF.

With a high resolution, you can settle for a lower FSAA sampling rate -- or none at all. In Quake 4, I use no FSAA and hardly see any jaggies, maybe less than 10% of the game. Maybe.

Oblivion won't let you use HDR and FSAA together on the PC (the 360 version will, but it only runs at 720p). I've been playing it at 1680x1050, and though the interpolation is a tad irritating, it's still a gorgeous game with HDR enabled. My framerates rarely dip below 30fps, even in the wilderness.

Enough babbling.

Your choice of CPU is excellent, IMO. But only go dual-core if you're going to do a lot more than play games. You can save a couple of hundred bucks by settling on the single-core version (I forget which number -- 3800+?). DEFINITELY get 2GB of RAM (F.E.A.R. will love you for it) and a sound card if you can (the X-Fi Xtreme Music runs pretty cheap for its power).

BUT BUT BUT... one more caveat. Very soon, AMD will be releasing their new socket design which is compatible with DDR2 RAM (I think even the 800mhz variety). If, as you say, you don't want to change out your whole PC for the next 2-5 years, then wait until the new mobos are out -- and then wait a month or two to work out the bugs. The new socket design is universal I think, so you'll be able to swap CPU's as you please. And even if it's not, you'll definitely want to use DDR2 RAM. It's not that far away, and by that time the new GPU's should be out as well.

Also, there's a PCI card coming out that will take some of the physics calculations off your CPU. You might want to snag one of these as well. I think it's called the PhysX card by Ageia or something. If it does what it claims to (they say that 60+ developers are using it in 100+ games), then it might be better than a dual-GPU setup. We'll see.
 

angry_ducky

Distinguished
Mar 3, 2006
3,056
0
20,790
I like the 7900s, especially the GT, because they're compact, fast, cool, and quiet, don't use much power, and don't have a huge heatsink (at least not the 7900GT).
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
Using AA and HDR together is software-dependent, even for the X1900 cards. True, they can support both, but Oblivion won't let you enable both -- regardless of which card you have. Hopefully, they'll update that in a later patch (the 360 supports both at 720p).

Then again, it looks fine even with just HDR and interpolated, and runs at a very playable framerate. Unless you have 2x7900GTX (currently there are issues with Oblivion and Crossfire, and Bethesda says that the problem is driver-based), you'll have to "settle for less".

I was really set on upgrading from the 7800GTX 256mb to either the 7900GTX or the X1900, but I've finally got Oblivion looking really nice at a playable framerate. I'm hoping that both companies will "fix" their cards' shortcomings with their next GPU releases.

Then again, if they both do, it'll be that much harder to pick between the two...
 

angry_ducky

Distinguished
Mar 3, 2006
3,056
0
20,790
"Angry Ducky, how is the Cavalier case? I'm thinking about using that in my next PC build (next year)."
OK man; two things:
1) PLEASE DON'T FORGET THE UNDERSCORE IN THE MIDDLE OF MY NAME (thanks)
2) The Cavalier is a nice case. I really like the design, and how you don't need a screwdriver to remove a card that's in the PCI slot. Or to install/remove a hard drive.
 
"Also, there's a PCI card coming out that will take some of the physics calcuations out of you CPU. You might want to snag one of these as well. I think it's called the PhysX card by Aegis or something. "

Said physics card will be PCI-e, not old 33 mhz PCI with its sub 1gb/sec throughput...

This will require an SLI mainboard....not sure if it will be feasible on a Crossfire board..
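
For a rough sense of the numbers being thrown around here (a back-of-the-envelope sketch only; these are theoretical peak figures, not real-world throughput), the bus arithmetic works out like this:

    pci_bus_width_bits = 32                    # plain PCI slot
    pci_clock_hz = 33.33e6                     # nominally 33.33 MHz
    pci_peak = pci_bus_width_bits / 8 * pci_clock_hz     # bytes/sec
    pcie_x1_peak = 250e6                       # PCIe 1.x, one lane, per direction
    print(f"PCI 33MHz: ~{pci_peak / 1e6:.0f} MB/s (~{pci_peak * 8 / 1e9:.2f} Gbit/s)")
    print(f"PCIe x1:   ~{pcie_x1_peak / 1e6:.0f} MB/s per direction")

So plain PCI tops out around 133 MB/s shared across the whole bus, versus roughly 250 MB/s per direction for even a single PCIe lane. Whether that's actually a bottleneck for a physics card depends on how much data it has to ship back to the CPU/GPU each frame, which nobody outside Ageia knows yet.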
 
"Very soon, AMD will be releasing their new socket design which is compatible with DDR2 RAM (I think even the 800mhz variety)."

I'm sure hoping with enough bios tweaks that the more expensive M2/DDR2-800 rigs will at least be able to match a PC3200 rig in games! :)
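
The back-of-the-envelope numbers actually back that worry up (a rough sketch assuming typical timings of the era -- DDR400 at CL2 vs DDR2-800 at CL5 -- not measurements from any specific kit):

    def peak_bw_gb_s(data_rate_mt_s, bus_bits=64, channels=2):
        # theoretical dual-channel peak bandwidth in GB/s
        return data_rate_mt_s * 1e6 * (bus_bits / 8) * channels / 1e9

    def cas_latency_ns(data_rate_mt_s, cas_cycles):
        # absolute CAS latency in ns (memory clock is half the data rate for DDR)
        return cas_cycles / (data_rate_mt_s / 2) * 1e3

    print(peak_bw_gb_s(400), cas_latency_ns(400, 2))   # PC3200:   6.4 GB/s, 10.0 ns
    print(peak_bw_gb_s(800), cas_latency_ns(800, 5))   # DDR2-800: 12.8 GB/s, 12.5 ns

Twice the peak bandwidth, but the absolute latency is actually a bit worse, and games tend to care more about latency than raw bandwidth -- which is why matching a good PC3200 rig isn't a given.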
 
"I plan on getting a X2 4800 (which I don't plan on overclocking),"

Drop this to a 4200+, saving $300+ in the process at the expense of only 1.5-2 frames per second in FEAR, but get an SLI mainboard and one single 7900GTX; a single 7900GTX card will render at 1600x1200 with decent framerates, albeit at non-AA/AF settings, which is hardly noticeable at that res (frankly, short of staring at a wire or fence, I can never notice whether AA is on or not anyway in a first-person shooter).

Save the other SLI slot for an upcoming physics card, or a 2nd 7900GTX later, whichever prevails/dominates!

(The high-end systems will use dual 7900GT/GTX GPUs on a single card, i.e., half of a quad-SLI setup, and the other slot will hold a physics co-processor card.)
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
Said physics card will be PCI-e, not old 33 mhz PCI with its sub 1gb/sec throughput...

This will require an SLI mainboard....not sure if it will be feasible on a Crossfire board..

I'm not sure which card you're talking about, but BFG is producing one that incorporates the Ageia Physx technology. The card only requires PCI 2.0.

Apparently, ATi has shown interest in using the processor in a dual-proc videocard. I'm sure nVidia has entertained such notions as well, though neither company has officially announced anything.
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
All three are nice; I'd get the 7900GT because it's the cheapest :).

My monitor's native rez is 1920x1200. While my 7800GTX 256MB card is pretty good, a few games can cripple it without a 512MB buffer (and, of course, a faster GPU). They're mainly games that favor SM3 and/or FSAA. Quake 4 and CoD2, for instance, run very fast with FSAA turned off (you don't need it at 1920x1200, really) and playable with it on. F.E.A.R. and Oblivion, on the other hand, really need a better GPU and more VRAM -- even with FSAA turned off (soft shadows and HDR will disable it as well, but those games still need at least a better GPU).

But the 7900GT is a fine card. The one thing I really like about the 7800GT, 7800GTX and 7900GT is that they're 1-slot solutions. They're generally quieter and run at lower temps. And for a powerful 256mb card, they're pretty cheap. At the high-end range, it's a toss-up between ATi and nVidia (I'm leaning toward ATi these days), but at the mid-high-range (that's a mouthful), I'd stick with nVidia.

And sorry about the missing underscore. On the post page, all the names are underscored.... :D
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
Drop this to a 4200+, saving $300+ in the process at the expense of only 1.5-2 frames per second in FEAR

So you're saying that the difference between a 2.2GHz AMD CPU w/ 512KB L2 cache and a 2.4GHz AMD w/ 1MB L2 cache (per proc) is only a couple of frames in games?

I'm not trying to be smart; I'm actually very curious. I plan on a new PC build next year, and I'd like to get a dual-core CPU. I was looking at the 4800+ (2.4GHz, 1MB L2 cache per proc), but if the 4200+ is almost as fast, I'd like to save a few bucks.

Where did you get your test results? I've been looking for some scores.
 
If you can, wait about 3-4 months when the new GPU's are released. Right now, ATi's X1900 cards seem to perform better (though slightly -- depending on the reports you read) than nVidia's 7900 cards in the SM3 tests.

And anything involving 4XAA pretty much.

Everything else might favor nVidia, but barely. But the X1900 cards run very hot and loud, from what I hear.

At start-up the X1900 is louder than the GF7900, but after 15-30 seconds it's quieter than the GF7900 (it's like a decibel of difference, so who cares?). As for heat, yes, they run hotter, but the X1900's heatsink exhausts that heat out of the case, whereas even the GTX's cooler dumps about 50% of the hot air inside the case and 50% outside.

With a high resolution, you can settle for a lower FSAA sampling rate -- or none at all. In Quake 4, I use no FSAA and hardly see any jaggies, maybe less than 10% of the game. Maybe.

But with efficient AA you can enable it with less of a penalty; it just depends on the setting as to which is preferable: 4xAA or a bump in res.

If, as you say, you don't want to change out your whole PC for the next 2-5 years, then wait until the new mobos are out --

Or get the ASRock AM2-compatible mobo, which you can buy now for that future socket. It might mean running the AM2 chip on plain DDR, but it also means being able to buy now and not wait 3 months.

Also, there's a PCI card coming out that will take some of the physics calculations off your CPU. You might want to snag one of these as well.

Nah, stick to the Havok FX model until all the physics stuff is sorted out so you don't waste money; early adopting that kind of hardware generally goes bad early and takes a while to pay off. In the meantime one GPU like the X1900 can do both, sure at a slight penalty, but that's better than wasting money on either a dual VPU setup or a dedicated engine IMO. Wait until the dust settles and then decide which makes more sense.

EDIT;

So you're saying that the difference between a 2.2GHz AMD CPU w/ 512KB L2 cache and a 2.4GHz AMD w/ 1MB L2 cache (per proc) is only a couple of frames in games?

It can be; it depends on the game. Personally I'd go with the 2.2GHz 4400+ with 1MB cache and OC it. Save the money for elsewhere.
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
If you can, wait about 3-4 months when the new GPU's are released. Right now, ATi's X1900 cards seem to perform better (though slightly -- depending on the reports you read) than nVidia's 7900 cards in the SM3 tests.

And anything involving 4XAA pretty much.

As I said, it depends on the reports you read. Even this website has had mixed reviews about both cards. Check out Gamespot's comparison between 7900GTX SLi and X1900XTX Crossfire as well. From what I've read, the results are inconclusive at best.

With a high resolution, you can settle for a lower FSAA sampling rate -- or none at all. In Quake 4, I use no FSAA and hardly see any jaggies, maybe less than 10% of the game. Maybe.

But with efficient AA you can enable it with less of a penalty; it just depends on the setting as to which is preferable: 4xAA or a bump in res.

My point was that the 7800GTX 256MB or the 7900GT might not be able to handle FSAA, but for its price, the 7900GT is a good buy (I bought my 7800GTX back in August, well before even the X1800s were out). Sure, you could get something to handle 4xAA, but at a cost of $150 or more. At very high rez w/o FSAA in certain games, jaggies are much less noticeable and the framerate will stay very high. Not sure if many people can get a higher bump in rez than 1920x1200....
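
To put some rough numbers on the "4xAA or a bump in res" tradeoff (a ballpark sample-count sketch only -- it ignores compression, shader load, and everything else that matters):

    def samples(width, height, aa=1):
        # pixels times AA samples; a crude proxy for fill-rate/memory cost
        return width * height * aa

    print(samples(1920, 1200))         # ~2.3M samples, no AA
    print(samples(1280, 1024, aa=4))   # ~5.2M samples at 4xAA
    print(samples(1600, 1200, aa=4))   # ~7.7M samples at 4xAA

Running a high resolution with no AA can actually be cheaper, sample-wise, than a lower resolution with 4xAA, which is exactly why living with the occasional jaggy at 1920x1200 is a reasonable compromise on a 256MB card.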

If, as you say, you don't want to change out your whole PC for the next 2-5 years, then wait until the new mobos are out --

Or get the AsRock AM2 compatible MoBo, which you can buy now for that future socket. It might mean running the AM2 on plain DDR, but it also mean being able to buy now, and not wait for 3 months.

Part of the reason to wait for the new socket design is for the DDR2 compatibility. While DDR isn't going anywhere, why not future-proof your PC a bit more at this point? I hear that the delay has something to do with the 800mhz DDR2 modules, though I wouldn't swear to it.

Nah, stick to the Havok FX model until all the physics stuff is sorted out so you don't waste money; early adopting that kind of hardware generally goes bad early and takes a while to pay off. In the meantime one GPU like the X1900 can do both, sure at a slight penalty, but that's better than wasting money on either a dual VPU setup or a dedicated engine IMO. Wait until the dust settles and then decide which makes more sense.

I agree that it's wise to wait a month or two for new hardware to work out at least the initial bugs (since it's a PCI card, there's no rush to get one), but IF (and a big IF here) the Ageia PhysX card is everything that it's supposed to be, then a high-end GPU won't be able to accomplish the same tasks. The PhysX card doesn't just accelerate the CPU's/GPU's calculations; it actually adds physics-based effects that aren't available to stand-alone videocards. You can visit their website to see the difference in Ghost Recon: Advanced Warfighter. Epic has also announced that any game using Unreal Engine 3 will support the PhysX card as well. And the technology isn't anything new: PC manufacturers are already selling it as an upgrade (consumers will have to wait a bit more). Also, the technology has been used in several games in the past -- including the 360 version of G.R.A.W. -- but only as an SDK tool.
 
"I'm not sure which card you're talking about, but BFG is producing one that incorporates the Ageia Physx technology. The card only requires PCI 2.0."

So they are going to be able to perform high speed physics computations and feed this data to the cpu and/or gpu over an ancient PCI/33 Mhz connection? Sounds rather 'PCI bus bottleneck-ish' to me...
 

cleeve

Illustrious
Using AA and HDR together is software-dependent, even for the X1900 cards. True, they can support both, but Oblivion won't let you enable both -- regardless of which card you have. Hopefully, they'll update that in a later patch (the 360 supports both at 720p).

For the record, Oblivion is the ONLY OpenEXR game in which the X1900 is 'prevented' from doing OpenEXR HDR & AA at the same time. It is a game issue... an Oblivion issue.

Unlike the 7x00 series, the X1800/x1900's don't have the limitation in hardware. I have personally used OpenEXR & AA in Far Cry... very pretty stuff.



My recommendation? A single X1900 XT (don't bother with the overpriced XTX) should do the job nicely.

Don't worry about SLI/Xfire; by the time you need another card there will be a single-card solution out that will blow both of your cards away, for cheaper...
Unless you really need to run at 1600+ res with high AA. Then go for both cards now.
 

enewmen

Distinguished
Mar 6, 2005
2,249
5
19,815
Don't forget about having enough RAM and a good, fast hard drive.
That might not give max frame rates, but it will help increase the min fps.
Get at LEAST 1 gig of RAM.
Ultra-low latency with fast timings will help another 5-20%.
A WD Raptor is also a good drive (costly, but you can notice the difference; it speeds up everything).

Also don't forget PhysX. That's very new and only a few games are even rumored to support it, but it should make all the difference by the end of the year.

I personally suggest OC'ing your CPU a little and getting the fastest < $500 video card you can find. "Tough it out" until the end of this year, or hold off for the K10 when that finally ships (then get new everything).
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
So they are going to be able to perform high speed physics computations and feed this data to the cpu and/or gpu over an ancient PCI/33 Mhz connection? Sounds rather 'PCI bus bottleneck-ish' to me...

I thought so too, but after seeing the demo and reading some information on BFG's website, it seems more like it takes the physics calculations totally out of the hands of the CPU and can add even more mustard before feeding it back. It could be that the lower bandwidth is still preferable to dumping all the non-graphical computations on the CPU and bottlenecking there. When you think about it, this card at least represents the next logical step in PC gaming. Maybe we'll eventually see PCI Express or PCI-X (how I wish) versions of the card start to pop up, but for now it's only PCI.

Visit Ageia's (and BFG's) websites and judge for yourself. The cards are being sold, but only by OEMs right now. And the technology isn't exactly new; it's just that this is the first time it's really being implemented in a hardware accessory.

We'll have to wait a couple of months to find out. My guess is that its release will correspond roughly to the release of G.R.A.W. for the PC, since the game openly supports the card. I don't know of any game that's currently out that will support the card yet.

Yes, I'm as skeptical as you are, but I'm also hoping that this card is for real.
 

jrwriter

Distinguished
Feb 14, 2006
49
0
18,530
The X1900XT for $434 is the best choice for resolutions that high. I have a 7900 GT OC'ed at 560MHz/1600MHz and I'm telling you from experience that it won't handle any new games at 1600x1200. You won't be able to turn settings up and run games at that res on a 7900 GT. The 7900 is a great card. But seriously, COD 2, FEAR, Serious Sam 2, NFSMW, and Oblivion will not run at 1600x1200 on a 7900 GT. You'll have to turn a lot of stuff down, and you'll have bad minimum frame rate numbers. Yes, at some parts of a game at that res a 7900 GT will have 70 FPS, but when it becomes crowded it'll drop to 25 at that res. This is all based on experience. There's no point in getting a card that will have 100 fps max in a game and then 20 fps when you get into a gunfight. If you plan on that res, get an X1900XT.
 

trancemitr

Distinguished
Mar 26, 2006
10
0
18,510
Thanks for all of the suggestions. After thinking more about it, I believe I'm going to try to spend less than I'd originally planned for this system and build another system probably about a year from now. The planned system for now would be more like a AMD 4000+, 1900xt, 2GB RAM, and probably a NF4 Ultra board. I figure I won't worry about SLI or CF for now. Maybe on the next system. Any recommendations for boards? I'll need Firewire for some video stuff. Also any recommendations for RAM. I've been looking at stuff that is 2-3-2-5, but I'm unsure on the brands. Thanks.
 

simplyput

Distinguished
Feb 28, 2006
42
0
18,530
Thanks for all of the suggestions. After thinking more about it, I believe I'm going to try to spend less than I'd originally planned for this system and build another system probably about a year from now. The planned system for now would be more like a AMD 4000+, 1900xt, 2GB RAM, and probably a NF4 Ultra board. I figure I won't worry about SLI or CF for now. Maybe on the next system. Any recommendations for boards? I'll need Firewire for some video stuff. Also any recommendations for RAM. I've been looking at stuff that is 2-3-2-5, but I'm unsure on the brands. Thanks.

Good choice; single core is definitely the better choice if you're only doing gaming. I'd recommend a high-end ASUS or DFI LANParty mobo; both are great for gaming/OCing.

As for RAM, if you buy name brand (Patriot, Corsair, etc.) you can probably just buy 2-3-3-6 timed sticks and then lower the latencies yourself. Memory manufacturers use the exact same chips for both kinds of sticks; the 2-2-2-5 sticks just perform better in stress testing. That doesn't mean the other sticks can't hack it, though; typically memory is stress tested 5-10% faster than it is rated for.
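
To see how small the gap really is, here's a quick conversion of those timing numbers into nanoseconds (a hypothetical illustration at DDR400's 200MHz memory clock; the timing sets are examples, not a specific kit):

    clock_ns = 1 / 200e6 * 1e9    # 5 ns per memory clock at DDR400

    for label, (cl, trcd, trp, tras) in [("2-3-3-6", (2, 3, 3, 6)),
                                         ("2-2-2-5", (2, 2, 2, 5))]:
        print(label, [t * clock_ns for t in (cl, trcd, trp, tras)])
    # 2-3-3-6 -> [10.0, 15.0, 15.0, 30.0] ns
    # 2-2-2-5 -> [10.0, 10.0, 10.0, 25.0] ns

The absolute difference is a handful of nanoseconds per access, which is why buying the cheaper sticks and tightening the timings yourself usually gets you most of the way there.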