Rate my $2k Extreme Gaming Build

towely

Distinguished
Aug 15, 2006
250
0
18,780
This is for running demanding games like Oblivion on a 20" widescreen panel @ 1680*1050 with everything maxed. I'd like to OC the proc some as well and have room for additional HDDs in the future. I think the GPU will hold me till DX10.

1 x LG GSA-H10N DVD+RW 16X8X16 DVD-RW 16X6X16 Dual Layer 10X/6X DVD-RAM 12X Writer 2MB Black OEM W/ SW
1 x OCZ GameXStream 700W ATX12V 24PIN SLI Ready Active PFC ATX Power Supply 120MM Fan Black
1 x ATI Radeon X1900XT 625MHZ 512MB 256BIT 1.45GHZ GDDR3 PCI-E Dual DVI-I VIVO HDTV Video Card
1 x Seagate Barracuda 7200.10 320GB SATA2 3GB/S 7200RPM 16MB Cache NCQ Hard Drive
1 x OCZ Platinum XTC PC2-6400 2GB 2X1GB DDR2-800 CL4-5-4-15 240PIN DIMM Dual Channel Memory Kit
1 x Gigabyte GA-965P-DS3 ATX LGA775 Conroe P965 DDR2 PCI-E16 3PCI-E1 3PCI SATA2 GBLAN Audio Motherboard
1 x Intel Core 2 Duo E6600 Dual Core Processor LGA775 Conroe 2.40GHZ 1066FSB 4MB Retail *Limit 1 / Cust*
1 x Microsoft Windows XP Home Edition OEM
1 x Thermaltake Armor VA8003SWA Silver ATX 11X5.25 6X3.5INT No PS W/ Window And 25CM Fan

Total: $1,999.39 Canadian.
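
On the overclocking side, here's a rough sketch of the FSB math (assuming the E6600's stock 9x multiplier and its 266 MHz base FSB, which is what the 2.40GHZ/1066FSB spec works out to):

    # Core clock = FSB x multiplier. The E6600 runs a 9x multiplier on a
    # 266 MHz FSB (1066 MT/s quad-pumped), i.e. roughly 2.4 GHz stock.
    multiplier = 9
    for fsb_mhz in (266, 300, 333, 366):
        core_ghz = fsb_mhz * multiplier / 1000.0
        print("FSB %3d MHz -> ~%.2f GHz core (%d MT/s effective)"
              % (fsb_mhz, core_ghz, fsb_mhz * 4))
    # Pushing the FSB from 266 to 333 MHz would put the chip around 3.0 GHz
    # on the stock multiplier, assuming the board, RAM, and cooling keep up.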
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
While the X1900XT is a great card, I'm afraid it'll have a difficult time running Oblivion maxed at 1680x1050; you'll need Crossfire for that resolution, and even then performance probably won't be perfect (if you plan on running HDR with antialiasing).
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
Thanks for the response, Heyyou. I've been looking at benchmarks and it seems like I could just squeeze out about 25-30fps (outdoor, of course) with HDR and 2-4x AA and 8-16xAF. I might wait until the x1950xt comes out and see how that performs.

I have pretty much ruled out SLI/xfire as it just is not worth the money, considering even the fastest dual GPU configuration will be outperformed by a newer single card at nominal resolutions.

Regardless, it will be a hell of a lot faster than my 6800u.
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
Yeah... Does it piss anyone else off that Oblivion was designed for hardware that doesn't even exist yet?

Update: I currently have one PC with Pro and one with Home and honestly don't see any real difference for just gaming. I don't know if it's worth the 80 bucks or whatever when I plan to upgrade to Vista within 6 months or so.

What practical benefits do I gain with Pro over Home?

Thanks again.

I dropped down to the 600W GameXStream PSU and shaved my price to $1,956 CAD. Not bad for a top-end Core 2 rig.
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
I've read that 25fps is acceptable for Oblivion. Anyway, anything over 24fps is smooth. Movies aren't shown at 100fps for a reason. The human eye can't detect anything beyond that, and if you think you can, you are deceiving yourself.

As long as it isn't dropping below 25fps, I will be happy.
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
How would 25fps be jumpy as hell? If it were dropping below this, I would agree, but if 25fps is the min frame rate then that is very smooth.

I get about 15-20fps with my 6800 ultra outdoors with most settings on medium to high and it is playable if not completely smooth. The only time it bogs down is when a lot of enemies appear onscreen.

I have seen performance graphs of the x1900xt in Oblivion and rarely does it drop below 25fps even at high settings and resolutions.

I do not need 60fps for Oblivion. Hell, I don't need 60 fps for any game.
 

i_am_not_god

Distinguished
Feb 21, 2005
90
0
18,630
Dunno what you're talking about, but the difference (to me) between 25fps and 40fps is pretty huge. I should know, since I've got an X300; once you get below 25fps it really does get choppy.
The X1900XT is a great card, and I'm planning on getting one too, but it won't handle Oblivion at that res on max.
And as for this guy calling 15-25fps "completely smooth", you might want to stay off the beer?
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
From what I've read 25fps seems to be about the minimum average during intensive situations outdoors. During lulls it is more around 40-45fps which is plenty for me.

Iamnotgod, you should read my last post again. I said 15-20fps is "PLAYABLE IF NOT COMPLETELY SMOOTH." Failing to comprehend a slightly complex sentence ftl. Also, I don't drink :p

Anything above 24fps is smooth motion. Film and animation are both screened at 24fps. There is no such thing as a difference between smooth and "super smooth", since at that speed your brain can't notice the lag between frames anyway. The reason people want a buffer is so that it NEVER drops below 24fps even during times of intense action. That is what I desire. Demanding 40+fps constantly is just silly and pointless.
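
If anyone wants the raw frame-time numbers behind that, here's a quick back-of-the-envelope sketch (purely illustrative; the fps figures are just the ones we've been throwing around in this thread):

    # Frame time = how long each frame sits on screen.
    for fps in (15, 20, 24, 25, 30, 40, 60):
        print("%2d fps -> %5.1f ms per frame" % (fps, 1000.0 / fps))
    # 24 fps works out to about 41.7 ms per frame (film's frame time),
    # while 60 fps is about 16.7 ms per frame.

So the whole debate really comes down to whether your eye can tell ~42 ms per frame from ~17 ms per frame.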

PS: Not to flame, but I highly doubt you guys are playing Oblivion anyway, with your anemic X300 and 7300 GPUs. No offense :p
 

waylander

Distinguished
Nov 23, 2004
1,649
0
19,790
Thanks for the response, Heyyou. I've been looking at benchmarks and it seems like I could just squeeze out about 25-30fps (outdoor, of course) with HDR and 2-4x AA and 8-16xAF. I might wait until the x1950xt comes out and see how that performs.

I have pretty much ruled out SLI/xfire as it just is not worth the money, considering even the fastest dual GPU configuration will be outperformed by a newer single card at nominal resolutions.

Regardless, it will be a hell of a lot faster than my 6800u.

I have to point out that you said "nominal resolutions", but your resolution is not nominal. 1680x1050 is going to be pretty hard to drive with all the eye candy on with a single card. If you look at Tom's interactive chart, you'll see that two 6800 Ultras in SLI beat one 7800 GTX at 1600x1200 in FEAR; drop that to 1024x768 and the 7800 GTX beats the 6800 Ultras by almost double.

So you're right that a next-gen card will beat an old SLI rig at lower resolutions, but in your case two 7800 GTs will beat one 7900 GTX at higher resolutions, and I'm going to assume the same is true for Crossfire (two X1800 XTs will beat one X1900 XTX at high resolution).
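
To put rough numbers on the idea, here's a quick sketch; the fps figures and scaling factors below are completely made up just to illustrate the point, they are not benchmark results:

    # Invented figures only: why dual-GPU scaling matters more at high resolution.
    single_new  = {"1024x768": 120, "1600x1200": 55}   # newer single card, fps
    older_card  = {"1024x768": 70,  "1600x1200": 38}   # one older card, fps
    sli_scaling = {"1024x768": 1.3, "1600x1200": 1.8}  # low res is CPU-bound, so SLI scales poorly there

    for res in ("1024x768", "1600x1200"):
        sli_fps = older_card[res] * sli_scaling[res]
        print("%s: newer single card %d fps vs older SLI pair %.1f fps"
              % (res, single_new[res], sli_fps))
    # At 1024x768 the single newer card wins (120 vs 91.0);
    # at 1600x1200 the SLI pair pulls ahead (68.4 vs 55).

The exact numbers don't matter; the point is that the higher the resolution, the more the load shifts onto the GPUs and the better two older cards look next to one faster card.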
 

Gadzookie

Distinguished
Aug 2, 2006
32
0
18,530
If you're not waiting for Vista then you may want to go with Pro. Microsoft will be dropping support for Home here in a couple of months. So if there are any other security issues discovered (I'm sure there will be), they will not be releasing an update for it.

WHAT??? Where did you hear Microsoft will drop support for Home?

Pro doesn't have a lot of features you'd actually use; personally, I'd stick with Home.
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
I have to point out that you said "nominal resolutions", but your resolution is not nominal. 1680x1050 is going to be pretty hard to drive with all the eye candy on with a single card. If you look at Tom's interactive chart, you'll see that two 6800 Ultras in SLI beat one 7800 GTX at 1600x1200 in FEAR; drop that to 1024x768 and the 7800 GTX beats the 6800 Ultras by almost double.

So you're right that a next-gen card will beat an old SLI rig at lower resolutions, but in your case two 7800 GTs will beat one 7900 GTX at higher resolutions, and I'm going to assume the same is true for Crossfire (two X1800 XTs will beat one X1900 XTX at high resolution).

These days 1680*1050 is a nominal resolution. SLI doesn't really come into its own until 1920+, at least not to the point where it'd be worth it over a single card. That said, I would be willing to consider a 7950 or 1950. I just think two discrete GPUs are the epitome of superfluous. Anyone remember Voodoo SLI? Yeah, that lasted long. I plan to upgrade my GPU in less than 6 months anyway.

By the way, the FEAR benchmark is not the most realistic in terms of across-the-board SLI performance. It is, BY FAR, the most optimized game for SLI and the one everyone uses to show off their SLI scores. Show me Oblivion benchmarks with the same increase and then I'll be impressed.
 

sepuko

Distinguished
Dec 13, 2005
224
32
18,710
I'm not exactly sure if my advice is any good. BUT. Switch the RAM for some good 2x1GB DDR2-533 sticks and run them in dual channel. For a little more cash, plus the money you saved on the RAM, get a 7900 GX2. There you have a graphics solution that will give you your 24fps minimum in Oblivion. I'm not sure about the resolution, though.
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
I'm not exactly sure if my advice is any good. BUT. Switch the RAM for some good 2x1GB DDR2-533 sticks and run them in dual channel. For a little more cash, plus the money you saved on the RAM, get a 7900 GX2. There you have a graphics solution that will give you your 24fps minimum in Oblivion. I'm not sure about the resolution, though.

I can get a very good price on the OCZ RAM, so that's not really an issue.

I will definitely consider a 7950 GX2; however, I'd like to see how it stacks up against ATI's X1950 first. Then again, the X1900XT is only 400 bucks Canadian these days and a very good interim card until DX10.
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
Good for your friend, but I can think of a lot better ways to waste my money.

For a gaming computer, once you're past 2 grand, you're entering the realm of "spending money for the sake of spending money".

More power to him, though. If I was dropping 2800 bones on a desktop alone, I wouldn't have as much to spend on a shiny new MacBook Pro.

In the end, I'll have a desktop that runs games at least as well as his and a sexy sleek notebook to tote to class. That seems like a better deal, hmm?
 

corvetteguy

Distinguished
Jan 15, 2006
1,545
0
19,780
He'll be fine with 25fps. It may not be completely smooth but definitely playable. Heck, I played Unreal at 10fps average, 6 min, 20 max. 8O And even that didn't "jump".
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
I'm sure I can make do with the current fastest single-GPU solution (X1900XT(X)) until DX10 games rear their (hopefully gorgeous) heads. Even if it means only running 2-4xAA instead of 8x :p
 
How would 25fps be jumpy as hell? If it were dropping below this, I would agree, but if 25fps is the min frame rate then that is very smooth.

I get about 15-20fps with my 6800 ultra outdoors with most settings on medium to high and it is playable if not completely smooth. The only time it bogs down is when a lot of enemies appear onscreen.

I have seen performance graphs of the x1900xt in Oblivion and rarely does it drop below 25fps even at high settings and resolutions.

I do not need 60fps for Oblivion. Hell, I don't need 60 fps for any game.

The typical threshold for no dropped frames is 26-30fps (depending on which standard you go by). Anything lower starts to drop frames, and the lower it goes, the more frames get dropped (hence the "lag look"). As for 60fps: for ONLINE gaming it gets you better kills (more accurate) than 30fps, since it reports the other player's position more often. If you're not playing online games, it makes little difference. But as a rule, you don't want a low average fps, since when there's a lot of activity on the screen, the fps will drop.
The better build is a cheaper CPU (the extra speed doesn't help in games) and a motherboard running SLI (or Crossfire) video cards (TWO video cards). Also, do NOT use an LCD monitor (a true gaming LCD is very rare; check Tom's Hardware for proof); the true gamer needs a CRT monitor for no "ghosting" and better images.
 

f1nal_0men

Distinguished
Feb 26, 2006
367
0
18,780
This is for running demanding games like Oblivion on a 20" widescreen panel @ 1680*1050 with everything maxed. I'd like to OC the proc some as well and have room for additional HDDs in the future. I think the GPU will hold me till DX10.

1 x LG GSA-H10N DVD+RW 16X8X16 DVD-RW 16X6X16 Dual Layer 10X/6X DVD-RAM 12X Writer 2MB Black OEM W/ SW
1 x OCZ GameXStream 700W ATX12V 24PIN SLI Ready Active PFC ATX Power Supply 120MM Fan Black
1 x ATI Radeon X1900XT 625MHZ 512MB 256BIT 1.45GHZ GDDR3 PCI-E Dual DVI-I VIVO HDTV Video Card
1 x Seagate Barracuda 7200.10 320GB SATA2 3GB/S 7200RPM 16MB Cache NCQ Hard Drive
1 x OCZ Platinum XTC PC2-6400 2GB 2X1GB DDR2-800 CL4-5-4-15 240PIN DIMM Dual Channel Memory Kit
1 x Gigabyte GA-965P-DS3 ATX LGA775 Conroe P965 DDR2 PCI-E16 3PCI-E1 3PCI SATA2 GBLAN Audio Motherboard
1 x Intel Core 2 Duo E6600 Dual Core Processor LGA775 Conroe 2.40GHZ 1066FSB 4MB Retail *Limit 1 / Cust*
1 x Microsoft Windows XP Home Edition OEM
1 x Thermaltake Armor VA8003SWA Silver ATX 11X5.25 6X3.5INT No PS W/ Window And 25CM Fan

Total: $1,999.39 Canadian.

Wow, almost identical to the build I wanna get, except the case. I'm going for the Gigabyte Aurora because I want the possibility of water cooling. Plus it uses 3x 120mm fans, so it's really quiet. Other than that... awesome build. I recommend XP Pro though.
 

hball

Distinguished
Jun 10, 2006
86
0
18,630
From what I've read 25fps seems to be about the minimum average during intensive situations outdoors. During lulls it is more around 40-45fps which is plenty for me.

Iamnotgod, you should read my last post again. I said 15-20fps is "PLAYABLE IF NOT COMPLETELY SMOOTH." Failing to comprehend a slightly complex sentence ftl. Also, I don't drink :p


Don't accept anything below 30 FPS. That is the limit you should stick to. Above 30 FPS, the human eye cannot detect any difference. Trouble is, there will be times and areas in Oblivion where those frames do drop lower, so try a resolution that gets you as close to 60 fps as possible.

In other words, stick with the X1900XT but lower the resolution so your frames stay above 30 fps and hopefully closer to 60 fps.

hball
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
Yes, from what I've read it seems that 1280*1024 is the max res for 100% smooth gameplay with everything MAXED. However, my 20" widescreen LCD's native rez is 1680*1050, and anything less looks... well, less spectacular :p
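
Just to put a number on how much more work the native res asks of the card (a simple pixel-count comparison, so it ignores shader load and everything else):

    # Pixels per frame at the two resolutions under discussion.
    res_smooth = 1280 * 1024    # 1,310,720 pixels
    res_native = 1680 * 1050    # 1,764,000 pixels
    print("1680x1050 pushes %.0f%% more pixels per frame than 1280x1024"
          % (100.0 * (res_native - res_smooth) / res_smooth))
    # -> roughly 35% more pixels per frame

So the card is rendering roughly a third more pixels every frame at native res, which goes a long way toward explaining why the benchmarks fall off between those two settings.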

Might have to go for a 7950 or 1950 then if I want to game at my panel's native rez.

The XP Home vs. Pro debate rages on. I might just end up copping Pro since it's not exactly a huge cost increase.

As for CRT vs. LCD, we can debate that till the cows come home. However, after switching from a relatively high-quality 17" CRT to a 20" LCD, I can't see myself going back. Now if those 30" panels would only start coming down in price (coupled with GPUs capable of pumping out decent frames at 2560x1600), then we can start talking about true cinematic gaming :)

Edit: Also, what do you guys think about onboard audio vs. a $100 X-Fi XtremeMusic? Will I gain any quality/fps from a discrete sound card? I've been hearing a lot about onboard sound lately and just want to know whether a discrete card will actually improve sound quality and gaming performance. I don't do anything "sound related" on my computer except listen to iTunes and edit the occasional home video.

Thanks again.
 

f1nal_0men

Distinguished
Feb 26, 2006
367
0
18,780
You will see an increase in fps using a sound card, but will it be noticeable? Probably not. But the features of the X-Fi (EAX5 support, great software, plus a bunch of other goodies like the 24-bit Crystalizer and the 2.0 upmix, which works really well) make it worth getting the X-Fi XM in my opinion. It's what I use with my Z-5500s, and I can't live without it now. I don't see justifying the extra money for the front-bay control panel, but the RCA, coaxial, and optical inputs/outputs are definitely something to consider (seeing as the X-Fi XM only has analog inputs/outputs). In short, is it worth it? Yes. Playing Prey with EAX5 is something that can only be experienced, not described.
 

towely

Distinguished
Aug 15, 2006
250
0
18,780
Thanks, omen. I also have a set of Z-5500s, so I hope to do them justice with the new setup.

Now the final question.

Grab the x1900xt or wait a month to evaluate the x1950?