Rate my $2k Extreme Gaming Build

August 15, 2006 3:41:46 PM

This is for running demanding games like Oblivion on a 20" widescreen panel @ 1680*1050 with everything maxed. I'd like to OC the proc some as well and have room for additional HDDs in the future. I think the GPU will hold me till DX10.

1 x LG GSA-H10N DVD+RW 16X8X16 DVD-RW 16X6X16 Dual Layer 10X/6X DVD-RAM 12X Writer 2MB Black OEM W/ SW
1 x OCZ GameXStream 700W ATX12V 24PIN SLI Ready Active PFC ATX Power Supply 120MM Fan Black
1 x ATI Radeon X1900XT 625MHZ 512MB 256BIT 1.45GHZ GDDR3 PCI-E Dual DVI-I VIVO HDTV Video Card
1 x Seagate Barracuda 7200.10 320GB SATA2 3GB/S 7200RPM 16MB Cache NCQ Hard Drive
1 x OCZ Platinum XTC PC2-6400 2GB 2X1GB DDR2-800 CL4-5-4-15 240PIN DIMM Dual Channel Memory Kit
1 x Gigabyte GA-965P-DS3 ATX LGA775 Conroe P965 DDR2 PCI-E16 3PCI-E1 3PCI SATA2 GBLAN Audio Motherboard
1 x Intel Core 2 Duo E6600 Dual Core Processor LGA775 Conroe 2.40GHZ 1066FSB 4MB Retail *Limit 1 / Cust*
1 x Microsoft Windows XP Home Edition OEM
1 x Thermaltake Armor VA8003SWA Silver ATX 11X5.25 6X3.5INT No PS W/ Window And 25CM Fan

Total: $1,999.39 Canadian.
August 15, 2006 4:18:41 PM

While the X1900XT is a great card, I'm afraid it'll have a difficult time running Oblivion maxed at 1680x1050; you'll need Crossfire for that resolution, and even then performance probably won't be perfect (if you plan on running HDR with antialiasing).
August 15, 2006 5:18:33 PM

Thanks for the response, Heyyou. I've been looking at benchmarks and it seems like I could just squeeze out about 25-30fps (outdoor, of course) with HDR and 2-4x AA and 8-16xAF. I might wait until the x1950xt comes out and see how that performs.

I have pretty much ruled out SLI/xfire as it just is not worth the money, considering even the fastest dual GPU configuration will be outperformed by a newer single card at nominal resolutions.

Regardless, it will be a hell of a lot faster than my 6800u.
August 15, 2006 5:22:39 PM

Yeah... Does it piss anyone else off that Oblivion was designed for hardware that doesn't even exist yet?

Update: I currently have 1 PC with Pro and 1 with Home and honestly don't see any real difference for just gaming. I don't know if it's worth the 80 bucks or whatever when I plan to upgrade to Vista within 6 months or so.

What practical benefits do I gain with Pro over Home?

Thanks again.

I dropped down to the 600W GameXStream PSU and shaved my price to $1,956 CAD. Not bad for a top-end Core 2 rig.
August 15, 2006 5:24:23 PM

I've read that 25fps is acceptable for Oblivion. Anyway, anything over 24fps is smooth. Movies aren't shown at 100fps for a reason: the human eye can't detect anything beyond that, and if you think you can, you are deceiving yourself.

As long as it isn't dropping below 25fps, I will be happy.
August 15, 2006 5:27:37 PM

LOL!!!
August 15, 2006 5:41:48 PM

How would 25fps be jumpy as hell? If it were dropping below this, I would agree, but if 25fps is the min frame rate then that is very smooth.

I get about 15-20fps with my 6800 ultra outdoors with most settings on medium to high and it is playable if not completely smooth. The only time it bogs down is when a lot of enemies appear onscreen.

I have seen performance graphs of the x1900xt in Oblivion and rarely does it drop below 25fps even at high settings and resolutions.

I do not need 60fps for Oblivion. Hell, I don't need 60 fps for any game.
August 15, 2006 5:56:20 PM

Dunno what you're talking about, but the difference (to me) between 25fps and 40fps is pretty huge. I should know, since I've got an X300; once you get below 25fps it really does get choppy.
The 1900XT is a great card, I'm planning on getting one too, but it won't handle Oblivion at that res on max.
And as for this guy calling 15-25fps "completely smooth", you might want to stay off the beer?
August 15, 2006 6:02:28 PM

From what I've read 25fps seems to be about the minimum average during intensive situations outdoors. During lulls it is more around 40-45fps which is plenty for me.

Iamnotgod, you should read my last post again. I said 15-20fps is "PLAYABLE IF NOT COMPLETELY SMOOTH." Failing to comprehend a slightly complex sentence ftl. Also, I don't drink :p

Anything above 24fps is smooth motion. Film and animation are both screened at 24fps. There is no such thing as a difference between smooth and "super smooth", as at that speed your brain can't notice the lag between frames anyway. The reason people want buffers is so that it NEVER drops below 24fps even during times of intense action. That is what I desire. Demanding 40+fps constantly is just silly and pointless.

PS: Not to flame, but I highly doubt you guys are playing Oblivion anyway, with your anemic X300 and 7300 GPUs. No offense :p
August 15, 2006 6:07:31 PM

Quote:
Thanks for the response, Heyyou. I've been looking at benchmarks and it seems like I could just squeeze out about 25-30fps (outdoor, of course) with HDR and 2-4x AA and 8-16xAF. I might wait until the x1950xt comes out and see how that performs.

I have pretty much ruled out SLI/xfire as it just is not worth the money, considering even the fastest dual GPU configuration will be outperformed by a newer single card at nominal resolutions.

Regardless, it will be a hell of a lot faster than my 6800u.


I have to point out that you said "nominal resolutions", and your resolution is not nominal. 1680*1050 is going to be pretty hard to drive with all the eye candy on with a single card. If you look at Tom's interactive chart you'll see that two 6800 Ultras in SLI beat one 7800 GTX at 1600x1200 (FEAR); drop that to 1024x768 and the 7800 GTX kills the 6800 Ultras by almost double.

So you are right that at lower resolutions a next-gen card will beat an old SLI rig, but in your case two 7800GTs will beat one 7900GTX at higher resolutions, and I'm going to assume the same is true for Crossfire (two X1800XTs will beat one X1900XTX at high resolution).
August 15, 2006 6:14:28 PM

Quote:
If you're not waiting for Vista then you may want to go with Pro. Microsoft will be dropping support for Home here in a couple of months, so if any other security issues are discovered (I'm sure there will be), they will not be releasing an update for it.


WHAT??? Where did you hear Microsoft will drop support for Home?

Pro doesn't add a lot of useful features; personally, I'd say stick with Home.
August 15, 2006 7:10:32 PM

Quote:

I have to point out that you said "nominal resolutions", and your resolution is not nominal. 1680*1050 is going to be pretty hard to drive with all the eye candy on with a single card. If you look at Tom's interactive chart you'll see that two 6800 Ultras in SLI beat one 7800 GTX at 1600x1200 (FEAR); drop that to 1024x768 and the 7800 GTX kills the 6800 Ultras by almost double.

So you are right that at lower resolutions a next-gen card will beat an old SLI rig, but in your case two 7800GTs will beat one 7900GTX at higher resolutions, and I'm going to assume the same is true for Crossfire (two X1800XTs will beat one X1900XTX at high resolution).


These days 1680*1050 is a nominal resolution. SLI doesn't really come into its own until 1920+, at least not to the point where it'd be worth it over a single card. That said, I would be willing to consider a 7950 or 1950. I just think two discrete GPUs are the epitome of superfluous. Anyone remember Voodoo SLI? Yeah, that lasted long. I plan to upgrade my GPU in less than 6 months anyway.

By the way, the FEAR benchmark is not the most realistic in terms of across-the-board SLI performance. It is, FAR AND AWAY, the most optimized game for SLI and the one everyone uses to show off their SLI scores. Show me Oblivion benchmarks with the same increase and then I'll be impressed.
August 15, 2006 7:19:41 PM

nice rig, but what is Oblivion?
August 15, 2006 7:20:53 PM

I'm not exactly sure if my advice is any good, BUT: switch the RAM for some good 2x1GB DDR2-533 sticks and run them in dual channel. For a little more cash plus the money you saved on the RAM, get a 7900GX2. There you have a graphics solution that will give you your 24fps min in Oblivion. I'm not sure about the resolution, though.
August 15, 2006 7:33:40 PM

Quote:
I'm not exactly sure if my advice is any good, BUT: switch the RAM for some good 2x1GB DDR2-533 sticks and run them in dual channel. For a little more cash plus the money you saved on the RAM, get a 7900GX2. There you have a graphics solution that will give you your 24fps min in Oblivion. I'm not sure about the resolution, though.


I can get a very good price on the OCZ RAM so that's not an issue, really.

I will definitely consider a 7950GX2; however, I'd like to see how it stacks up vs. the ATI X1950 first. Though the X1900XT is only 400 bucks Canadian these days and a very good interim card until DX10.
August 15, 2006 8:24:33 PM

Good for your friend, but I can think of a lot better ways to waste my money.

For a gaming computer, once you're past 2 grand, you're entering the realm of "spending money for the sake of spending money".

More power to him, though. If I was dropping 2800 bones on a desktop alone I wouldn't have as much to spend on a shiny new MacBook Pro.

In the end, I'll have a desktop that runs games at least as well as his and a sexy sleek notebook to tote to class. That seems like a better deal, hmm?
August 15, 2006 8:39:14 PM

He'll be fine with 25fps. It may not be completely smooth but it's definitely playable. Heck, I played Unreal at 10fps average (6 min, 20 max) 8O and even that didn't "jump".
August 15, 2006 8:46:36 PM

I'm sure I can make do with the current fastest single-GPU solution (X1900XT(X)) until DX10 games rear their (hopefully gorgeous) heads. Even if it means only running 2-4xAA instead of 8x :p
August 15, 2006 8:51:36 PM

Quote:
How would 25fps be jumpy as hell? If it were dropping below this, I would agree, but if 25fps is the min frame rate then that is very smooth.

I get about 15-20fps with my 6800 ultra outdoors with most settings on medium to high and it is playable if not completely smooth. The only time it bogs down is when a lot of enemies appear onscreen.

I have seen performance graphs of the x1900xt in Oblivion and rarely does it drop below 25fps even at high settings and resolutions.

I do not need 60fps for Oblivion. Hell, I don't need 60 fps for any game.


The typical threshold for no dropped frames is 26-30fps (depending on which standard). Anything lower starts to drop frames, and the lower it goes, the more frames get dropped (hence the "lag look"). As for 60fps: for ONLINE gaming it allows you better kills (more accurate) than 30fps, since it reports the other player's position better; if you're not playing online games, it makes little difference. But as a rule, you don't want a low average fps, since if you get a lot of activity on the screen, the fps will drop.
The better build is a cheaper CPU (a faster one doesn't help in games) and a motherboard with SLI (or Crossfire) video cards (TWO video cards). Also, do NOT use LCD monitors (it's very rare to get a true gaming LCD monitor; check Tom's Hardware for proof); the true gamer needs a CRT monitor, so there's no "ghosting" and the images are better.
August 15, 2006 9:16:35 PM

Quote:
This is for running demanding games like Oblivion on a 20" widescreen panel @ 1680*1050 with everything maxed. I'd like to OC the proc some as well and have room for additional HDDs in the future. I think the GPU will hold me till DX10.

1 x LG GSA-H10N DVD+RW 16X8X16 DVD-RW 16X6X16 Dual Layer 10X/6X DVD-RAM 12X Writer 2MB Black OEM W/ SW
1 x OCZ GameXStream 700W ATX12V 24PIN SLI Ready Active PFC ATX Power Supply 120MM Fan Black
1 x ATI Radeon X1900XT 625MHZ 512MB 256BIT 1.45GHZ GDDR3 PCI-E Dual DVI-I VIVO HDTV Video Card
1 x Seagate Barracuda 7200.10 320GB SATA2 3GB/S 7200RPM 16MB Cache NCQ Hard Drive
1 x OCZ Platinum XTC PC2-6400 2GB 2X1GB DDR2-800 CL4-5-4-15 240PIN DIMM Dual Channel Memory Kit
1 x Gigabyte GA-965P-DS3 ATX LGA775 Conroe P965 DDR2 PCI-E16 3PCI-E1 3PCI SATA2 GBLAN Audio Motherboard
1 x Intel Core 2 Duo E6600 Dual Core Processor LGA775 Conroe 2.40GHZ 1066FSB 4MB Retail *Limit 1 / Cust*
1 x Microsoft Windows XP Home Edition OEM
1 x Thermaltake Armor VA8003SWA Silver ATX 11X5.25 6X3.5INT No PS W/ Window And 25CM Fan

Total: $1,999.39 Canadian.


Wow, almost identical to the build I wanna get, except the case. I'm going for the Gigabyte Aurora because I want the possibility of water cooling. Plus it uses 3x 120mm fans, so it's really quiet. Other than that... awesome build. I recommend XP Pro though.
August 15, 2006 9:31:49 PM

Quote:
From what I've read 25fps seems to be about the minimum average during intensive situations outdoors. During lulls it is more around 40-45fps which is plenty for me.

Iamnotgod, you should read my last post again. I said 15-20fps is "PLAYABLE IF NOT COMPLETELY SMOOTH." Failing to comprehend a slightly complex sentence ftl. Also, I don't drink :p

Don't accept anything below 30 FPS; that is the limit you should stick to. Above 30 FPS, the human eye cannot detect any difference. Trouble is, there will be times and areas of Oblivion where those frames do drop lower, so try a resolution that gets you as close to 60 fps as you can.

In other words, stick with the X1900XT but lower the resolution so your frames stay above 30 fps and hopefully closer to 60 fps.

hball
August 15, 2006 9:50:05 PM

Yes, from what I've read it seems that 1280*1024 is the max res for 100% smooth gameplay with everything MAXED. However, my 20" widescreen LCD's native rez is 1680*1050 and anything less looks... well, less spectacular :p

Might have to go for a 7950 or 1950 then if I want to game at my panel's native rez.

The XP Home vs. Pro debate rages on. I might just end up copping Pro since it's not exactly a huge cost increase.

As for CRT vs LCD, we can debate that till the cows come home. However, after switching from a relatively high-quality 17" CRT to a 20" LCD, I can't see myself going back. Now if those 30" panels would only start coming down in price (coupled with GPUs capable of pumping out decent frames at 2560x1600), we can start talking about true cinematic gaming :)

Edit: Also, what do you guys think about onboard audio vs. a $100 X-Fi Music solution? Will I gain any quality or fps from a discrete sound card? I've been hearing a lot about onboard sound lately and just want to know whether a discrete card will increase sound quality and gaming performance. I don't do anything "sound related" on my computer except listen to iTunes and edit the occasional home video.

Thanks again.
August 15, 2006 10:26:53 PM

You will see an increase in fps using a sound card, but will it be noticeable? Probably not. But the X-Fi's features make it worthwhile: EAX5 support, great software, plus a bunch of other goodies like the 24-bit Crystalizer and 2.0 upmix (works really well). It is worth getting the X-Fi XM in my opinion. It's what I use with my Z-5500, and I cannot live without it now. I don't see justifying the extra money for the front-bay control panel, but the RCA, coaxial, and optical inputs/outputs are definitely something to consider (seeing as the X-Fi XM only has analog inputs/outputs). In short, is it worth it? Yes. Playing Prey with EAX5 is something that can only be experienced, not described.
August 15, 2006 11:41:48 PM

Thanks, omen. I also have a set of Z-5500s, so I hope to do them justice with the new setup.

Now the final question.

Grab the x1900xt or wait a month to evaluate the x1950?
August 16, 2006 4:19:45 AM

Don't get an XtremeMusic card. Get the Auzentech X-Plosion if you want the best experience from your Z-5500s. The speakers are digital and so is that card, unlike the Creative. :wink:
August 16, 2006 9:02:51 AM

As far as I know, XP Pro is better at handling multiple CPU cores than Home edition?

Anyway, the graphics card (while great, I have one) will definitely be a weak link in a system designed for high resolutions with HDR/AA/AF.

You need some serious (read: Crossfire) grunt to demolish Oblivion in this way.

Single-card solution? 7950GX2, or wait for the X1950XTX.

The price is high, but it's certainly the kind of card I would put in that system.
August 16, 2006 9:43:44 AM

Scalable processor support – up to two-way multi-processor support.

XP Home NO

XP Pro YES

In short, anyone who uses a dual-CPU/multi-core setup with XP Home is a true dumbass.
August 16, 2006 10:34:16 AM

Wrong wrong wrong wrong WRONG, DaveUK! Windows XP Home supports dual-core processors; what it doesn't support is multiple sockets. You only have to look at Task Manager to see the load being distributed over the two cores. Please don't insult people if you aren't sure of the facts.
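
If you want to check it yourself rather than take my word for it, here's a rough sketch (Python, purely illustrative; the 10-second burn time is an arbitrary choice) that puts a busy loop on every core the OS reports. Run it on XP Home with a dual-core chip and watch Task Manager light up both cores:

```python
# Rough sketch: load every logical CPU with a busy loop, then watch
# Task Manager to confirm the OS schedules work across both cores.
import multiprocessing
import time

def burn(seconds):
    """Spin the CPU for roughly `seconds` seconds (one core's worth of load)."""
    end = time.time() + seconds
    while time.time() < end:
        pass  # pure busy-wait

if __name__ == "__main__":
    cores = multiprocessing.cpu_count()  # logical CPUs the OS exposes
    print("OS reports %d logical CPU(s)" % cores)
    workers = [multiprocessing.Process(target=burn, args=(10,))
               for _ in range(cores)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```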

Also, every frame per second counts for smoothness in games. The human eye sees about 10 frames per second (or somewhere around that). If you move your head fast from side to side (disclaimer: don't do it too much) you will see everything blurred. This is the exposure time of your eye: as the light changes over a tenth of a second, you get a mix of colours. Despite only updating every tenth of a second, the eye can still see the jump when an image changes fast during gameplay. The smaller the jump (less activity) and the more frequent the jump (more frames per second), the less noticeable this effect.

Movies look smooth because the film (and digital sensors) have an exposure time upon capture. This is why fast action scenes in film have motion blur, even if you look at them frame by frame. It just so happens that at 24fps the movie keeps the blur down. However, camera operators are still advised not to make fast camera movements, as you get the smeared-painting effect that makes people feel sick.

I personally find that over 40 frames per second is fine for first-person shooters and over 10 frames per second is fine for strategy games. Everyone can notice the difference between 60 frames per second and 75 frames per second: if you have a CRT, just look at the different refresh rates out of the corner of your eye. You'll see a flickering with both, but much more at a 60Hz refresh.

Incidentally, LCDs don't have this flickering, as the image isn't redrawn each refresh; only the pixels that need changing are updated. This is the cause of the ghosting effect that looks a bit like motion blur. I find ghosting makes it really difficult to recognise friend from foe when they are just a blur, though it looks a bit more realistic.
August 16, 2006 10:45:58 AM

Quote:
Don't get an XtremeMusic card. Get the Auzentech X-Plosion if you want the best experience from your Z-5500s. The speakers are digital and so is that card, unlike the Creative. :wink:


That is also wrong. The Creative X-Fi XtremeMusic has a digital output, so you can wire up the Z-5500s via S/PDIF.
August 16, 2006 11:33:44 AM

Hey Towley,
Just a suggestion, but why put ALLLL your stuff on one drive? By doing that you are setting yourself up to be ROYALLY SCREWED if you ever have an OS failure or whatever. I would have 2 drives: one for the OS and the other for backup.

For instance (and you will probably get a MILLION different ways to do this), I have 3 drives:
1. a 74GB Raptor SATA 10K 16MB cache for my apps and OS
2. a Maxtor SATA II 250GB 7200rpm for MP3s, games, movies
3. a Maxtor SATA II 250GB 7200rpm for "MY DOCUMENTS" backups only. And THIS DRIVE I leave "unplugged". The reason being that if my machine is ever hacked, trojaned, or virused, I just wipe out my main drive, reformat, and RE-INSTALL my docs from the "UNPLUGGED" drive, and I'm back in bizz BABY :wink:

Now you may not need 3 drives, but you can CERTAINLY benefit from having 2. Just a thought.
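
For what it's worth, the whole scheme really boils down to one copy you run whenever the backup drive is plugged in. A rough sketch (Python; the drive letters and folder names are placeholders, not anyone's real layout):

```python
# Sketch of the two-drive backup idea: snapshot "My Documents" from the
# OS drive onto a second drive that is otherwise left disconnected.
import os
import shutil
import time

SOURCE = r"C:\Documents and Settings\You\My Documents"  # placeholder path
BACKUP_ROOT = r"E:\backups"                             # placeholder path

def backup_documents():
    # Timestamped destination folder, so older snapshots survive a bad copy.
    stamp = time.strftime("%Y-%m-%d_%H%M%S")
    dest = os.path.join(BACKUP_ROOT, "MyDocuments_" + stamp)
    shutil.copytree(SOURCE, dest)
    print("Backed up to", dest)

if __name__ == "__main__":
    backup_documents()
```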

RIG specs
Antec P180 PerformanceSeries Mid-Tower Case
SeaSonic S12 600 watt power supply
Asus A8N32 SLI mobo AMD N-Force 4 SLIX16 (bios 1103 V02.58)
RealTek 97 onboard digital 5.1 Surround
AMD Athlon 64 X2 4800+ Toledo Core, 2 X 1mb L2 cache (AMD driver 1.3.1.0 w/MS hotfix & AMD Dual Core Optimizer)
2 gigs of Corsair TwinX3500LL Pro @ 437Mhz 2-3-2-6-1T
2- BFG 7900 GT OC 256mb in SLI (nvidia driver 91.31)
Western Digital RAPTOR 74.3 gig 10-K rpm HDD for XP & Apps
Maxtor SATA II 250 G HDD for gaming, movies, MP3's
Maxtor SATA II 250 G HDD for document backup (unplugged)
Sony CDrom 52X
Plextor 708-A DVD/CD rom
Logitech Z-5500 Digital 5.1 THX 500watts
August 16, 2006 1:19:54 PM

Hey, thanks for all the help, guys. I'm thinking I'll wait and see how the 1950 stacks up against the 7950 so I can run my LCD in all its native-rez glory. With Oblivion, every frame counts. I'll probably have to wait a few weeks anyway, because E6600s are damn hard to find in Canada atm.

I already have a 36GB Raptor in my current PC that I was thinking of popping into the new one. Also, I plan to add more drives in the future; I just can't afford to right now. I might also get an external one for portability/backups.
August 16, 2006 11:41:57 PM

Quote:
Don't get an XtremeMusic card. Get the Auzentech X-Plosion if you want the best experience from your Z-5500s. The speakers are digital and so is that card, unlike the Creative. :wink:


That is also wrong. The Creative X-Fi XtremeMusic has a digital output, so you can wire up the Z-5500s via S/PDIF.

The card may be called digital, but it is an analog source. The Auzentech encodes the signal into digital, and it is the best card for those specific speakers. Trust me, we've already had a couple of threads devoted to this. For someone who really knows, ask Choirbass; he knows a lot about audio.
August 17, 2006 10:09:10 AM

Hey Corvette,
So if the Auzentechs are an analogue source, are the Creative X-Fi cards "TRUE" digital sound cards? I never considered this before. Right now I am using my onboard sound, but if I do decide to get a TRUE digital card I want to get a good one for my Z-5500s as well.
Thanks in advance

EDIT: How would one KNOW if a card is a TRUE digital card or not? :?
August 17, 2006 11:22:48 AM

The Creative card will still use the Z-5500s to their full potential. It has a digital I/O that gives a true Dolby Digital/DTS bitstream out. The X-Fi also outputs stereo MP3s and PCM stereo over the digital interface. This is all that's important for the Z-5500s.

The only thing the Auzentechs do that the Creative can't is encode a source which isn't DD/DTS into one that is. This, in my opinion, is a bit of a gimmick, and won't benefit stereo signals like MP3s or audio CDs. Both the Creative and the Auzentech card are "TRUE" digital cards.
August 17, 2006 12:10:44 PM

Thanks for the info thorax, I appreciate it. :) 
August 17, 2006 12:12:12 PM

Quote:
1 x Thermaltake Armor VA8003SWA Silver ATX 11X5.25 6X3.5INT No PS W/ Window And 25CM Fan


This is an excellent case; I have a steel Armor without the 25cm fan and it is a beauty. But your rig has one HD and one optical, and the Armor is HUGE! If you don't plan to add a bunch of other stuff, you would be better served by something like the Armor Jr. or Lian Li 60B Plus 2. Both offer excellent airflow and loads of capacity for a gaming rig.
August 17, 2006 12:38:10 PM

It's quite funny how people say that anything above 60fps is a waste... what a load of rubbish! It depends on the game in question. For example, Oblivion is a real PC-buster and you'd have trouble getting decent fps anyway, but in games like first-person shooters, high fps makes a huge difference.

I currently play a lot of Call of Duty 2 and normally get a return of close to 200fps with my pc (ranges between 180 and 210) but in a scene with lots of smoke, the fps drops down to 80-90 fps sometimes and this really can lag. You feel and see the difference in gameplay instantly.

Back to the original post, though: the rig you've priced up sounds like a good deal. I'd stick with what you've priced up and look at changing the card next year if your budget won't stretch higher at the moment, but for smooth gameplay I feel you should aim a bit higher than the current card now if the cash is available.

I know people are saying wait until DX10 (lots of threads about that), but the situation is that you are playing a game now where you could do with that extra power, and by the time you've waited for DX10, you never know, you might have got bored with Oblivion and be playing something new. What I'm trying to say is, don't miss the chance to have great fun and enjoyment now rather than regret later that you never experienced Oblivion to the full.

Good luck with the build and I hope it works out for you.
August 17, 2006 12:54:35 PM

Quote:
Also, I don't drink :p 


I'm sorry :tongue:
August 17, 2006 5:38:26 PM

Thanks for the feedback, guys. I agree the Armor is excessive for my current config, but I want to be able to add more HDDs, optical drives, and cooling in the future. Plus, that 250mm side fan is sweet :)

To continue beating the dead horse: anything above a 60fps MINIMUM is a waste. Hell, anything above a 30fps MINIMUM is a waste. Your eye can't perceive the difference. This has been scientifically proven time and time again, and if you disagree you are simply ignoring basic physics.

Moreover, if you're using an LCD, chances are it has a refresh rate of 60Hz, so you won't be able to get framerates over 60 anyway unless you disable v-sync, which results in ugly frame tearing. But go on and think that 200fps makes your CoD experience better, if it helps you sleep at night.
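
Since we keep throwing numbers around, the arithmetic behind this whole debate fits in a few lines. A rough sketch (Python, just illustrative math, no claims about what your eye can see):

```python
# fps is just 1000 ms divided by the per-frame time, and v-sync clamps
# the displayed rate to the monitor's refresh rate.
def frame_time_ms(fps):
    return 1000.0 / fps

def displayed_fps(rendered_fps, refresh_hz, vsync=True):
    # With v-sync on, frames can't be shown faster than the panel refreshes.
    return min(rendered_fps, refresh_hz) if vsync else rendered_fps

for fps in (24, 30, 60, 200):
    print("%3d fps -> %5.1f ms per frame" % (fps, frame_time_ms(fps)))

print(displayed_fps(200, 60))               # -> 60: a 60Hz LCD shows 60 frames
print(displayed_fps(200, 60, vsync=False))  # -> 200 rendered, but with tearing
```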

I'm thinking about going for the 7950 to ensure smoothness in Oblivion until the DX10 monsters come out.

Now the question is: RAID 0 a couple of Seagates, or stick with 1? I hear the performance increase isn't really worth it.
August 17, 2006 8:18:21 PM

The human eye sees at something like 16Hz, which translates to about 16fps. This is why people say high frame rates are moot. Anything above 60 is silly. But if you can tell the difference, I'm not going to waste time arguing with you.
August 17, 2006 8:50:56 PM

Quote:
The human eye sees at something like 16Hz, which translates to about 16fps. This is why people say high frame rates are moot. Anything above 60 is silly. But if you can tell the difference, I'm not going to waste time arguing with you.


Thank you.
August 18, 2006 6:25:54 AM

Quote:
The human eye sees at something like 16Hz, which translates to about 16fps. This is why people say high frame rates are moot. Anything above 60 is silly. But if you can tell the difference, I'm not going to waste time arguing with you.


Thank you.

Maybe, but if you guys have the game CoD2, give me an email address and I'll send you a demo of me playing, if you like, and you will be able to see the effects. The effects have absolutely nothing to do with connection; it's pure fps lag.

Maybe in other games this won't matter but in CoD2 it has a massive effect.

Oh, and by the way, why do a lot of benchmarks quote fps rates for different cards and games if this doesn't make a difference... are they trying to blind us into thinking more is better? No, they are telling the truth.

If there is no difference in gameplay after 60fps, then we might as well all stay on the GF4 Ti series or something. I know visual quality has an effect on what card we buy, but fps is also a big issue.
August 18, 2006 6:59:43 AM

Quote:
I currently play a lot of Call of Duty 2 and normally get a return of close to 200fps with my pc (ranges between 180 and 210) but in a scene with lots of smoke, the fps drops down to 80-90 fps sometimes and this really can lag. You feel and see the difference in gameplay instantly.

Of course it's going to seem to lag; that's a 100fps drop. But if you were constantly playing at 80-90fps it wouldn't seem like there was any lag at all. BTW, that's a pretty nice framerate for CoD2, what are your specs?
August 18, 2006 7:03:49 AM

Quote:
I currently play a lot of Call of Duty 2 and normally get a return of close to 200fps with my pc (ranges between 180 and 210) but in a scene with lots of smoke, the fps drops down to 80-90 fps sometimes and this really can lag. You feel and see the difference in gameplay instantly.

Of course it's going to seem to lag; that's a 100fps drop. But if you were constantly playing at 80-90fps it wouldn't seem like there was any lag at all. BTW, that's a pretty nice framerate for CoD2, what are your specs?

I suppose you are right, but what the others are trying to say is that there is no difference, when there 100% is a massive difference in gameplay.

PC specs are in the signature, BTW, but I'll add that I have tweaked and tuned the system to its full potential, plus chose the option in CoD2 to play in DX7 mode rather than DX9 (massive increase in speed but not as nice to look at).
August 18, 2006 12:55:44 PM

After reading an article on www.100fps.com, it's a bit easier to understand how the human eye perceives motion. The whole "you don't need anything above 20fps" debate will still go on, but it appears that the human eye is capable of capturing very quick changes (about 1/500th of a second), and that video hardware is designed to convey information less frequently while giving the illusion of smooth motion. This is what interlaced video is all about, and it's the method used to make a relatively slow frame rate (25fps for PAL) look smooth.

The downside of modern PC monitors is that they convey video in a progressive format, which, although it looks much better quality, will look stuttery if the frame rate drops below 50fps. I'm not sure where microgiant got the 16fps figure about the human eye; whether it's right or wrong, playing a game at 16fps will look stuttery, a glorified slideshow as THG would put it. This obviously does not matter if you're playing slow strategy games with little movement, but for shooters and racing games the frame rate has a huge effect on how smooth the play is, and whether you win or lose in some cases.
The optimum frame rate will be the refresh rate of your monitor.
August 18, 2006 6:33:48 PM

Quote:
Oh, and by the way, why do a lot of benchmarks quote fps rates for different cards and games if this doesn't make a difference... are they trying to blind us into thinking more is better? No, they are telling the truth.

If there is no difference in gameplay after 60fps, then we might as well all stay on the GF4 Ti series or something. I know visual quality has an effect on what card we buy, but fps is also a big issue.


Not quite right, I'm afraid.

As I said before, benchmarks show you what your hardware is CAPABLE of. Just because you can get 200fps in Quake 4 doesn't mean you'll benefit from more than 30. It just shows that you might have a chance at playable framerates when Quake 5 comes out.

Oh, and good luck playing Oblivion/CoD/HL2 on a GeForce 4... moron.
August 18, 2006 7:58:13 PM

Quote:

I currently play a lot of Call of Duty 2 and normally get a return of close to 200fps with my pc (ranges between 180 and 210) but in a scene with lots of smoke, the fps drops down to 80-90 fps sometimes and this really can lag. You feel and see the difference in gameplay instantly.


You do NOT get anywhere close to 200fps. You can now stop stroking your e-wang. We know you are a bald-faced liar.

Please, anyone who believes the load of rubbish this guy is dishing out: you need your head examined.
August 20, 2006 7:22:49 AM

Quote:
chose the option in CoD2 to play in DX7 mode rather than DX9 (massive increase in speed but not as nice to look at)

I did that on my 9800 Pro system and I can run full graphics and still get around 40fps. As soon as I hit DX9 rendering, that was it; no amount of graphics/resolution dropping could get me above 25fps standing still :( And yeah, it does look pretty bad in DX7, but I'm more of a performance person than a quality person.
August 21, 2006 8:09:52 AM

Quote:

I currently play a lot of Call of Duty 2 and normally get a return of close to 200fps with my pc (ranges between 180 and 210) but in a scene with lots of smoke, the fps drops down to 80-90 fps sometimes and this really can lag. You feel and see the difference in gameplay instantly.


You do NOT get anywhere close to 200fps. You can now stop stroking your e-wang. We know you are a bald-faced liar.

Please, anyone who believes the load of rubbish this guy is dishing out: you need your head examined.

Really, buddy? Well, do you want some screenshots???

Before spouting off your comments like a little kid, maybe you should have read my post a bit more carefully. CoD2 can be run in 2 different modes, with the majority of clan players like myself opting to play the game in DX7, which makes the game run faster and makes you more competitive, putting my fps at about 200 average.

So go home and grab your teddies, pal, because it looks like you'll be throwing them out of ya cot pretty soon!!

Oh, just seen that you're in tech support... well, you guys often get things wrong and then have to rely on people in my job as an IT systems manager to sort ya shite out!