Strange performance increase!!!

Guest
The following system experienced a performance increase from around 60 fps to 110 fps under the following conditions.

Duron 650 (o/c 900 MHz, 1.88 V, ~45 °C)
Asus A7V, PC133 7.5 ns CL2-2-2, Maxtor 15 GB ATA100 7200 rpm
Creative Annihilator Pro (GeForce DDR 300 MHz, o/c 330 MHz)
Wait for Sync OFF, render ahead 7 frames.
Blaster Control cache RAM 32 MB (KEY to the performance increase)
Performance did not increase if the cache was 63 MB or 15 MB.
Creative driver version 4.23
1280x1024, 16-bit, all settings high

Surprisingly, Nvidia's Detonator 3 driver does not produce as large a performance increase; it only went from 60 fps to 80 fps, no matter what the PCI texture buffer was set to.

It appears that Nvidia's PCI texture buffer is not the same thing as Creative's cache RAM?

The game used was Unreal Tournament.

Best regards
cx5
 

smn198

The reason for that is Wait for Sync being OFF.

What normally happens (when it is on) is that the graphics card only updates at the same speed as the screen refreshes (60 Hz, so 60 fps). For benchmarking purposes you can turn sync off to show how fast the card could go, but you will not see any difference on screen.
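Here is a minimal sketch of that idea (just illustrative Python; the 60 Hz refresh and 5 ms render time are assumed numbers, not measurements from any real card):

REFRESH_HZ = 60              # assumed monitor refresh rate
RENDER_TIME = 0.005          # assumed time the GPU needs per frame, in seconds

def measured_fps(wait_for_sync):
    # With sync on, a finished frame still waits for the next monitor refresh,
    # so the effective frame time can never be shorter than the refresh interval.
    refresh_interval = 1.0 / REFRESH_HZ
    frame_time = max(RENDER_TIME, refresh_interval) if wait_for_sync else RENDER_TIME
    return 1.0 / frame_time

print(measured_fps(True))    # ~60: the benchmark is pinned to the refresh rate
print(measured_fps(False))   # 200: whatever the card can actually manage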

Go back to your original setup but keep Wait for Sync OFF and see how much difference the other changes actually made.
 
Guest
Wait, wait. Do you mean that the incredible 150+ fps quoted in the reviews cannot be seen by the gamer? BTW, are there monitors that can refresh at 150Hz even at the lowest resolution?

Then what is the point of having frame rates higher than what can be supported by monitors -- besides bragging rights? I'm not trying to start a flame war. I'm just trying to determine the sweet spot of current graphics cards so that my hard-earned dollar is spent on something that actually makes a difference that I can see.

OK, perhaps one can argue that a card that can run today's games at 150+ fps will be able to run next year's games at 80 fps. But one can also argue that it's better to spend one's money on a card that's one or two iterations behind the bleeding edge and wait a year to buy today's bleeding-edge card at a huge discount. It may even turn out that today's not-so-leading-edge card can run next year's games just fine. Then you have just saved yourself $150.
 

smn198

"Wait, wait. Do you mean that the incredible 150+ fps quoted in the reviews cannot be seen by the gamer?"

As far as I am aware, when you play a game the refresh rate of the monitor is 60fps (at least this is my experience). If you have a monitor which displays the refresh rate, try it out. If this is the case then anything above 60 fps is discarded.


"Then what is the point of having frame rates higher than what can be supported by monitors"

For 2D: less screen flicker. In games you don't notice it so much, as they are generally darker than the white background in apps such as Word or IE.

It's always interesting trying to work out whether it is worth paying for a piece of hardware which can't be fully used by current software; just look at the P4 controversy.
 

JoeHead

OK, all you guys worried about the refresh rate of the monitor: you had better go and have a talk with your maker. He only made our eyes capable of seeing at approximately 60 Hz.

So let's "see" now: anything above that is worthless. What it does mean is that you can run at higher resolutions with more eye candy. Oh, and yes, tomorrow's games will run well (well, if compatible).

So forget about the stinking monitor but TALK WITH THE MAKER!!!

Fragg at will!!!
 

Kodiak

I think you're mixing up frame rates and refresh rates...
High refresh rates are better for the eyes, regardless of frame rates. If you have a frame rate of 5 fps and an 85 Hz refresh rate, your eyes will be better off (because the nice, high refresh rate on the monitor is consistently renewing the picture on the screen), but your game scene will seem choppy, since it will only *change* 5 times per second (your video card is only calculating a new frame 5 times per second).
So, higher refresh rates are always better (until they get ridiculous); frame rates are only relevant up to the level of the refresh rate.

If your frame rate is higher than the refresh rate, it doesn't matter by how much; all the rest is discarded -- i.e. 65 fps and 150 fps will look the same on a 60 Hz monitor...
but 50 fps will flow better than 20 fps on a 60 Hz monitor (although they are arguably more or less equally good/bad for the eyes, since the refresh rate is the same).
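A back-of-the-envelope sketch of the same point (plain Python, example numbers only):

def frames_shown_per_second(render_fps, refresh_hz=60):
    # The monitor can only show one new image per refresh, so any frames
    # rendered beyond the refresh rate never make it to the screen.
    return min(render_fps, refresh_hz)

for fps in (20, 50, 65, 150):
    print(fps, "->", frames_shown_per_second(fps))
# 20 -> 20, 50 -> 50, 65 -> 60, 150 -> 60 (65 and 150 look identical at 60 Hz)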
Hope that helps and doesn't confuse further...
 
Guest
Remember one thing though: 100 fps might make a difference on a 60 Hz monitor. Most reviews show average frame rates, so the game might sometimes drop to 50 fps or lower. So an average frame rate of 100 or more isn't completely worthless.
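To make the average-versus-minimum idea concrete (the per-second readings below are made up, purely for illustration):

samples = [140, 130, 120, 110, 100, 90, 55, 45]   # hypothetical per-second fps readings

average_fps = sum(samples) / len(samples)
minimum_fps = min(samples)

print(round(average_fps), minimum_fps)
# roughly 99 fps on average, but a 45 fps worst case during the heaviest scene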

Just thought I'd remind you.
//warhawk
 

Kodiak

True enough... I was talking conceptually (i.e. a true 100 fps rather than an average 100 fps), but obviously, having fps > Hz at any given time is ideal :)
 

smn198

Whoops. I used fps instead of Hz

"rate of the monitor is 60fps"
should have been "rate of the monitor is 60Hz"

What I meant is that in a 3D game, in my experience, the monitor refreshes 60 times per second regardless of the fps supplied by the graphics card. If you disable synchronisation between the graphics card and the monitor and increase the fps above 60, the extra frames are simply not displayed. Therefore anything above 60 fps is purely academic.
 
Guest
I can't stand any refresh rate below 75. Look at a white screen at 60 Hz. Drives me nuts. Some people recommend 85 Hz. Also remember that a game may have a high frame rate when not much is happening, but throw a bunch of polygons (gibs, rockets, players, etc.) on the screen and watch it bog down. I hate having the frame rate go into the toilet when everything is going nuts onscreen.

Welcome back my friends to the show that never ends...
 
Guest
Well, you are absolutely right....
I hate to see my monitor at a 60 Hz refresh rate.
It really strains my eyes, and it's true that it's better to use the 85 Hz refresh rate.
 
Guest
As someone has pointed out, true fps is better than average fps.

So my 110 fps runs all over the place as the game progresses, between 60 fps and 140 fps.

Again, you want really high game fps because when everything you are playing against runs at you and shoots everything they've got at you, that is a lot of graphics action, and that is when your high fps comes to save you: the screen doesn't jerk, so you can see everything and respond by running away. Otherwise the screen jerks, you can't see what's going on, and you DIE :(

Best regards
cx5
 

toonces

Of course, we ought not forget that the human eye can only see/recognize around 32-35 fps.

booyah, grandma, booyah..
 
Guest
Actually, to state that better you would need to say that motion looks smooth at about 32-35 fps.

Also, to all of you wondering why 60 Hz is a strain on your eyes...
It's because of the biology of the eyeball. The eyeball contains different cells that enable us to see, namely the rods and the cones. Cones allow us to see in color, and rods only see in black and white. The reason we have cones is obvious, and interestingly enough they are concentrated near the center of the eye; rods you will find more at the edges inside the eye. What's the reason? The black and white rods can sense movement much more easily than cones. It all goes back to when we were hunters and gatherers: the rods at the edges of our eyes are very useful for seeing movement in our peripheral vision, hence a greater chance of survival. When looking straight at your screen, 60 Hz will not seem as bad as when you look at it out of the corner of your eye. Go ahead, try it! You'll see what I mean.

Strange but true. When you get up into the 75 Hz or 85 Hz range you are exceeding the "refresh rate" of the rod cells in your eyes and hence no longer perceive the screen refresh. Did you know that dogs only see in black and white because all they have are rod cells? You have to admit that would be useful for hunting in the dark and chasing all those squirrels like my dog does.

- Every private citizen has a public responsibility
 

toonces

Yes, but I also meant to say what I did: your eye can't tell the difference between 40 fps and 140 fps. It all looks like smooth motion to the normal eye.

booyah, grandma, booyah..
 

JoeHead

Sorry guys, but your eyes see approximately 60 fps. And yes, it is best for your eyes to have your monitor at 75 Hz or better.

So, 60 fps or 300 fps, your eyes don't care. I hate to go below 60. Can't stand any jerky movement, especially in FPS games. Frag or be fragged. Can't be engaging in combat with the jitters!!!!

Fragg at will!!!
 
Guest
Yes, try looking at the monitor at 100 Hz for about 5 minutes, then switch to 60 Hz, and you will know the difference.

What do you think? :wink:
 

toonces

I just looked across the net for a real "scientific" answer about the recognizable fps number. Most of the glossaries that include "fps" state that the highest recognizable rate is in the low 30s; other 3D gaming sites give various answers from 30 up to 75. What I did seem to notice is that the lowest fps needed to be recognized as motion is 25-ish, which is the fps of movie theatre projectors. I've also noticed that some sites stated that you get "motion blur" at fps over 35-ish.

To me that would explain why you say you can "see" a difference between 30 and 60 fps, but "they" say that you can't "recognize" over 32-35 fps. You can see a blur, and that blur may make the scene look more realistic in your perception at 60 fps, but you can't recognize what is being blurred. When you drive down the highway you can "see" everything, but you don't "recognize" everything you see, because most of it is lost in motion blur. For example, say you're driving and you close one eye; this eliminates the blurring that comes from focusing at a certain distance. Then you look down the road as you drive. The things far away are mostly crisp and "in focus", and the things closer and closer are more and more blurred. This is because your brain blends your perceptions/images into a blurred continuum, aka motion blur.

This is why I take issue with the yahoos who talk about doing 180-degree spins and complain that a video card which puts out 30 fps can't render the whole spin. Guess what: you couldn't tell what was on the screen even if your card could render everything at 90 fps. As a demonstration, look over your right shoulder and then, in the span of 1 second, turn your head 180 degrees to your left shoulder. I'll tell you what you saw: nothing but a big blur. That's what you'll see on your 90 fps turn as well.
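The rough arithmetic behind that spin example, if anyone wants to play with the numbers (illustrative only, assuming a 180-degree turn spread evenly over one second):

spin_degrees = 180
spin_seconds = 1.0

for fps in (30, 90):
    frames = int(fps * spin_seconds)
    print(fps, "fps:", frames, "frames,", spin_degrees / frames, "degrees per frame")
# 30 fps: 30 frames, 6.0 degrees per frame
# 90 fps: 90 frames, 2.0 degrees per frame; either way each frame is a fast-moving smear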

My final comment on this whole issue: the only legitimate reason to look at fps is to determine whether or not you're happy with the way your system is operating.

booyah, grandma, booyah..