CPU vs FPS

January 5, 2012 2:20:22 AM

I just read a post somewhere where this guy claims that running a game at over 60 or 75 fps is a "waste of CPU power" and by "power" I am assuming he is talking about CPU horsepower.

This doesn't sound right to me; correct me if I am wrong. But to me it would seem like a game is going to run as fast as it can, all things being equal, based on your CPU speed, and anything past that is down to the video card. I understand that anything over 60 or 75 fps isn't really going to make a noticeable difference in gameplay, but is running a game at 100 fps going to cause other parts of the program to be starved for CPU cycles more so than running it at 60 or 75 fps? I've never seen or heard of a game running slower at 100 fps than at 60, 75 or even 30 fps, but if it did I would assume it has something to do with a poor vsync implementation or bad coding. Anyone, please throw in your 2 cents.
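To picture what I'm asking, here's a toy game loop (a rough Python sketch; all the names are made up). Every frame is one more trip through the whole loop:

import time

def update_simulation(dt):
    ...  # AI, physics, input: CPU work that runs once per frame

def submit_draw_calls():
    ...  # CPU-side rendering work handed off to the GPU driver

# Uncapped, the loop iterates as fast as the CPU and GPU allow.
# Rendering at 100 fps means 100 passes per second through both
# functions, so it really does use more CPU time than 60 fps.
previous = time.perf_counter()
while True:
    now = time.perf_counter()
    update_simulation(now - previous)
    submit_draw_calls()
    previous = now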


January 5, 2012 2:35:35 AM

PAL is 25 fps... NTSC is 29.97 fps.

Lossless HD is considered 60 fps.

Above 30 I really can't tell the difference. I miss Halo 1: you locked the frames in at 30 and you knew you had all that extra power for the detailed scenes, instead of a dark scene running at 100 fps and a detailed scene dropping to 25.
January 5, 2012 2:52:48 AM

Thanks for the reply, but I'm not really sure that answers the question. The main question is: does running things at higher FPS use MORE CPU power than running them at lower FPS? I know it probably seems like an incredibly dumb question; I'm just looking for a sanity check here. I mean, the amount of CPU time allocated for rendering (most of which I would figure is offloaded to a modern GPU) isn't going to change based on your FPS, is it? And would it actually rob horsepower from the rest of the program?
January 5, 2012 3:14:07 AM

I think it will use more CPU power, yes. This isn't a bad thing, though. The CPU will handle its duties relevant to the AI, targeting, movement and whatever with a higher priority than it'll give to directing rendering. FPS is generally a measurement of how "fast" a game runs - 100fps can't run "slower" than 60fps.
Incidentally, your monitor probably can't show more than 60fps, so anything over that could be considered a waste. You might as well just turn on vsync.
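A software frame cap does roughly the same thing from the CPU's point of view; here's a minimal Python sketch (hypothetical, a real game would just use the driver's vsync):

import time

REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ   # about 16.7 ms per frame at 60 Hz

def run_capped(update, render):
    # Cap the loop at the refresh rate. The sleep is what turns
    # "wasted" extra frames back into idle CPU time for other tasks.
    while True:
        start = time.perf_counter()
        update()
        render()
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)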
January 5, 2012 3:30:35 AM

Anything beyond the refresh rate (60 Hz = 60 fps, 120 Hz = 120 fps) is completely wasted. A 60 Hz display has an absolute maximum display capability of 60 frames per second. It completely disregards every frame beyond that.
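Here's a toy way to check that in Python, assuming the simplest model where the display just shows whatever frame is newest at each refresh tick:

# Simulate one second: 100 rendered frames, 60 refresh ticks.
RENDER_FPS, REFRESH_HZ = 100, 60

frame_times = [i / RENDER_FPS for i in range(RENDER_FPS)]
tick_times = [i / REFRESH_HZ for i in range(REFRESH_HZ)]

# At each tick the display scans out the newest finished frame;
# any frame that is never the newest at a tick is simply dropped.
shown = {max(i for i, f in enumerate(frame_times) if f <= t)
         for t in tick_times}

print(len(frame_times), "rendered,", len(shown), "displayed")
# 100 rendered, 60 displayed: the other 40 are never seen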
January 5, 2012 3:32:13 AM

^^ Unless you have a 240 Hz CRT :D 
January 5, 2012 3:39:41 AM

NTSC = 29.97 frames per second.

Anyway, to try to answer the question at hand...

Most games are limited by the GPU, not the CPU. Yes, having a faster CPU for a GPU-intensive game can help improve FPS by a few frames (single digits), depending on the game. Hardcore gamers want FPS as high as possible so that the minimum FPS does not dip too low. The bigger the difference between the average frame rate and the lowest frame rate, the more obvious the stuttering can be.
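You can see why the average hides this with some made-up frame times in Python:

# Made-up frame times in ms: mostly smooth with a few long hitches.
frame_times_ms = [10] * 95 + [40] * 5

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)

print(round(avg_fps), "fps average,", round(worst_fps), "fps at the worst frame")
# 87 fps average, 25 fps at the worst frame: the average looks
# great, but those 40 ms hitches are what you feel as stutter.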

The human eye can see more frames per second than what a movie or TV show typically displays. Film costs money, so the more frames used per second, the more film you need to buy, and that increases costs. I would say that the average person can see up to 60 FPS. However, in a life-and-death situation, when you have massive amounts of adrenaline running through you, you can see more than that; people have reported experiencing "bullet time" when their lives flash in front of them.

The CPU will not be starving at a high frame rate, because the GPU is doing all the graphics processing. If the CPU is slow enough to bottleneck the GPU, then that means there is a lot of "background" game processing to be done before the GPU can process more frames. Kinda like pairing up a Pentium D 940 with a Radeon HD 6990: in this case the Pentium D 940 is so slow that it is "robbing" your Radeon HD 6990 of some performance.
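As a rough model (ignoring buffering and any overlap between the two chips), a frame is only finished when the slower of the CPU and GPU is done with it, which is all a "bottleneck" really means:

def fps(cpu_ms, gpu_ms):
    # A frame is done only once both the CPU's game work and
    # the GPU's rendering work for that frame are done.
    return 1000 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=20, gpu_ms=8))   # 50.0: a slow CPU caps a fast GPU
print(fps(cpu_ms=5, gpu_ms=8))    # 125.0: now the GPU is the limit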

I mentioned earlier that a faster CPU can improve FPS in a GPU-intensive game, but that is not always the case. If your current CPU is fast enough to process all the "background" game processes before the GPU has to slow down and wait for the CPU, then a faster CPU will not improve your overall performance by much, if at all.

In the end, if you are building a PC only for games, then you want to balance your CPU with your GPU so that neither one is "robbing" performance from the other.

Me? I prefer getting the most powerful CPU I can afford, 'cause I encode my movies and video encoding requires a lot of processing power. I play games, but I don't consider myself a hardcore gamer. If I had the money I would get a 6-core Sandy Bridge-E i7-3930K (overkill for games, since very few games can use 4 cores, and those that can use that 4th core do not use it efficiently) and simply continue to use my Radeon HD 5850.
January 5, 2012 3:53:15 AM

Thanks, though maybe a bit too much information, as I pretty much know all this. The main point, which you seem to have somewhat affirmed, is that a game running at 30 FPS isn't going to be any less CPU-intensive than a game running at 100 FPS. I'll quote the actual statement that was made.

"And you don;t need 100FPS anyway: your eye cant see more than 30, so if you get anyting over 40, use vertSync to lock FPS to evrtical refresh rate (max 60 or 75), anything higher than that would be a waste of CPU power

And anything over 30FPS is goofd. If the game still stutters it isnt n FPS problem, you have to look at your video card settings"

To which I replied.
"Anything over 60 or 75 fps isn't going to be a waste of CPU power. The game will run as fast as it will based on your CPU frequency. Past that FPS is only limited by your video card. Running the game at 100fps isn't going make your game run slower then at 30fps."

Now would you say that was an erroneous statement I made?
January 5, 2012 4:01:25 AM

sykozis said:
Anything beyond the refresh rate (60 Hz = 60 fps, 120 Hz = 120 fps) is completely wasted. A 60 Hz display has an absolute maximum display capability of 60 frames per second. It completely disregards every frame beyond that.


Not quite, m8. The gfx card will throw out 60+ fps and cause tearing, where partial frames are displayed instead of full ones.
For a PC game, 60 fps is all you need in real terms to fool the eye into thinking it's seeing sharp, smooth animation. If you have less than 60 fps, you have to rely on the gfx card's ability to introduce blur, again fooling the eye into thinking it's seeing smooth animation...
Next time you watch a normal PAL/NTSC movie, look at it frame by frame and you will see 90 percent of the frames are blurred. Your eye doesn't notice this because of the way your brain blends images it can't fully make out... again fooling you into thinking you're seeing smooth movement. If you take a closer look at how film is made, you will find that most movies are shot at 24 fps but PAL TV plays them back at 25. This increases the sharpness to the eye, again fooling the brain... and it is why film running times don't exactly match when you watch them on TV.
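That speed-up is easy to work out; a quick back-of-the-envelope in Python:

FILM_FPS, PAL_FPS = 24, 25

film_minutes = 120  # a two-hour movie
pal_minutes = film_minutes * FILM_FPS / PAL_FPS

print(pal_minutes)  # 115.2: about 4% shorter when shown on PAL TV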

120 Hz monitors are only useful if you want 3D, so the gfx card can produce 60 fps per eye (60x2=120) ;)  They are a waste of resources and money if you don't use 3D.
As far as I'm aware, there is no game that will run at 60 fps on the CPU alone; you need a GPU... Maybe he was using "CPU" in the wrong context to mean the whole computer, which is both wrong and confusing...
CPU stands for central processing unit, not a whole computer.


Having more than 60 fps isn't a waste and doesn't starve the rest of the PC of CPU cycles. If it does, it means you have a mismatched system where your GPU is overpowering your CPU; this is referred to as a bottleneck. We gamers try to avoid this at all costs... but once we get to 60 fps, we turn on vsync and purposely limit the PC to 60 fps.

M8, it sounds like the guy is one of these "why should I upgrade" guys... the kind of people who buy a PC and think it should last them 20 years and max out everything they throw at it...
The sad thing is they will say anything to keep on justifying it to themselves. Best thing is to say "OK m8" and go buy a system that would make 'em cry.
January 5, 2012 7:55:42 AM

I think it wastes electricity
January 5, 2012 8:43:42 AM

esrever said:
I think it wastes electricity


Not really sure about that, and if so I would say it's negligible. When gaming you usually want your power scheme set for maximum performance anyway, and the difference, if there even is one (which I doubt), would probably be less than 1% between 60 or 75 fps vs. 100 or even 30.

Thanks for all the comments, guys. Personally, I always try to hit 60 fps, ideally with vsync on. Anything below about 34 fps starts looking a bit choppy to me, so as long as my minimum stays above that, I'm happy. Perhaps this bloke meant GPU power or just plain power consumption, but whatever the case, I feel justified in my reply. Thanks for the input, much appreciated.
January 5, 2012 8:45:03 AM

Electricity is the only power you'd really waste, since you can't "waste" CPU cycles that would otherwise be sitting idle doing nothing.
January 5, 2012 10:10:44 AM

rfxcasey said:
I just read a post somewhere where this guy claims that running a game at over 60 or 75 fps is a "waste of CPU power" and by "power" I am assuming he is talking about CPU horsepower.

This doesn't sound right to me; correct me if I am wrong. But to me it would seem like a game is going to run as fast as it can, all things being equal, based on your CPU speed, and anything past that is down to the video card. I understand that anything over 60 or 75 fps isn't really going to make a noticeable difference in gameplay, but is running a game at 100 fps going to cause other parts of the program to be starved for CPU cycles more so than running it at 60 or 75 fps? I've never seen or heard of a game running slower at 100 fps than at 60, 75 or even 30 fps, but if it did I would assume it has something to do with a poor vsync implementation or bad coding. Anyone, please throw in your 2 cents.


I think the "power" the guy is referencing means wattage consumption.

Somewhat. Running a game above 60 fps uses more resources, and thus can cause a bit of slowdown in background apps when you have fewer resources available.
January 5, 2012 10:52:06 AM

Quote:
In the end, if you are building a PC only for games, then you want to balance your CPU with your GPU so that neither one is "robbing" performance from the other.


I HATE this argument, as it leaves no margin for future upgrading. The minute you slap a new GPU into the system, you now find the CPU is underpowered, and have to replace that too.

I always go CPU heavy, leaving myself room for 2-3 GPU upgrades before I junk the system as a whole.
January 5, 2012 4:29:21 PM

gamerk316 said:
Quote:
In the end, if you are building a PC only for games, then you want to balance your CPU with your GPU so that neither one is "robbing" performance from the other.


I HATE this argument, as it leaves no margin for future upgrading. The minute you slap a new GPU into the system, you now find the CPU is underpowered, and have to replace that too.

I always go CPU heavy, leaving myself room for 2-3 GPU upgrades before I junk the system as a whole.



Oh, well they have fixed this now with CPU integrated graphics. :sarcastic: 
January 5, 2012 6:01:46 PM

+1. Some people just want a system for a couple of years, though (college, for instance), and want the best possible performance they can get for their money right now.