Why 30fps looks better on console than on PC

January 7, 2012 8:31:55 PM

Hi everybody. So now we have programs to cap fps (Dxtory, Bandicam, and D3D9 Antilag v1.01 can all do this), but it seems that the same games on console at, say, 30fps look much smoother, with the individual frames far less noticeable when you turn the camera in game. Why is this happening, and how can I make it smoother on a PC?
For example, I can run most games on my HD 6870 at 40-60 FPS, but I don't like the ups and downs, so if I choose to cap my frames at 30-35fps, most games look horrible even though my fps is stable and never dips below the cap; yet 30fps on a console looks much better. Why is there such a big difference in the quality of 30fps on console vs. PC, even when the PC has the power to run the game at 60fps but capping it at 30 makes it look bad?
Another example: BF3 at 30fps on a console looks good, while at 30fps on a PC you can notice right away that 30fps is not enough.

January 7, 2012 9:11:59 PM

You usually sit closer to your monitor than to a television; console games are built from the ground up to look good at 30 frames (with creative use of elements like motion blur); controllers are less precise than a mouse and keyboard and don't give you the same feeling of connection, so you effectively dull one of your sensory avenues for estimating how smoothly or how fast things are happening on the screen; and the 30 fps never wavers or chugs. It is running as designed.

For all of those reasons and many more that I'm sure will get posted in this thread, you are more accepting of the 30 fps in most console games. However, if you suddenly ran a console game at 60 fps on a computer monitor that you were sitting less than 2 feet away from, it would look noticeably smoother than the same game running at its more likely 30 fps.

If you ran a pc game on a tv at 30 fps and sat 7 or 8 feet away playing it with a controller, you might not notice how crappy it is performing, despite the fact that it absolutely would be performing crappily.

In the same vein, film usually runs at about 24 fps, and it looks smooth when you watch it, but part of that has to do with the fact that you don't expect anything beyond what you're getting and have never seen anything beyond it in that medium.
January 8, 2012 12:13:11 AM

You really think it looks smoother? That's probably down to the console outputting at 30Hz, the slower response time of a typical TV (often around 8-14ms) syncing with the frames, and their overly generous use of blur. All of this helps fool the brain...
But as I'm a PC gamer I notice instantly that the consoles aren't smoother; on games like Call of Duty you really can see the difference, especially if you rotate fast... it looks like a slideshow to me.
January 8, 2012 12:31:23 AM

casualcolors said:
You usually sit closer to your monitor than to a television; console games are built from the ground up to look good at 30 frames (with creative use of elements like motion blur); controllers are less precise than a mouse and keyboard and don't give you the same feeling of connection, so you effectively dull one of your sensory avenues for estimating how smoothly or how fast things are happening on the screen; and the 30 fps never wavers or chugs. It is running as designed.

For all of those reasons and many more that I'm sure will get posted in this thread, you are more accepting of the 30 fps in most console games. However, if you suddenly ran a console game at 60 fps on a computer monitor that you were sitting less than 2 feet away from, it would look noticeably smoother than the same game running at its more likely 30 fps.

If you ran a pc game on a tv at 30 fps and sat 7 or 8 feet away playing it with a controller, you might not notice how crappy it is performing, despite the fact that it absolutely would be performing crappily.

In the same vein, film usually runs at about 24 fps, and it looks smooth when you watch it, but part of that has to do with the fact that you don't expect anything beyond what you're getting and have never seen anything beyond it in that medium.
Actually, I had a console for about 2 years and played games on the same 24-inch TV from the same distance I'm playing at now on my PC, so distance doesn't matter much; it's something else. Even now on PC I play most games with a gamepad if the game supports it, not keyboard and mouse, and there is still a huge difference between 30fps on PC and on console when I slowly turn the camera with the gamepad. I have experimented with many games at 30fps on PC, and in some you can see the individual frames very clearly. For example, cap Skyrim at 30fps and you will see how bad it looks compared to console, no matter the distance from the TV, and even if there is no action, no trees on screen, or any other GPU-demanding thing, those frames are very noticeable unless you run the game at 45fps or more. So I know you said that console games are built to run well at 30fps, but I still wonder why there is such a big difference in those 30fps, PC vs. console; it is the same 30fps, but the quality is very different. I don't think distance or mouse and keyboard are the biggest factors here; it's something more to do with the hardware. Movies and fps are a different story, but thanks anyway for the effort.
January 8, 2012 12:37:56 AM

HEXiT said:
You really think it looks smoother? That's probably down to the console outputting at 30Hz, the slower response time of a typical TV (often around 8-14ms) syncing with the frames, and their overly generous use of blur. All of this helps fool the brain...
But as I'm a PC gamer I notice instantly that the consoles aren't smoother; on games like Call of Duty you really can see the difference, especially if you rotate fast... it looks like a slideshow to me.
Yes, I notice that blurry screen very well; when I jumped from Crysis 2 on PC to Crysis 2 on console, the console version was awful. But a console like the Xbox 360 supports 1080p, which means games are output at 60Hz but capped at 30fps, if I'm not wrong.
January 8, 2012 12:39:27 AM

Some of the key points that Hexit and I were referring to, in addition to what you addressed, are the tools used to trick your eyes and your mind, like motion blur. It creates the illusion of smooth gameplay, even though motion blur is something competitive gamers turn off immediately in multiplayer because it impedes precision.
January 8, 2012 12:53:04 AM

casualcolors said:
Some of the key points that Hexit and I were referring to, in addition to what you addressed, are the tools used to trick your eyes and your mind, like motion blur. It creates the illusion of smooth gameplay, even though motion blur is something competitive gamers turn off immediately in multiplayer because it impedes precision.
Well, then I guess that blurring trick fooled me.
January 8, 2012 2:44:46 AM

laimis911 said:
Yes, I notice that blurry screen very well; when I jumped from Crysis 2 on PC to Crysis 2 on console, the console version was awful. But a console like the Xbox 360 supports 1080p, which means games are output at 60Hz but capped at 30fps, if I'm not wrong.

Sorry to tell you, m8: the 360 supports 1080i for gaming, not 1080p. 1080p is for video output only; it's a limit of the graphics chip used, which is a proprietary version of the ATI X1950 called Xenos.
http://hardware.teamxbox.com/articles/xbox/1144/The-Xbo...
Quote:

All games supported at 16:9, 720p and 1080i, anti-aliasing

Standard definition and high definition video output supported

It's a common mistake, so don't lose sleep over it ;)
October 31, 2012 12:15:32 PM

Console games are laggy as anything. Check out Need for Speed Most Wanted 2013 on PC, then play it on console, or even just watch an Xbox gameplay video on YouTube (I did both); the PC version is far superior, and I have a lower-end card than you, lol. All I can suggest is that you must have an issue with your hardware or setup.

Also, console games run with much less detail, a narrower field of view, and no eye candy, no vertical sync, etc., so of course they aren't going to struggle at 30 fps, whereas a PC can play games way over 30 fps with all the gubbins (eye candy).
Check out Borderlands 2 on console and then on PC. Look at the sky: on console there are no stars, just a dull grey sky, but on PC the sky is full of stars, and the textures on the console version are lame.
Watch this and laugh at the console versions compared to the PC version:
http://n4g.com/news/1084004/borderlands-2-graphic-compa...
October 31, 2012 1:37:17 PM

One thing I'd like to add: I'm not sure if this is still true, but back in the days of the PS1, console games actually relied on the TV screen to blur the image slightly to make it look smoother. My friend and I figured this out by connecting a PS1 to a computer monitor and then to a TV and comparing; the difference was impressive.

Today, it's probably not so much the case anymore with HD TVs and such.
December 8, 2012 3:44:26 PM

The only reason it looks so much smoother on a console is that consoles lock games at an fps they can always sustain, i.e. 30fps. On a PC it may jump from 30 to 45 to 27, etc. It will only feel smoother if it consistently holds the same fps. You could always try lowering the settings and adding a frame limit (depending on the game).
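For anyone curious what a frame limiter actually does under the hood, here is a minimal Python sketch of the idea; the 30fps target and the fake render work are placeholder assumptions, and tools like Dxtory or an in-game cap achieve the same effect by delaying when each frame is presented:

import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # 33.3 ms per frame at 30fps

def render_frame():
    # Stand-in for the game's real update + render work.
    time.sleep(0.010)  # pretend a frame takes 10 ms to draw

def run_capped(num_frames=90):
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        deadline += FRAME_BUDGET
        # Sleep off whatever is left of the budget so frames come out on a
        # fixed schedule instead of as fast as possible; a steady cadence
        # is what makes a capped 30fps feel even.
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            deadline = time.perf_counter()  # missed the deadline, start fresh

if __name__ == "__main__":
    start = time.perf_counter()
    run_capped()
    print(f"average fps: {90 / (time.perf_counter() - start):.1f}")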
December 8, 2012 11:52:40 PM

I agree that a console at 30FPS looks smoother than a PC. My console is plugged into my PC monitor. When I get 30FPS in one of my PC games it looks really choppy, but the console looks fairly smooth at all times. I guess it must be the less precise controller.
December 9, 2012 5:13:12 AM

Are you running the console on a TV or a monitor? If it's a TV, that's probably why your console has more fluid 30fps than the PC. I don't know the technical terms, but TVs tend to smooth out the picture to eliminate stuttering and blur. I know my Samsung has a feature called Movie Plus that adds extra frames to reduce blur and stuttering, and many TVs have this feature enabled by default from the manufacturer. Also, you should realize that consoles render much less detail than a PC, so although both are running at 30fps, the PC's frame timings and latency might lag behind while the console's stay fully in sync with the fps. Try lowering the settings to match the console and then run them both at 30fps; they will end up looking much the same.
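To put the frame-timing point in concrete terms, here's a toy comparison; both sets of numbers are invented for illustration, not measurements from any real console or PC:

# Two made-up one-second captures that both average about 30fps.
# The first delivers a frame every 33.3 ms; the second averages the same
# but with uneven gaps, which is what the eye reads as stutter.
even_frame_times_ms = [33.3] * 30
uneven_frame_times_ms = [20, 20, 55, 25, 60, 20, 45, 22, 41, 25] * 3

def summarize(name, frame_times):
    avg = sum(frame_times) / len(frame_times)
    print(f"{name}: avg {avg:.1f} ms (~{1000 / avg:.0f} fps), "
          f"worst frame {max(frame_times):.0f} ms")

summarize("even pacing  ", even_frame_times_ms)
summarize("uneven pacing", uneven_frame_times_ms)
# Both report roughly 30 fps, but the uneven run has frames that sit on
# screen for 55-60 ms, so the same average feels far less smooth.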
December 9, 2012 6:03:55 PM

this thread was 9 months old and dead before it got dragged up from the grave.....
December 9, 2012 8:12:51 PM

It's the undead thread......
December 9, 2012 9:02:47 PM

Who cares how old it is? It brought up a relevant point, even though the OP was mistaken in his belief that console games look better. There has only been one console port that looks better on console than on PC, and that's GTA 4. Everything else is a cut above on PC thanks to one basic feature of pretty much every PC game: FSAA.
If your PC is strong enough to run all the latest games then the PC wins hands down... but then again console gamers claim it costs less. Again wrong... I don't know how many times I've had to point out that the console may cost £200, but the TV they were playing on, and often bought to play on, cost £300-700, which puts it firmly in the same price bracket as a higher-end PC that would in all likelihood play the games better.
February 14, 2013 2:36:12 PM

Personally, I think it's a load of crap from enthusiast PC gamers. While I do agree that there is a SUBTLE difference between 30 and 60 fps, both are EQUALLY playable. If I'm better than you at Counter-Strike I would still kill you whether I was getting 30 fps or 120 fps. With single-player games the argument is even more ludicrous.

Below 30 fps is the threshold where the game starts to visibly chug and hinder your ability to play properly. This is why the sweet spot for PC gaming is around 40 fps: you have some headroom, so if your game dips in performance it won't drop below 30 fps.

"The 60 fps or nothing" has always been a stigma of the PC gaming community and it's the same guys that post their rigs in their signatures and run 3-way sli when no current game on the market can even ultilize that type of power. It's like bragging rights for nerds, as opposed to who has the cooler car, it's who has the better rig.

And no, Hexit, GTA IV doesn't look better on consoles, though it certainly performs a lot better there. It's one of the worst-optimized games of all time. If you can achieve 40fps on max settings in every game, there is absolutely zero reason to upgrade.
February 14, 2013 4:57:50 PM

So you think 40 fps with screen tearing and fps dips to 30, which is very noticeable, is ideal for gaming? There is a huge difference between 30 and 60 fps. The reason enthusiasts demand 60 fps is that it offers a smoother gaming experience, and also because we can.
Get over it; if it isn't for you then that's fine, enjoy your games at whatever fps you find acceptable. Everyone differs in this area.
Don't be so spiteful in your comments.
February 14, 2013 5:15:35 PM

Personally, I retract my earlier comments. I've been gaming at 15 fps and it's never been smoother; all this fps talk must just be PC-enthusiast voodoo. :sarcastic:
February 14, 2013 5:56:29 PM

FlintIronStagg said:
So you think 40 fps with screen tearing and fps dips to 30, which is very noticeable, is ideal for gaming? There is a huge difference between 30 and 60 fps. The reason enthusiasts demand 60 fps is that it offers a smoother gaming experience, and also because we can.
Get over it; if it isn't for you then that's fine, enjoy your games at whatever fps you find acceptable. Everyone differs in this area.
Don't be so spiteful in your comments.


I never said that it's impossible to notice a difference; I said that it is subtle and will never be the difference between being able to properly play a game or not. Playing under 30 fps causes noticeable slowdowns that actually make the game unplayable; anything over 30 fps and you will never see the game slow to a crawl. If you are getting tearing, that is normally a v-sync issue, as you should know, you PC enthusiast. The OP solidifies my point: "Why do console games not feel jittery at 30 fps, yet PC games do?" The answer is that they are EXACTLY THE SAME. PC games feeling worse than their console counterparts at the same framerate is ALL IN YOUR MIND. The difference between 40 fps and 60 fps is very hard to tell, let alone while you are actually playing the game and not sitting around staring at ***. The actual difference is that 40 fps has a higher chance of dropping below the 30fps mark; that is when you will notice it. You will not notice when the game drops from 60 to 35fps, I guarantee it.

I mean, justifying putting super-high-end graphics cards in SLI for just plain old 1080p is fine if it helps you sleep at night with your purchase, but that kind of setup is really only needed for multiple monitors or super-high-resolution monitors, and if you're not using either you are wasting your money, I'm sorry to tell you. But you already knew that, didn't you, PC enthusiast?

Cue that guy who links the bouncing-ball GIF at 30fps vs. 60fps.... Maybe my eyes are bad or whatever, but to say that a console at 30 fps is any different from a PC at 30 fps is just illogical. If you can play console games at 30 fps and stand it, then you can play PC games at 30 fps and stand it. I mean, 30 fps is the BARE MINIMUM, mind you, but it is still technically playable. Some or maybe most console games are locked at 30 fps; normally it's not good to lock your system because you are just limiting your graphics card's performance. I'm just saying I've played Far Cry 3 at 40 fps, then tried it with the settings higher and only got 30. While I can tell the difference, it's VERY hard, and I only play at 40 because I'm scared that in times of heavy chaos on the screen I might dip into the sub-30 frame region.

Even Nvidia says 40 fps is the sweet spot you want your games to run at; are you going to argue with them now? I will concede and agree that 60 > 30, but 40 vs. 60 is really not worth an extra 250-dollar card, and I think most would agree with me.
February 14, 2013 6:13:43 PM

bryjoered said:
I never said that it's impossible to notice a difference I just said that it is subtle and will never be the difference between being able to properly play a game or not. Playing under 30 fps causes noticeable slow downs that actually cause the game to become unplayable Anything over 30 fps and you will never see the game slow to a crawl. If you are getting tearing that is normally a v-sync issue as you should know you PC enthusiast. The OP solidifies my point "Why do console games not feel jittery at 30 fps, yet PC games do?" The answer is they are EXACTLY THE SAME. PC games feeling worse than their console counterparts at the same framerate is ALL IN YOUR MIND. The difference between 40 fps and 60 fps is very hard to tell, let alone while you actually playing the game and not sitting around staring at ***. The actual difference is that 40 fps has a higher chance of dropping below the 30fps mark, this is when you will notice it, you will not notice when the game drops from 60 to 35fps I guarantee it.

I mean justifying putting super high end graphics cards in SLI for just plain old 1080p, is fine if it helps you sleep at night with your purchase, but that kind of set-up is really only needed for multiple monitors or super high resolution monitors and if your not using either you are wasting your money I'm sorry to tell you. But, you already knew that didn't you PC enthusiast?

V-sync under 60 fps cuts you to 30 fps, so the only way you would be running 40 fps is with v-sync off, meaning you will get screen tear. An SLI setup is a path chosen by those who want a constant 60 fps by preference, and it shouldn't matter what other people prefer. You prefer erratic framerates with screen tear and lower settings, while others can spend more money and enjoy smooth framerates with high settings. There really is no wrong here, just a matter of preference.

February 14, 2013 6:20:23 PM

Wrong again; you act like you always get screen tear if you don't use v-sync, which isn't the case at all. Clearly, you are a PC enthusiast: "you prefer erratic framerates". So anything below 60 is erratic, then? I mean, why can't you just answer the simple question? How do all the console gamers stand it?! There are many more of them, mind you. Do you look at someone playing COD on Xbox and go "OMGOSHH I CAN'T BELIEVE THE SCREEN TEARSH YOUR GETTING HOW DO YOUTH STANDSH ITS!"? Pretty elitist and nerdy if you ask me, but hey, to each his own.. :)
February 14, 2013 6:20:58 PM

bryjoered said:
you will not notice when the game drops from 60 to 35fps I guarantee it.

Bull$h!t

I have a 120Hz monitor, and I prefer all of my games at 120 frames over 60 because I can notice the difference. I can easily tell when the game dips from 120 frames down to around 60.

I can guarantee YOU that everyone can see the difference between 60 and 35fps, especially when a game drops down that low.
That's nearly a 50% cut in frames, and is significantly less smooth.

You also said earlier that more frames don't help you play better. Once again, not true. I guarantee I have the advantage in an FPS (like CoD or BF3) running at 120fps over someone with 60fps. The same applies to having 60fps vs. someone with 30fps. You can turn and react faster the more frames you have. You could have the fastest reaction time in the world, and 30fps would still get you killed.
February 14, 2013 6:26:07 PM

Yeah, how do you explain that some of the best players in the world in both BF3 and Counter-Strike have crappy rigs? It's about time put in and skill, not your framerate. I get 150 fps in Counter-Strike and it's never helped me one bit, and you can't prove that having 120 frames helps you, because there is no way to know how many frames the other person you're up against is getting. You don't know that it makes you better; you think that it does, and that helps you justify the ridiculous sum of money you dumped into your SLI rig. Sorry man, it's OK..

Also, have fun upgrading that in a year's time when your frames drop a hair below 60 and you still get pwned by a kid running a mediocre graphics card and a slow processor. Not saying that SLI rigs aren't cool, because obviously I want one, but they aren't really practical unless you want a surround-monitor setup or a high-resolution (1600p) one. 1600p I'm sure is drool-worthy, so if you have that setup I'm jealous, but I'm sorry to say it doesn't make you a better player or make your game look any better than mine at the same settings running at 40fps.
February 14, 2013 6:34:57 PM

I didn't say that higher frames make you invincible and automatically put you above all other players. It's part of the whole equation though.

And no, I don't have SLI. How nice of you to assume things; I have a single GTX 680. There are times when I drop below 60fps too, and no, that doesn't make me want to upgrade.

How can you say that 150fps doesn't help you in counterstrike? Lock your fps at 30, then play a few rounds, then take off the fps lock. Compare the scores. Unless you have done that, you have no data to back up your claim.
February 14, 2013 6:40:50 PM

This is an example of correlation without causation, though. You play one game at 120 fps and the next at 30fps, and happen to do better in the first game, so 120 fps makes you better, right? Wrong: different players, other players not playing up to their potential, players having a good or bad game, or (gasp) you having a bad game. It's not something you can prove, and I have tried just what you suggested in BF3 and never had that "this is so much better" moment. It really felt about the same to me; that's what I'm getting at, they feel very similar.

Your mindset going into this experiment could also alter your performance. Everyone agrees that anything below 30fps is unplayable, and enthusiasts claim that below 60 is unplayable. Obviously I'd rather have 60 fps, but is it necessary? No. Does it make a HUGE difference? No. Does it make any difference at all? Yes, but it's not game-breaking at all.
February 14, 2013 7:01:12 PM

bryjoered said:

Your mindset going into this experiment could also alter your performance. Everyone agrees that anything below 30fps is unplayable, and enthusiasts claim that below 60 is unplayable. Obviously I'd rather have 60 fps, but is it necessary? No. Does it make a HUGE difference? No. Does it make any difference at all? Yes, but it's not game-breaking at all.


So you're basing your opinion on the same unquantifiable logic that you're hoping to denounce? I realize it might be hard to see that since you're stating your opinion, but the same thought process that brought you to this conclusion is what brings "enthusiasts" to theirs. You're asking the same questions but getting different answers.
February 14, 2013 7:22:46 PM

casualcolors said:
So you're basing your opinion on the same unquantifiable logic that you're hoping to denounce? I realize it might be hard to see that since you're stating your opinion, but the same thought process that brought you to this conclusion is what brings "enthusiasts" to theirs. You're asking the same questions but getting different answers.


I see your point, but by the enthusiasts' logic, someone getting 500 frames would have a huge advantage over everyone else if they somehow had a 500Hz monitor, and I just don't think this is the case. There's a certain point where smooth is smooth, in my opinion. Smooth to me means no stutters or spots where the game is visibly chugging, causing a kind of rubber-banding effect or even partial freezing. This doesn't happen to me if I'm getting over 30 frames. The contention is that having only 30 fps is bad because when it dips (and it always does) it takes you into this terrible zone. So, as long as you have a buffer zone of 10-15 frames, you should never or very rarely reach that zone.
February 14, 2013 7:25:47 PM

bryjoered said:
I see your point, but by the enthusiasts' logic, someone getting 500 frames would have a huge advantage over everyone else if they somehow had a 500Hz monitor, and I just don't think this is the case.


This is only true if the enthusiast thinks there is no end to the range of human visual perception, which wouldn't make them an enthusiast. That would make them an idiot. And we're not debating the opinions of idiots here.
February 14, 2013 8:32:50 PM

casualcolors said:
we're not debating the opinions of idiots here.

whoa there don't get ahead of yourself, that's EXACTLY what we're doing here
February 14, 2013 8:41:46 PM

FlintIronStagg said:
whoa there don't get ahead of yourself, that's EXACTLY what we're doing here


I RESEMBLE THAT REMARK!
February 14, 2013 9:40:49 PM

In my opinion, based on my experiences, casualcolors' note about input devices is the biggest difference. A mouse is FAR more precise and feels like an extension of the body compared to a controller or even a keyboard.

This is the same thing that causes people to get seasick or motion sick. For example, if you are out at sea on a boat and you look only at objects on the boat, a lot of people get seasick. This is because your view is telling your mind that nothing is going on, but in reality you are constantly going up and down with the swell of the ocean. This conflict causes your mind to induce nausea, which is theorized to be your body trying to induce vomiting because it thinks you may have been poisoned. The same person can focus on land or the horizon, which makes it clear that they are moving up and down, and as a result not get sick.

The mouse is much the same. If moving the mouse moves your view as you move your hand, your mind expects a constant connection between your hand movements and what you see on the screen. Low FPS adds a lot of latency, creating a disconnect, which causes problems for many people. Even if you don't experience nausea (I do), most people can at least perceive that latency.

A controller does not give you the same connection a mouse does. You push a button, which causes the screen to turn. Your mind never feels like your view is in direct correlation with your body, so it tolerates poor latency.
May 8, 2013 6:32:48 AM

I don't get it either. I'm very new to PC gaming, having been a console gamer since consoles began but rarely bothering with PCs over the years, and this is what I've found.

I've just built a PC using the AMD A10 APU.... Now before the PC elite get on their high horses, I had a budget smaller than the amount you'd spend on a single component and wanted to give PC gaming a try, so I went cheap with the intention of adding a proper GPU later if I can ever get used to PC gaming.

So...low powered graphics, on par with a PS3 or thereabouts.
Same TV used as a display (60Hz LCD) - 720p resolution used on both
Sitting in the same position/distance
Using the same PS3 control pad (don't have a desk so keyboard is not an option)
Playing the same game...in this instance the Far Cry 3 Blood Dragon Demo.
Set the field of view on the PC to match the PS3 (PC default around 70...PS3 can't be changed but around 80)
PC graphics set to low.

OK, so that's the scenario; the PC should outperform the PS3 considerably. Using Fraps to check the frame rates, I'm getting in the region of 40-50 frames per second for the most part, and as low as 30 at times.

I have no idea of frame rates on the PS3, but it plays as smoothly as other shooters on PS3: mostly very smooth, with the odd hectic bit that has a noticeable impact on smoothness.

How come, then, that the PC at 40-50fps isn't as smooth as the PS3 under conditions as close to identical as I can replicate? I've matched the parameters as much as possible: controller, screen, viewing distance, graphics levels. A frame rate of, let's say, 45fps for argument's sake on the PC is simply nowhere near as smooth as what I'm assuming is 30fps on the console.

Are we saying this is down to blurring effects on the console, given that I've eliminated as many other factors as possible? If so, how do I turn these on to make PC gaming as enjoyably smooth as console gaming?
May 8, 2013 7:31:38 AM

ColSonders said:
I don't get it either. I'm very new to PC gaming, having been a console gamer since consoles began but rarely bothering with PC's over the years and this is what i've found.

I've just built a PC using the AMD A10 APU....now before the PC Elite get on their high horses, I had a budget smaller than the amount you'd spend on a single component and wanted to give PC gaming a try...so went cheap with the intention of adding a proper GPU later if I can ever get used to PC gaming.

So...low powered graphics, on par with a PS3 or thereabouts.
Same TV used as a display (60Hz LCD) - 720p resolution used on both
Sitting in the same position/distance
Using the same PS3 control pad (don't have a desk so keyboard is not an option)
Playing the same game...in this instance the Far Cry 3 Blood Dragon Demo.
Set the field of view on the PC to match the PS3 (PC default around 70...PS3 can't be changed but around 80)
PC graphics set to low.

OK...so that's the scenario, the PC should outperform the PS3 considerably. Using fraps to get the frame rates i'm getting in the region of 40-50 frames per second for the most part and as low as 30 at times.

I have no idea of frame rates on PS3 but it plays as smooth as other Shooters on PS3...mostly very smooth with the odd hectic bit that gives a noticeable impact on smoothness.

How come then that the PC at 40-50fps isn't as smooth as the PS3 under as close to the same conditions as I can replicate? I've matched the parameters as much as possible, controller/screen/viewing distance/graphics levels.....the frame rates of lets say 45fps for arguments sake on PC is simply not anywhere near as smooth as what i'm assuming is 30fps on console.

Are we saying this is down to blurring effects on console given that i've eliminated as many other factors as possible? If so how do I turn these on to make PC gaming as enjoyably smooth as console gaming?

Meh, I even just recently bought a new kickass card (a GTX 670 FTW). Guess what? There is hardly a noticeable difference. I can play games at higher settings than I used to, but playing BF3 at the exact same settings (Ultra) there is no difference at all. These enthusiasts are a bunch of elitist losers who LITERALLY get off on their framerates. I'm mediocre at BF3 whether I'm playing at 40 or 70 fps.

May 8, 2013 8:10:20 AM

@ColSonders:

Your test game has a noted issue with stuttering on PCs. You may also have had v-sync on, which creates stuttering if you are not at a solid 30 FPS or 60 FPS; anywhere in between will cause stuttering.
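To flesh that out, here's a rough sketch of why plain double-buffered v-sync on a 60Hz screen snaps you to 60, 30, or 20 fps; the render times below are made up, and this ignores triple buffering and adaptive v-sync:

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # 16.7 ms between refreshes

def vsynced_fps(render_ms):
    # With double-buffered v-sync a finished frame can only be shown on a
    # refresh boundary, so a frame that takes even slightly longer than
    # 16.7 ms has to wait for the *next* refresh (33.3 ms total).
    refreshes_needed = max(1, -(-render_ms // REFRESH_MS))  # ceiling division
    return 1000.0 / (refreshes_needed * REFRESH_MS)

for render_ms in (10, 16, 17, 25, 33, 40):
    print(f"render time {render_ms:>2} ms -> ~{vsynced_fps(render_ms):.0f} fps on screen")
# A GPU averaging 45 fps has frames on both sides of the 16.7 ms line, so
# on screen it keeps flipping between 60 and 30, which is the stutter
# being described above.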
May 8, 2013 8:59:02 AM

bystander said:
@ColSonders:

Your test game has a noted issue with stuttering on PC's. You may also have had v-sync on, which creates stuttering if you are not at a solid 30 FPS, or 60 FPS. Anywhere in between will cause stuttering.


I mean, dude, 680 SLI? What do you even do with that? I hope you are running at 1600p or that surround-vision stuff with that much power. On a side note, if you have a slightly newer Nvidia card, Adaptive V-sync is great: it turns off v-sync under 60fps and turns it on when you are pushing past 60fps, ensuring that your card is never working too hard or being throttled down.
On the other hand, you are the only legitimate enthusiast on here with a real argument for why 60fps is better: the lower framerate makes you sick, which I can't understand because it has no effect on me. To state that the actual gameplay or input is any different anywhere between 35 and 500fps is really just false and an illusion, in my opinion. I paid nearly $400 for the GTX 670 and it works great, but if I bump the settings down to what I used to have on my older card, it's really no different at all to me, and I really try to tell the difference since I paid that much for it. Running Crysis on Very High, on the other hand, makes me realize why I paid so much.
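The adaptive v-sync behaviour described above boils down to a per-frame decision; this is only a sketch of the idea, not NVIDIA's actual driver logic:

REFRESH_HZ = 60

def vsync_this_frame(current_fps):
    # Only wait for the refresh when the GPU is already outrunning the
    # display: tearing is prevented at high fps, and when the GPU falls
    # behind you avoid the hard 60 -> 30 drop that normal v-sync causes.
    return current_fps >= REFRESH_HZ

for fps in (25, 45, 59, 60, 90, 144):
    print(f"{fps:>3} fps -> v-sync {'on' if vsync_this_frame(fps) else 'off'}")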
May 8, 2013 9:14:50 AM

bryjoered said:
bystander said:
@ColSonders:

Your test game has a noted issue with stuttering on PC's. You may also have had v-sync on, which creates stuttering if you are not at a solid 30 FPS, or 60 FPS. Anywhere in between will cause stuttering.


I mean dude 680 SLI what do you even do with that? I hope you are running at 1600p or that surround vision stuff with that much power. On a side note if you have a slightly newer nvidia card Adapative V-sync is great, it turns off v-sync under 60fps and turns it on when you are pushing past 60fps ensuring that your card is never working too hard or being throttled down.
On the other hand you are the only legitimate enthusiast on here that has a real argument towards why 60fps is better, the lower framerate makes you sick, which I can't understand because it has no effect on me. To state that the actual gameplay or input is any different anywhere between 35-500fps is really just false and an illusion in my opinion. I paid nearly $400 for the GTX 670 and it works great, but on if I bump the settings down to what I used to have on my older card, it's really no different at all to me, and I really try to tell the difference since I paid that much for it. Running Crysis on very high on the other hand makes me realize why I paid so much.


I play in 3D Vision at 60 FPS minimum, or 2D at 80+ FPS. As I explained, lower FPS causes me to get sick. You can call it an illusion, but it is a real affliction; look up simulator sickness. This latency issue is real. If your FPS is at 30, you have 33.3 ms between when a frame starts to be rendered and when it actually gets displayed; at 60 FPS that is 16.7 ms, and at 120 FPS it drops to 8.3 ms.

While it may not affect you (it doesn't for most people), it affects me. Looking at the Oculus, which gives you even more direct feedback from your head turning, people are getting sick on it quite easily: http://www.pcper.com/reviews/General-Tech/Video-Perspec...

John Carmack has mentioned that in order for the Oculus Rift to become acceptable, they need latencies below 20ms, and more recently mentioned that below 10ms is going to be needed. This is because the more real things look, the more sensitive we become to imperfections.

Anyway, it is possible that the problem isn't entirely about latency for me. However, no matter how you look at it, if I don't maintain at least 60 FPS I get sick in about 30 minutes or less. At 60 FPS it takes an hour or so, and at 80+ FPS I stop getting sick. I also notice immediately when my FPS drops below 60. Then I turn my head to my G13's LCD display to see what I'm getting, and try to adjust settings to stop it from happening again.
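Those figures are just the frame time, 1000 ms divided by the frame rate; a quick check below reproduces them (note that real input-to-photon latency is usually a frame or two higher once buffering and display lag are added, which is an assumption on top of the numbers quoted here):

for fps in (30, 40, 60, 120):
    print(f"{fps:>3} fps -> {1000.0 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms, matching the
# figures above; every extra frame of buffering adds one more of these.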
May 8, 2013 9:25:47 AM

Hmm, is 3D Vision worth it? Maybe that's what causes some of your sickness. Like I said, you have a legitimate claim and reasoning for an absolute need for 60fps. These other kids are just flailing their e-peens around, stating that their game plays better at 60 fps than someone else's at 40fps, and that is just hogwash. The threshold listed by almost all benchmarking sites is 40fps: you want an average of 40fps, which means that when the game dips it won't go under 30, which is when it starts to feel unplayable.
May 8, 2013 9:34:39 AM

bryjoered said:
Hmm, is the 3d vision worth it? Maybe that's what causes some of your sickness. Like I said you have a legitimate claim and reasoning for the absolute need of 60fps. These other kids are just flailing their e-peens around stating that their game plays better at 60 fps than someone Else's played at 40fps and that is just hogwash. The threshold is 40fps listed by almost all benchmarking sites. You want an average of 40fps, this means that when the game dips it won't go under 30, which is when the game starts to feel unplayable.


I find 3D at 60 FPS and 2D at 60 FPS make me equally sick, so it doesn't really matter. Like I said, in 2D I go for 80+ FPS for nearly no sickness, but in 3D I'm forced to 60 FPS or less, so I make sure to maintain 60 FPS to limit the sickness. 3D itself doesn't seem to cause me any additional issues, as 60 FPS feels the same in either mode.

To answer the question about 3D being worth it...it is for me. I find it far more immersive than 2D.
May 8, 2013 11:59:23 AM

The hell happened to this thread? The answer is simple.

Motion blur.

It blurs the frames so that the differences between frames, shown at a relatively slow rate, are not so noticeable; this makes the sequence of images appear smoother.
Most PC games don't have this. Why? Ask the developer.

/thread
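One way to picture what motion blur buys you at a low frame rate: treat each frame as showing a streak of the object's recent positions rather than a single point. The sketch below is purely an illustrative toy in one dimension, not how any particular engine implements blur:

FRAME_STEP = 10.0  # units the object moves between consecutive frames

def visible_extent(frame_index, shutter=0.0):
    # shutter is the fraction of the frame interval smeared into the image:
    # 0.0 = a crisp point, 1.0 = a streak covering the whole distance
    # travelled since the previous frame.
    end = frame_index * FRAME_STEP
    return (end - shutter * FRAME_STEP, end)

for shutter in (0.0, 0.5, 1.0):
    frames = [visible_extent(i, shutter) for i in range(1, 4)]
    gap = frames[1][0] - frames[0][1]  # empty space between frame 1 and frame 2
    print(f"shutter {shutter}: frames show {frames}, gap between frames = {gap:.0f}")
# With no blur each frame is a point and the object visibly jumps 10 units;
# the more of the travel each frame smears across, the smaller the un-drawn
# gap, which is why a blurred 30fps reads as less steppy.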
May 8, 2013 12:15:07 PM

MajinCry said:
The hell happened to this thread? The answer is simple.

Motion blur.

Blurs the frames so the differences between the frames, being shown at a relatively slow rate, are not so noticeable; this makes the sequence of images appear smoother.
Most PC games don't have this. Why? Ask the developer.

See, it's not as simple as that, though. Most people don't realize that anything played on a monitor at 1080p two feet from your face is going to show fluctuations more noticeably. My only point is that 30fps is most likely right on the border of what is considered "playable". Most benchmarking sites say 40 so that you have a cushion before you drop below that "playable" threshold. Consoles are normally limited to 30fps, so 30fps is the absolute fastest you will ever see; when things get choppy you are actually in the teens. I would say that you want 45-50 fps to be absolutely sure that you will never drop below 40. In terms of playability, the difference between 40 and 60 is zilch to me, and apparently to top-rated benchmarking sites like Tom's Hardware as well...



May 8, 2013 1:32:51 PM

bryjoered said:
MajinCry said:
The hell happened to this thread? The answer is simple.

Motion blur.

Blurs the frames so the differences between the frames, being shown at a relatively slow rate, are not so noticeable; this makes the sequence of images appear smoother.
Most PC games don't have this. Why? Ask the developer.

/thread

See, it's not as simple as that though, Most people don't realize that anything on a monitor played at 1080p 2 feet from your face is going to be more noticeable when there is fluctuations. My only point is that 30fps would most likely be right on the border of what is considered "playable". Most benchmarking sites say 40 so that you have a cushion to drop below that "playable" threshold. Consoles are normally limited to 30fps, so 30fps is the absolute fastest you will ever see, so when things get choppy you actually are in the teens. I would say that you want 45-50 fps to be absolutely sure that you will never drop below 40. In terms of playability the difference between 40 and 60 is zilch to me and apparently to top rated benchmarking sites like tom's hardware as well...


I don't think I've ever seen a site claim there is no difference between 40 and 60 FPS. They say 40 is playable, but I have never heard them say it was optimal, or that more was not better.

Like I mentioned a few times, I actually get sick from the difference. While most people won't get sick, it does show that it is something that is noticeable.
May 8, 2013 1:51:07 PM

bystander said:
bryjoered said:
MajinCry said:
The hell happened to this thread? The answer is simple.

Motion blur.

Blurs the frames so the differences between the frames, being shown at a relatively slow rate, are not so noticeable; this makes the sequence of images appear smoother.
Most PC games don't have this. Why? Ask the developer.

/thread

See, it's not as simple as that though, Most people don't realize that anything on a monitor played at 1080p 2 feet from your face is going to be more noticeable when there is fluctuations. My only point is that 30fps would most likely be right on the border of what is considered "playable". Most benchmarking sites say 40 so that you have a cushion to drop below that "playable" threshold. Consoles are normally limited to 30fps, so 30fps is the absolute fastest you will ever see, so when things get choppy you actually are in the teens. I would say that you want 45-50 fps to be absolutely sure that you will never drop below 40. In terms of playability the difference between 40 and 60 is zilch to me and apparently to top rated benchmarking sites like tom's hardware as well...


I don't think I've ever seen a site claim there is no difference between 40 and 60 FPS. They say 40 is playable, but never have I ever heard them say it was optimal, or that more was not better.

Like I mentioned a few times, I actually get sick from the difference. While most people won't get sick, it does show that it is something that is noticeable.


I think there is a difference; I just feel it is minor and shouldn't really make a DRAMATIC impact on your experience with the game, except in your case. Playable is playable, meaning there is no "slideshow" effect and no chugging that could potentially hamper your ability to play the game properly. The difference between 60fps and 40fps is MUCH harder to see than the difference between 20fps and 40fps; do you see what I'm getting at now?
May 8, 2013 1:58:56 PM

I wouldn't argue against that, other than to say it still may be worth it to some, and not others.
May 9, 2013 12:49:10 AM

bystander said:
@ColSonders:

Your test game has a noted issue with stuttering on PC's. You may also have had v-sync on, which creates stuttering if you are not at a solid 30 FPS, or 60 FPS. Anywhere in between will cause stuttering.


Thanks for clarifying... a poor example game seems to be the case; it was obviously ported to PC very badly. I see now why there's so much hate towards games being ported from consoles, if this is the standard.

Having read up a bit on how V-Sync works and fiddled with settings, I had a go at BF3 last night, and even on my poor little rig with no discrete GPU it ran very nicely at 30fps with some anti-aliasing and V-Sync on, almost on par with the smoothness of BF3 on the PS3, just with nicer graphics. Then switching off the AA bumped me straight up to 60fps with a noticeable jump in smoothness.

So I guess it's very dependent on the game.

Thanks for the info, I guess it's a good reason to try before you buy on PC to see if it's possible to make a game run smoothly on your system or not (at least for those of us with low end graphics)


Edit: I should also add that the whole frame rate argument is, in my opinion, very much in the eye of the beholder. Most console gamers are tarred with the brush that they wouldn't notice the difference because they're just not used to seeing games at more than 30fps... but I (a long-time console gamer) can easily tell the difference between 20/25/30/60 fps, though I know plenty of people who aren't as sensitive to it.

Bottom line seems to be that you should probably balance your game settings for smoothness first and add "pretty" until it stops being what you consider smooth.
May 9, 2013 2:54:58 AM

It's very dependent on the game. You can play Counter-Strike at 30 fps and you will get kills, but playing at 120 fps in a game that ties its packets to the frame rate will often make you more accurate than the guy playing at 30 fps, who is sending 30 packets to your 120.
You are getting 4 times the amount of information, which should result in you seeing him first most of the time.

After reading most of this I can honestly say console players have very little idea of how hardware works, or how games transfer data between host and players. This isn't their fault, as it's all been hidden away from them behind the GUIs.
Console gaming isn't as smooth as PC, and it isn't locked at 30 fps. If you had played Alan Wake you would know this; it's one of many titles with severe jittering, and that's going from 30 fps down to 25. On PC you would hardly notice a 5 fps drop in a game running at 60 fps dropping to 55, and if you're playing at the optimum settings for your card you should be getting 60 fps anyway.

Sorry to say it, but console gamers just don't understand gaming. Like I said, it's not their fault, because they're so used to not seeing under the hood. You will see women claim the same when their cars run out of oil... "Uh! It needs oil? You didn't tell me that"... and now that she knows, she still doesn't put the oil in...
Console gamers = women... end of story ;)
May 9, 2013 3:23:53 AM

HEXiT said:
its very dependent on games. you can play counterstrike at 30 fps and ytou will get kills but playing the same game at 120 fps on a game that ties its packets to the frame rate will often make you more accurate than the guy playing at 30 fps who is getting 30 packets as opposed to your 120.
you are getting 4 times the amount of information which should result in you being able to see him first most of the time.

after reading most of this i can honestly say console players have very little idea of how hardware works. how games transfer data between host and players. this isnt there fault as its all been hidden away from them in the GUI's
console gaming isnt as smooth as pc and it isnt locked at 30 fps. if you played alan wake you would know this. its 1 of many titles that have severe jittering and thats going from 30 fps to 25. on pc you would hardly notice that 5 fps drop on a pc game running at 60 fps dropping to 55, and if your playing at the optimum settings for your card you should be getting 60 fps anyway.

sorry to say it but console gamers just dont understand gaming. like i said its not there fault because there so used to not seeing under the hood. you will see women claim the same when there cars run out of oil... uh! it needs oil... you didnt tell me that... well now she knows she still doesnt put the oil in...
consoel gamers=women... end of story ;) 


Yeah, you're right, I had no idea that packet data was related to graphical frame rate; it also makes zero sense. Why would your graphical performance alter the network bandwidth used? I'm happy to be wrong on this, but I'm going to call that one rubbish.

Saying that console gaming isn't as smooth as PC is just stupid, though... not implying YOU are stupid, but that statement is, because console hardware is a constant and the games are designed to run on that specific constant. That doesn't apply to PC, unless you're saying my APU is on par with a Titan graphics card. Plus my example earlier is proof of that point: Blood Dragon is stuttery on PC (I'll define that as MY PC) compared to on console.

You don't need to understand the underlying mechanics of a game to appreciate it, in the same way that most PC users have no concept of how their computer actually works, yet they appreciate it. Granted, I'm sure a few of the PC elitists actually know the ins and outs of computing.

So basically what you were doing was having a dig at console gamers to defend your precious PC gaming, because some of those console gamers who are trying PC gaming were confused about why games don't run as smoothly on PC as they do on console. Surely you would want MORE people playing games on PC to make it more mainstream? What you are doing is alienating PC gaming by putting off console gamers (read: the vast majority of gamers)... so well done you; that really is what I'd consider stupid.
May 9, 2013 6:15:33 AM

ColSonders said:
HEXiT said:
its very dependent on games. you can play counterstrike at 30 fps and ytou will get kills but playing the same game at 120 fps on a game that ties its packets to the frame rate will often make you more accurate than the guy playing at 30 fps who is getting 30 packets as opposed to your 120.
you are getting 4 times the amount of information which should result in you being able to see him first most of the time.

after reading most of this i can honestly say console players have very little idea of how hardware works. how games transfer data between host and players. this isnt there fault as its all been hidden away from them in the GUI's
console gaming isnt as smooth as pc and it isnt locked at 30 fps. if you played alan wake you would know this. its 1 of many titles that have severe jittering and thats going from 30 fps to 25. on pc you would hardly notice that 5 fps drop on a pc game running at 60 fps dropping to 55, and if your playing at the optimum settings for your card you should be getting 60 fps anyway.

sorry to say it but console gamers just dont understand gaming. like i said its not there fault because there so used to not seeing under the hood. you will see women claim the same when there cars run out of oil... uh! it needs oil... you didnt tell me that... well now she knows she still doesnt put the oil in...
consoel gamers=women... end of story ;) 


Yeah you're right, I had no idea that packet data was related to graphical frame rate, that also makes zero sense, why would your graphical performance alter the network bandwidth used? I'm happy to be wrong on this, but I'm gonna call that one as rubbish.

Saying that console gaming isn't as smooth as PC is just stupid though...not implying YOU are stupid...but that statement is, because console hardware is a constant, and the games are designed to run on that specific constant...that doesn't apply to PC, unless you're saying my APU is on par with a Titan graphics card. Plus my example earlier is proof of that point, Blood Dragon is stuttery on PC (i'll define that as MY PC) compared to on console.

You don't need to understand the underlying mechanics of a game to appreciate it in a similar way that most PC users have no concept of how their computer actually works...yet they appreciate it...granted a few of the PC elitists i'm sure actually know the ins and outs of computing.

So basically what you were doing was having a dig at console gamers to defend your precious PC gaming because some of those console gamers who are trying PC gaming were confused at why games don't run as smoothly on PC as they do on console....surely you would want MORE people playing games on PC to make it more mainstream? What you are doing is attempting to alienate PC gaming by putting off console gamers (read the vast majority of gamers).....so well done you, that really is what i'd consider stupid.


I mean, coming from someone like me who upgraded to a GTX 670 FTW (almost a stock 680) just to potentially see this difference: I just don't. Even a game running at 60 fps can drop into the high 30s at some intense point, especially in multiplayer games, where other players affect performance as well. I hardly notice when a game dips into the high 30s; I admit there is a slight difference, but it's nothing that affects the movement of your character or your input into the game. If it dips below 30 I immediately notice the drastic dip in quality and smoothness. That's the point: it isn't apples to apples and it's not a linear gain. 60 fps is slightly smoother than 40fps, but 40fps is like an entirely different game compared to 20fps.
May 9, 2013 7:02:13 AM

You guys.... OK, let me explain this so you get the point.

FPS, like almost anything in the world, is something we get used to.

A 30 FPS that never moves more than 1 fps up or down will seem smooth.

On a PC, if a game moves from 60 FPS to 30 FPS you will notice it, but you will notice it a lot more if it jumps around between 25-30 or 30-40.

The reason is your brain.
It gets 30 fps, it gets used to 30 fps, and it then enjoys 30 fps. Start feeding it random FPS drops and it sees the flaw in the overall motion of the picture.

Also, note that FPS by itself is not EXACTLY the right way to measure the fluidity of a game.

You need to know the frame interval (the time between each frame). If this time also varies (and in console ports it varies a lot), your brain will CLEARLY see the changes.

You might "think" you are seeing smooth, if slightly choppy, performance, but your brain does not agree.
To make this issue vanish completely, the game needs to run at 200+ FPS (by then even the brain can't keep up), or the frames must have consistent timings.

About consoles... well, first of all, if you run at a flat 30 fps it's easier to code the game to look good, and another thing that normally adds to this is the fact that PCs usually have a lot of crapware installed apart from games.

Anyone who formatted his PC a day or two ago and just started his favorite game will tell you it "looks" like it runs a lot smoother. That's also not a coincidence.

Finally, PC games do tend to have higher quality, making the final product harder to code efficiently (assuming a company even bothers to do that).
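If anyone wants to check the frame-interval point on their own machine, a frametime log (Fraps can export one) makes it easy to quantify; the sample values and the spike threshold below are invented for illustration:

# Hypothetical per-frame times in milliseconds, e.g. from a Fraps frametime log.
frame_times_ms = [28, 30, 29, 31, 55, 28, 27, 52, 30, 29, 31, 54, 28, 30]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
worst_ms = max(frame_times_ms)
spikes = [t for t in frame_times_ms if t > 1.5 * avg_ms]

print(f"average frame time: {avg_ms:.1f} ms (~{1000 / avg_ms:.0f} fps)")
print(f"worst frame: {worst_ms} ms, frames over 1.5x the average: {len(spikes)}")
# The average works out to roughly 29 fps, which sounds fine, but it hides
# individual 52-55 ms frames; those outliers, not the average, are what the
# post above says the brain picks up on.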