Can someone explain the target FPS to me, and why?
Tags:
- FPS
- Graphics
Nucl3arbong
October 17, 2013 12:32:49 AM
Hey y'all, I'll make this short and sweet... I'm jumping back into PC gaming after being gone for at least 6-7 years. Back in the day playing Q3 and Enemy Territory, the goal was to cap your FPS at 125 for optimum strafe jumping, etc. Now I'm seeing more references to 60 as the goal. With the higher-end cards, is 125 not meaningful anymore?
I've been having a rough time deciding on a video card for this system for months. I started off planning on a GTX 770, which led to a 780. Now with the new AMD releases it's between two 280Xs, a 780, a 290, and a 290X. I also have two 120Hz monitors, if that means anything.
Thanks for your help....I think this will help me make my decision a little easier.
workedog
October 17, 2013 1:41:56 PM
60 is the goal because that's generally considered the minimum for a seamless experience, and it's also the refresh rate of most monitors. Your monitor's refresh rate is the absolute maximum FPS you will be able to see, so in your case you would benefit from 120 FPS. In modern shooters like BF4 and Crysis 3, though, you'll have a hard time staying there on two screens without two top-end cards.
cookybiscuit
October 17, 2013 1:45:19 PM
It's because getting to 120 FPS in modern titles is near impossible; even if you reduce in-game settings all the way, you'll often find the CPU holds back the framerate.
60 FPS is fine; I don't find 120 FPS that much better, though it's nice to have. Personally I find the largest benefit of 120Hz is the near non-existent input lag and vertical tearing, the latter being an effect present even at 60 FPS.
Nucl3arbong
October 17, 2013 2:31:51 PM
workedog said:
60 is the goal because that's generally considered the minimum for seamless experience and also the clock rate on most monitor. The clock speed of your monitor is the absolute maximum fps you will be able to see so in your case you would benefit from 120 fps. In modern shooters like BF4 and Crysis 3 you'll have a hard time staying there on 2 screens without 2 top-end cards though.

Actually, I'm gonna use my 27" for gaming on PC, PS3, and PS4 via a switch, then use my 24" for my PC duties. I have been using the larger for console and the smaller as an extended laptop screen. Just waiting for my GPU solution, then ordering the PC.
I don't think I'm gonna get too much bottlenecking from the CPU as it's gonna be an oc'd 4770k.
Best solution
Yuka
October 17, 2013 2:49:06 PM
Well, things have changed with LCD displays compared to how CRTs worked, so you now have to look a little deeper.
With CRTs, 60Hz is not remotely close to what an LCD's 60Hz means. In fact, for LCDs the term "refresh rate" doesn't really apply; it's just a legacy term used to compare the two worlds. Since it's made for comparison, an LCD (LED, IPS, etc.) at 60Hz can reproduce up to 60 FPS without "tearing" with vertical sync off: it doesn't force "jumps" in the perceived refresh rate like having it on does, but if you go past the refresh rate, the display will start drawing the next frame before the previous one is done, producing what we call tearing. Having v-sync on means that if your video card cannot produce more than 60 FPS constantly (dumping the excess frames, of course), it will jump back and cut the frame rate to 30 FPS. So if your video card runs at 100 FPS most of the time (v-sync on makes that 60 FPS on the monitor), but in some scenes drops to, say, 59 FPS, then the driver (or game engine) will cut the frames given to the display down to 30 FPS (the next jump for v-sync), causing a really annoying effect in perceived motion.
That's just one aspect of LCDs. The next is something very particular to the LCD tech behind the monitor: response time. This is the "2/5/8 ms" number LCDs advertise, and IIRC it's constrained by the type of cable/connector you'll be using (HDMI, VGA, DVI, DP, etc.). What this means in practical terms is that a higher number will produce "ghosting". 5ms is usually a borderline case, so 2ms is what you want in an LCD monitor. And ghosting on LCDs is VERY noticeable at higher frame rates when you have high contrast (a black-and-white image moving around, for example). I might have the technical part a bit wrong, but the effect described is correct at least.
SOOOO... If you want to have a good FPS experience, then go for a high refresh rate monitor over a big screen IMO.
Cheers!
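[Editor's note] The 60-to-30 jump described above is the behavior of strict double-buffered v-sync (no triple buffering): a finished frame can only be flipped onto the screen at a vblank, so frame times get rounded up to whole refresh intervals. Here's a toy Python sketch of that quantization; the function name and numbers are illustrative, not any driver's actual code:

```python
import math

VBLANK = 1.0 / 60  # one 60Hz refresh interval, ~16.7ms

def displayed_fps(render_ms):
    """On-screen FPS under strict double-buffered v-sync when every
    frame takes `render_ms` to render: the buffer flip waits for the
    next vblank, so render time rounds up to whole refresh intervals."""
    render = render_ms / 1000.0
    vblanks_waited = max(1, math.ceil(render / VBLANK))
    return 60.0 / vblanks_waited

print(displayed_fps(10))  # 100 FPS capable -> shown at 60.0
print(displayed_fps(17))  # barely misses a vblank -> halves to 30.0
print(displayed_fps(34))  # misses two vblanks -> drops to 20.0
```

Triple buffering and nVidia's adaptive v-sync exist precisely to avoid this halving.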
Nucl3arbong
October 17, 2013 2:58:56 PM
Yuka said:
Well, things have changed with LCD displays against how CRTs worked, since you now have to look a little deeper.

In CRTs, 60Hz is not remotely close to what LCD's 60Hz means. In fact, in LCDs the word "refresh rate" doesn't apply and is just a "legacy term" to compare both worlds. Now, since its made to compare, an LCD (LED, IPS, etc) with 60Hz can reproduce up to 60FPS without "tearing" having vertical sync off: it doesn't force "jumps" in perceived refresh rate to the eye like having it on, but if you go past the refresh rate, the display will start drawing the next frame before the previous one is done producing what we call tearing. Having v-sync on means that if your video card can not produce more than 60FPS constantly (dumping the excess of frames, of course) it will jump back and cut the frames to 30FPS. So, if you have the situation where your video card plays most of the time at 100FPS (having v-sync on makes it 60FPS to the eye put on the monitor), but in some scenarios goes to, say 59FPS, then the driver (or game engine) will cut the frames given to the display and put it at 30FPS (which is the next jump for v-sync) causing a really annoying effect in perceived motion.
That's just one aspect of the things in relation to LCDs. The next one is something very particular to the LCD tech behind the monitor: pixel clock. This is usually represented in the "2/5/8 ms" number that LCDs have and is mandated by the type of cable/connector you'll be using (HDMI, VGA, DVI, DP, etc) IIRC. What this means in practical terms for monitors is that, a higher number will produce "ghosting". 5ms is usually a borderline case, so 2ms is what you want in terms of LCD monitors. And ghosting in LCDs is VERY noticeable at higher frame rates when you have high contrast (black and while image moving around, for example). I might have the technical part a bit wrong, but the effect produced is correct at least.
SOOOO... If you want to have a good FPS experience, then go for a high refresh rate monitor over a big screen IMO.
Cheers!
Thx for the very in-depth explanation. I kinda knew that, but it was nice to have the extra assurance. That said, I think I'm gonna eliminate the dual 280s, as I'd like as much speed as I can get. More than likely going with a single higher-end card for now, then SLI/Crossfire it maybe a month after.
workedog
October 17, 2013 3:22:39 PM
Yuka said:
Having v-sync on means that if your video card can not produce more than 60FPS constantly (dumping the excess of frames, of course) it will jump back and cut the frames to 30FPS. So, if you have the situation where your video card plays most of the time at 100FPS (having v-sync on makes it 60FPS to the eye put on the monitor), but in some scenarios goes to, say 59FPS, then the driver (or game engine) will cut the frames given to the display and put it at 30FPS (which is the next jump for v-sync) causing a really annoying effect in perceived motion.

That's not how vsync works. Tearing only occurs above the refresh rate, so vsync will cap the frame rate at that level. That's all it does. Below the maximum frame rate it will do nothing; I'm not even sure what gave you that idea in the first place.
Yuka
October 17, 2013 6:31:42 PM
workedog said:
That's not how vsync works. Tearing only occurs above the refresh rate, so vsync will cap the frame rate at that level. That's all it does. Below the maximum frame rate it will do nothing, I'm not even sure what gave you that idea in the first place.

http://en.wikipedia.org/wiki/Screen_tearing
Don't trust my words, but I speak the truth.
Cheers!
EDIT: Some more links for joy!
http://en.wikipedia.org/wiki/Refresh_rate
http://pcgamingwiki.com/wiki/Vertical_sync_(Vsync)
bystander
October 17, 2013 6:50:25 PM
Neither of you had it correct.
Without v-sync there is tearing, no matter what your FPS is. However, tearing is most noticeable when your FPS is close to your refresh rate or higher.
LEDs don't flicker, but they still refresh the display in the same manner that CRTs do. Instead of constantly sending an electron beam at a phosphor display to keep the image shown, LED/LCDs are solid state and simply turn pixels on and off depending on what color they are showing. That does not mean they don't constantly go through the pixels and update the image to whatever is in the front buffer at your refresh rate (60Hz for most monitors). So while there is no flicker, tearing occurs in the same manner as it does with CRTs: tearing happens any time the refresh times and GPU times are not synchronized, which is any time v-sync is off. Even at a constant 60 FPS, the GPU is able to update the front buffer at the same time the display is refreshing its image, resulting in a tear, unless v-sync is on.
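[Editor's note] This scanout description can be turned into a toy model: treat each 60Hz refresh as a top-to-bottom scan, and count a tear whenever an unsynchronized buffer swap lands mid-scan. A hypothetical Python sketch (the phase offset and screen height are made-up numbers):

```python
REFRESH_HZ = 60
SCANOUT = 1.0 / REFRESH_HZ  # time for one full top-to-bottom refresh

def tear_lines(fps, phase=0.003, seconds=1.0, height=1080):
    """Scanline of every tear in `seconds` of unsynced scanout.
    `phase` is an arbitrary offset between GPU and display clocks."""
    tears = []
    frame_time = 1.0 / fps
    t = phase  # time of each buffer swap
    while t < seconds:
        pos = (t % SCANOUT) / SCANOUT  # how far down the scan we are
        if 0.0 < pos < 1.0:            # swap landed mid-refresh: tear
            tears.append(int(pos * height))
        t += frame_time
    return tears

# Even at exactly 60 FPS, an unsynced swap tears on every refresh,
# always at the same scanline -- steady, but still a tear.
print(len(tear_lines(60)))
# At 59 FPS the tear line drifts down the screen refresh by refresh.
print(tear_lines(59)[:3])
```

Matching the refresh rate doesn't remove the tear; only aligning swaps to the vblank (v-sync) does, which is exactly the point being made above.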
Yuka
October 17, 2013 7:02:03 PM
Uhm... No, I did get it right...
http://www.geforce.com/hardware/technology/adaptive-vsy...
I found that from nVidia itself.
Cheers!
bystander
October 17, 2013 7:49:50 PM
Yuka said:
Uhm... No, I did get it right...

http://www.geforce.com/hardware/technology/adaptive-vsy...
I found that from nVidia itself.
Cheers!
Sorry, but no, you got that wrong. You misunderstood what they mean by "observed". The tearing occurs regardless; it is just more obvious when it is above your refresh rate, and I can provide an easy test for you to see it for yourself. (This topic is very commonly misunderstood, btw.)
Set your refresh to 60Hz. Take MSI Afterburner and cap your FPS to 59. Now play a game without v-sync in which you can maintain 59 FPS. When you turn your view you'll clearly see tearing moving up or down your screen. Set it to 60 FPS and it'll be just as clear.
Even the wiki article you gave before says "The artifact occurs when the video feed to the device isn't in sync with the display's refresh." That is any time, not just when it is above your refresh rate.
Anyways, don't take my word for it, do the test I provided to you.
EDIT: here is another link that is interesting: http://www.hardocp.com/article/2012/04/16/nvidia_adapti...
Quote:
The result is a visually distorted image that can bother gamers. Note that tearing can technically occur if the framerate doesn't exceed the refresh rate, but it is much less likely to be noticed.
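[Editor's note] The 59-FPS-cap test above has a tidy closed form: the tear line's crawl speed, measured in screen-heights per second, is just the gap between the source frame rate and the refresh rate. A small illustrative sketch (function name assumed):

```python
REFRESH = 60.0  # display refresh rate in Hz

def screens_per_second(fps):
    """Screen-heights the tear line travels each second when an
    unsynced source at `fps` plays against a REFRESH-Hz scanout.
    Each present slips (REFRESH/fps - 1) of a screen; there are
    `fps` presents per second, so this reduces to |REFRESH - fps|."""
    return abs(REFRESH / fps - 1.0) * fps

print(screens_per_second(59))  # ~1.0: one full crawl per second
print(screens_per_second(60))  # 0.0: the tear freezes in place
```

A frozen tear at exactly 60 FPS is why capping at the refresh rate without v-sync can look worse than a slight mismatch.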
workedog
October 17, 2013 8:07:21 PM
Yuka said:
Uhm... No, I did get it right...

http://www.geforce.com/hardware/technology/adaptive-vsy...
I found that from nVidia itself.
Cheers!
That technology is over a year old now, my motherboard came bundled with a similar product. This kind of vsync is the standard now.
Yuka
October 18, 2013 6:44:15 AM
bystander said:
Sorry, but no, you got that wrong. You missunderstood what they mean by "observed". The tearing occurs regardless, it is just more obvious when it is above your refresh rate, and I can provide an easy test for you to see for yourself very easily. (This topic is very commonly misunderstood btw).Set your refresh to 60hz. Take MSI Afterburner and cap your FPS to 59. Now play a game without v-sync on in which you can maintain 59 FPS. When you turn your view you'll clearly see tearing moving up or down your screen. Set it to 60 fps and it'll be just as clear.
Even in the wiki post you gave before, it said "The artifact occurs when the video feed to the device isn't in sync with the display's refresh." That is any time, not when it is above your refresh.
Anyways, don't take my word for it, do the test I provided to you.
EDIT: here is another link that is interesting: http://www.hardocp.com/article/2012/04/16/nvidia_adapti...
Quote:
The result is a visually distorted image that can bother gamers. Note that tearing can technically occur if the framerate doesn't exceed the refresh rate, but it is much less likely to be noticed.

You're discussing a problem where the display driver overrides what the graphics engine tells it to do: v-sync. It is handled at engine level (or driver level), and then Afterburner caps the frame rate so it is un-synced with the refresh rate; that's an artificial handicap. And in regular cases, as HardOCP correctly states, it happens but you barely notice it, because it happens when the driver has to cut the frame rate to 30 FPS instead of keeping it at 60 FPS; there's a very small window where you'll get tearing, but it's hardly noticeable (luckily). That's why you get tearing below 60 FPS.
On the other hand, when you don't have the frame cap and just v-sync, what the nVidia link describes is called "stuttering": a perceived change in motion. As simple as that.
Also, the same link you provided supports my own statements... ~___________~
I've been playing with this issue since I was 12 years old playing Quake 2. Trust me, I've done my homework more than enough to be pretty sure of what's going on here.
workedog said:
That technology is over a year old now, my motherboard came bundled with a similar product. This kind of vsync is the standard now.

Your MoBo could not have come with that tech, since its implementation is exclusive to nVidia (adaptive v-sync). I think you're referring to Lucid's Virtu MVP software. It does provide the same thing, yes, but at a different level than nVidia's. Not a bad product, but it needs more refining, since it quirks out in a lot of games.
Cheers!
bystander
October 18, 2013 6:51:40 AM
Yuka said:
bystander said:
Sorry, but no, you got that wrong. You missunderstood what they mean by "observed". The tearing occurs regardless, it is just more obvious when it is above your refresh rate, and I can provide an easy test for you to see for yourself very easily. (This topic is very commonly misunderstood btw).Set your refresh to 60hz. Take MSI Afterburner and cap your FPS to 59. Now play a game without v-sync on in which you can maintain 59 FPS. When you turn your view you'll clearly see tearing moving up or down your screen. Set it to 60 fps and it'll be just as clear.
Even in the wiki post you gave before, it said "The artifact occurs when the video feed to the device isn't in sync with the display's refresh." That is any time, not when it is above your refresh.
Anyways, don't take my word for it, do the test I provided to you.
EDIT: here is another link that is interesting: http://www.hardocp.com/article/2012/04/16/nvidia_adapti...
Quote:
The result is a visually distorted image that can bother gamers. Note that tearing can technically occur if the framerate doesn't exceed the refresh rate, but it is much less likely to be noticed.You're discussing a problem when the display driver overrides what the graphical engine tells it to do: v-sync. It is handled at engine level (or driver) and then afterburner caps the frame to be un-synched with the refresh rate; that's an artificial handicap. And in regular cases, like HardOCP correctly states, it happens but you barely notice it because it happens when the driver has to cut the frame rate to 30FPS instead of keeping it at 60FPS; there's a very small amount of time where you'll get tearing, but it's hardly noticeable (luckily). That's why you get tearing below 60FPS.
On the other hand, when you don't have the frame cap and just v-sync, So you're discussing what the link inside nVidia says, but its called "stuttering": a perceived change in motion. As simple as that.
Also, the same link you provide assures my own statements... ~___________~
I've been playing with this issue since i was 12 years old playing Quake 2. Trust me, I've done my homework more than enough to be pretty sure of whats going on here.
workedog said:
That technology is over a year old now, my motherboard came bundled with a similar product. This kind of vsync is the standard now.Your MoBo could not have come with that tech, since it's implementation is exclusive to nVidia (adaptive v-sync). I think you're referring to Virtu Logic's MVP software. It does provide the same, yes, but it provides it at a different level than nVidias. Not a bad product, but needs more refining since it quirks out a lot of games.
Cheers!
I flat out told you that there is always tearing. I also said it is not as noticeable, which is why the myth persists. So no, it did not back you up in saying there is no tearing. It did say it was hardly noticeable (though there are a number of people who still complain about tearing below 60 FPS). It is also very noticeable when you are near a harmonic of your refresh rate (60 FPS, 30 FPS, 120 FPS, ...).
Also, there is no stuttering; there is no 30-60 FPS flip-flopping without v-sync. That is a v-sync-only feature; without v-sync it is not necessary to only update between refreshes.
Yuka
October 18, 2013 7:39:28 AM
bystander said:
I flat told you that there is always tearing. I also said it is not as noticeable, which is why the myth persists. So no, it did not back you up in saying there is no tearing. It did say it was hardly noticeable (though there are a number of people who still complain about tearing below 60 FPS). It is also very noticeable when you are near a harmonic of your refresh rate (60 FPS, 30 FPS, 120 FPS ...).

Yes, I will admit I got the "under 60 FPS tearing" part wrong; since it's hardly noticeable and I almost never use v-sync, I forgot about it.
Still, ironically, that's an artificial byproduct of v-sync, haha.
bystander said:
Also, there is no stuttering, there is no 30-60 FPS flip flopping without v-sync. That is a V-sync only feature, without v-sync it is not necessary to only update between refreshes.

Yes, I also stated that; at least we agree on something. Maybe the way I said it was hard to understand, but there is no sudden jump in frame rates when v-sync is off, just tearing at some points above the monitor's refresh rate.
Cheers!
bystander
October 18, 2013 9:57:12 AM
Check this out. This was just announced yesterday, I believe: G-Sync. Now this will make things work the way you originally made them sound, in the near future. This is a very cool tech coming to market soon: http://www.pcper.com/news/Graphics-Cards/NVIDIA-Announc...
Yuka
October 18, 2013 10:52:25 AM
I'm VERY sure it's going to be closed and proprietary from nVidia. As good as it might sound, I think it will suffer the same fate as PhysX: the few people who fell for the marketing will get it, paying a ridiculous nVitax for it.
As long as we can manually adjust the quality settings to fit the refresh rate, I don't think this will be needed. Especially since monitors should start getting higher refresh rates in the coming years, making this tech a lot less useful. Maybe when LCDs were starting to appear it would have made a lot more sense.
In any case, not a bad thing. It's always good to see a good idea from a vendor.
Cheers!
bystander
October 18, 2013 10:57:16 AM
Yuka said:
I'm VERY sure it's going to be closed and proprietary form nVidia. As good as it might sound, I think it will suffer the same destiny as PhysX: a few people that fell for the marketing will get it paying a ridiculous nVitax for it.As long as we can manually adjust the quality settings to fit the refresh rate, I think this won't be needed. Specially since monitors should start getting increased refresh rates in upcoming years, making this tech a lot less useful now. Maybe when the LCDs were starting to appear it would have made a lot more sense.
In any case, not a bad thing. It's always good to see a good idea from a vendor.
Cheers!
As long as it is not too pricey, I'd hope AMD also supports it, in which case it will change the PC front.
Yuka
October 18, 2013 11:03:11 AM
If the keynote is any indication, it's a co-developed thing for monitors*. I wonder if it works within HDMI and DVI specs. Maybe they'll implement it using DP, since it's fully digital.
Oh well, what I think is that they'll use the same "pay us to use this tech or you can't use it" as they did with CUDA and PhysX.
I hope for the best, and expect the worst as usual
Cheers!
EDIT: * -> I saw Asus, Philips, ViewSonic, and BenQ. I could be very wrong about co-development, since it looks like an nVidia "attachment" for monitors. If that's the case, I wonder if they can move it to their video cards...
bystander
October 18, 2013 11:04:18 AM