I have an ASUS EAH4870x2 in my computer with a bunch of other high-end components (I can list them if needed) and a Samsung 32in LCD HDTV. I mainly play World of Warcraft, and I've noticed that I rarely have fps issues, even in heavy 25-man raiding, but I do cap out at 60 fps; I have never seen it go above 60. The TV's manual said I should set my graphics card to 60 Hz, as that was the TV's spec. What I want to know is: is this what is limiting me to 60 fps? If I were to buy a newer TV to play on, one of the 120 Hz or 240 Hz models now available, would that increase my maximum potential fps? Or is the issue something else entirely, and the 60 Hz/60 fps thing just a coincidence? Thanks for any help, and sorry if this is a total noob question!
That'll probably uncap your fps, but either way it doesn't really matter, because your monitor is still 60 Hz and it won't show anything more than that.
But really, your eye can only take in so much, and I doubt you'd be able to see a difference between 60 fps and 600 fps.
May 15, 2010 11:36:15 PM
So, just to confirm, you're telling me I'm right in guessing that a 60 Hz refresh = 60 fps max? I know that much above 20-30 fps your eyes can't tell any difference; I just want to see what my card is actually capable of. I always see people talking about getting 140 fps in this game or that, and a guy gets curious, you know?
A 60 Hz refresh rate means the monitor cannot display more than 60 frames per second. Your card can render more than that, but your monitor will only actually be displaying 60. If you have vsync on, you will not go over your refresh rate. Somewhere around 50 fps is probably the point where you stop being able to tell the difference.
Sorry, but your eye isn't a camera; our eyes don't see in fps. It's just that motion gets so smooth that you barely notice any difference. Personally, I think I can notice a difference between 60 fps and 600 fps. I don't think it would be a big difference, but I believe it would feel a little bit smoother.
The monitor refreshes 60 times per second, so it can only possibly display 60 frames every second. If you turn vsync off, your graphics card can output more than 60 frames per second to the monitor, but the frames will be out of sync with the monitor's refresh cycle, so you end up with part of one frame and part of the next frame on the monitor when it refreshes, causing screen tearing. http://www.anandtech.com/show/2794
There's a must-read article about vsync and triple buffering; it shows you in pictures what is happening on your screen with and without vsync, and with triple buffering.
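If it helps to picture why vsync caps you at the refresh rate while vsync-off gives tearing, here's a toy simulation. The model is a deliberate oversimplification (fixed frame times, one-second run, no triple buffering), not how a real GPU pipeline actually works:

```python
REFRESH_HZ = 60          # monitor refresh rate (the TV's 60 Hz spec)
SECONDS = 1.0            # simulate one second of gameplay

def simulate(render_fps, vsync):
    """Return (distinct frames that reach the screen, refreshes containing a tear)."""
    scanout = 1.0 / REFRESH_HZ           # time to draw one refresh top to bottom
    frame_interval = 1.0 / render_fps    # time the GPU takes to render one frame
    seen = set()
    tears = 0
    for i in range(int(REFRESH_HZ * SECONDS)):
        t = i / REFRESH_HZ               # moment this refresh begins
        if vsync:
            # vsync on: show the newest frame completed before the refresh began
            seen.add(int(t / frame_interval))
        else:
            # vsync off: every frame finishing mid-scanout appears partially
            first = int(t / frame_interval)
            last = int((t + scanout) / frame_interval)
            seen.update(range(first, last + 1))
            if last > first:             # buffer swapped mid-refresh: tearing
                tears += 1
    return len(seen), tears

print(simulate(1000, vsync=True))   # capped at 60 distinct frames, no tearing
print(simulate(1000, vsync=False))  # ~1000 partial frames, a tear every refresh
```

The point of the toy model: rendering at 1000 fps with vsync on still only gets 60 distinct frames onto a 60 Hz screen, while turning vsync off lets pieces of many frames through at the cost of tearing on every refresh.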
Human eyes can not only see a difference between 60 and 600 fps, they can also see a difference between 60 and 75 fps. Don't let the marketing talk fool you; the reason most monitors are limited to 60 Hz is the limited bandwidth that older HDMI versions support.
I had a proper CRT monitor which could go up to 150 Hz, and there was an amazing difference between 100 and 150, let alone 60 to 600...
The 600 Hz talk is mostly found in plasmas to attract customers, but in reality they too are 60 Hz; the so-called 600 Hz is just what their internal processor is capable of.
I strongly recommend turning vsync off in FPS games or any game that requires fast reflexes. Even if your monitor is set to 60 Hz, you will still notice a pretty big difference between 60 fps and, say, 1000 fps, simply because at 1000 fps a new frame is ready far more often than the screen refreshes, so by the time an image reaches your screen it was rendered much more recently than it would be at 60 fps. That's also why you see screen tearing: your monitor displays part of the next frame before finishing the last one. But I digress; the point was:
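To put rough numbers on the "fresher frames" argument, here's some back-of-the-envelope math. It's a simplification that ignores input lag, display processing, and variable frame times; it only looks at how old the newest rendered image can be when the monitor grabs it:

```python
# Worst-case age of the newest image available at the moment a refresh starts:
# roughly one full frame time at the given render rate. With vsync at 60 fps
# that is ~16.7 ms; at 1000 fps with vsync off, at most 1 ms.
def worst_case_frame_age_ms(render_fps):
    return 1000.0 / render_fps

print(round(worst_case_frame_age_ms(60), 1))    # ~16.7 ms at 60 fps
print(round(worst_case_frame_age_ms(1000), 1))  # 1.0 ms at 1000 fps
```

So even on a 60 Hz screen, the uncapped card can hand the monitor an image that is over 15 ms fresher, which is what people mean when they say vsync off "feels" more responsive.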
Your eyes can absolutely notice far more than some mediocre 60 Hz.