Solved

GTX 680 or GTX 670, for 120 frames on Battlefield 3

Last response: in Graphics & Displays
July 13, 2012 3:32:38 AM

Hey guys!

My first post here :D .

So, I just built a new PC a couple of days ago. I originally put a 7870 in it, but unfortunately it was disappointing, so I've decided to go with either a 670 or a 680 for my new build.

I play Battlefield 3 competitively, which means I need a solid 120 frames minimum (I use a 120 Hz monitor) with all settings on low/off at 1080p, across all maps.

I looked at both the 670 and the 680, and they seem pretty similar. Although the 680 is a bit better in terms of performance, the 670 is cheaper by about $100.

The issue is that my budget allows me to buy the 680, so I'm really torn. The 670 is cheaper, but then again the price difference is only $100 (relatively small, since we're talking about cards worth at least $400).

So, I would love to hear your opinions on this situation. Also, if you have either of the two cards, it would be really helpful if you could post the frames you get on low.

Thanks!

July 13, 2012 4:06:54 AM

You can't beat this....

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

plus 3 free games....

That card will do just fine for you.

However, if you have to go the GTX route, go with the 670 without a doubt... it's the same damn card for $100 less...



I couldn't imagine the fps I would get with my 7970s at that res on the lowest settings... On Ultra in the campaign I've hit 205 fps... I know multiplayer is much different.
July 13, 2012 4:12:55 AM

OK, so you have a 120 Hz monitor.

Can you even notice the difference between 60 fps and 120? Is it the nanosuit you are wearing that allows you to see what the human eye can't?
July 13, 2012 4:35:48 AM

Quote:
OK, so you have a 120 Hz monitor.

Can you even notice the difference between 60 fps and 120? Is it the nanosuit you are wearing that allows you to see what the human eye can't?


Yes, in fact there is a huge difference. As for "the human eye can only see 30 frames": I can't actually comment on the biology of it, but here is something you might want to look at:

http://amo.net/NT/02-21-01FPS.html

Coming back to the practical side of things, there is a huge difference between 120 and 60, the same as when you move from 30 fps in console games to 60 fps in their PC counterparts. Even if you can't consciously see a difference, one thing I think is important is that you always see the most recently rendered frame, since the screen is refreshed faster. In competitive games, it makes a huge difference.

Subjective arguments such as "Oh, I have a 120 Hz monitor but I don't see any difference" are useless. The average user does not feel much of a difference when changing mouse DPI in small amounts, like going from 1500 to 1400 DPI, but hardcore FPS players do. Not the best example, but you probably know what I'm talking about.

To sum it up, switching to a 120 Hz monitor has been a similar experience to when I switched to a mechanical keyboard: really hard to go back now.
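To put rough numbers on the "most updated frame" point, here is a back-of-the-envelope sketch (simple arithmetic, nothing more):

```python
# Back-of-the-envelope: how stale can the on-screen image be at a given
# refresh rate? A frame rendered just after a refresh waits one full
# interval before the next refresh can show anything newer.

def frame_interval_ms(hz: float) -> float:
    """Time between consecutive screen refreshes, in milliseconds."""
    return 1000.0 / hz

for hz in (30, 60, 120):
    print(f"{hz:>3} Hz -> a new frame every {frame_interval_ms(hz):.1f} ms")

# 120 Hz halves the worst-case staleness versus 60 Hz (8.3 ms vs 16.7 ms),
# which is the practical meaning of "you see the most updated frame".
```

So going from 60 to 120 Hz cuts the worst-case delay before you see fresh information in half.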
July 13, 2012 4:38:32 AM

zloginet said:
You can't beat this....

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

plus 3 free games....

That card will do just fine for you.

However, if you have to go the GTX route, go with the 670 without a doubt... it's the same damn card for $100 less...



I couldn't imagine the fps I would get with my 7970s at that res on the lowest settings... On Ultra in the campaign I've hit 205 fps... I know multiplayer is much different.


El Tigre said:
That 7970 bundle ain't a bad deal. If you want PhysX, Adaptive VSync and such, go for the EVGA GTX 670 FTW; it's the best bang-for-your-buck card atm.

It's $400 after $10 rebate.
http://www.newegg.com/Product/Product.aspx?Item=N82E168...


This card even beats a reference GTX 680, and even the ASUS GTX 670 DirectCU II TOP, which costs $30 more.

http://hexus.net/tech/reviews/graphics/40613-evga-gefor...



This indeed is a good deal. Do you guys know what frames the 7970 gets on low?
July 13, 2012 4:45:23 AM

You mean low settings? I'm not sure, as I can't find a benchmark of BF3 using low settings. With the GTX 670, you should be able to play on high above 60 FPS, and maybe 100+ FPS on medium. Why play it on low, if you don't mind my asking?

Also note that BF3 favors Nvidia cards, so expect lower FPS with the 7970.
July 13, 2012 5:27:36 AM

El Tigre said:
You mean low settings? I'm not sure, as I can't find a benchmark of BF3 using low settings. With the GTX 670, you should be able to play on high above 60 FPS, and maybe 100+ FPS on medium. Why play it on low, if you don't mind my asking?

Also note that BF3 favors Nvidia cards, so expect lower FPS with the 7970.


I play on low for a variety of reasons, most importantly for maximum frames. I like to get at least 120 frames and then limit it to that. I also found it a bit easier to spot enemies with some post-processing turned off, although that's just me; it could be a placebo effect. Keep in mind that in tournaments you don't have 3D spotting (you probably know what this is if you have played any BFBC game). So, basically, anything that helps me get an advantage over the enemy.

I am still leaning towards the GTX cards.
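For anyone curious how I cap it: BF3 takes a console variable for this, which can also go in a user.cfg in the game's folder (command name from memory, so double-check the spelling):

```text
GameTime.MaxVariableFps 120
```

Capping at the refresh rate keeps frame times even instead of letting them bounce around.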
July 13, 2012 5:31:23 AM

devjeetroy said:
Quote:
OK, so you have a 120 Hz monitor.

Can you even notice the difference between 60 fps and 120? Is it the nanosuit you are wearing that allows you to see what the human eye can't?


Yes, in fact there is a huge difference. As for "the human eye can only see 30 frames": I can't actually comment on the biology of it, but here is something you might want to look at:

http://amo.net/NT/02-21-01FPS.html

Coming back to the practical side of things, there is a huge difference between 120 and 60, the same as when you move from 30 fps in console games to 60 fps in their PC counterparts. Even if you can't consciously see a difference, one thing I think is important is that you always see the most recently rendered frame, since the screen is refreshed faster. In competitive games, it makes a huge difference.

Subjective arguments such as "Oh, I have a 120 Hz monitor but I don't see any difference" are useless. The average user does not feel much of a difference when changing mouse DPI in small amounts, like going from 1500 to 1400 DPI, but hardcore FPS players do. Not the best example, but you probably know what I'm talking about.

To sum it up, switching to a 120 Hz monitor has been a similar experience to when I switched to a mechanical keyboard: really hard to go back now.


^_^ Finally, another well-informed FPS gamer.

I've been saying for years how much better 120 fps gaming is than 60 fps, but only lately do I see more and more people wanting 120 Hz monitors for FPS gaming :) 

I have two EVGA vanilla 670s that I plan on using with a 120 Hz monitor or a 1440p monitor. Either way it will work wonderfully :) 

Right now I can easily run BF3 at 1080p, 60 fps with vsync, 4x MSAA + FXAA, Ultra settings, no motion blur or ambient occlusion.
July 13, 2012 5:41:28 AM

Guys, he's not playing the game for eye candy; he's playing competitively, possibly professionally. At that level there is almost no difference between BF3 and whack-a-mole: you see the enemy on your screen and you click on his face. It doesn't matter what his face looks like.

Also, the human eye is capable of detecting changes well above 30 frames per second. If you flash a series of completely different pictures in front of someone, most people will notice changes well beyond 30 or even 60 frames. It's when the pictures are scenes in motion that detection becomes blurred. But the OP is playing so that he sees the enemy the exact moment he shows up, and no later.

As for the 7970, it can just about match a 680 and should easily hit a flat 120 fps on low.
July 13, 2012 5:43:46 AM

To the OP: I know you don't care about eye candy, but I must remind you that on low, far-away textures can blend in. You may not see that sniper's head that you would see on high detail.
July 13, 2012 5:49:44 AM

vmem said:
To the OP: I know you don't care about eye candy, but I must remind you that on low, far-away textures can blend in. You may not see that sniper's head that you would see on high detail.


E.G. get 4x 670s and call it a day.
July 13, 2012 5:51:16 AM

akamrcrack said:
E.G. get 4x 670s and call it a day.

Well a pro needs pro gear :kaola: 
July 13, 2012 5:59:24 AM

Outlander_04 said:
OK, so you have a 120 Hz monitor.

Can you even notice the difference between 60 fps and 120? Is it the nanosuit you are wearing that allows you to see what the human eye can't?

I suggest you do some research on 120 Hz monitors, because clearly you have no idea what you're talking about or why 120 Hz monitors are good.
July 13, 2012 6:36:45 AM

vmem said:
To the OP: I know you don't care about eye candy, but I must remind you that on low, far-away textures can blend in. You may not see that sniper's head that you would see on high detail.


That's a good point. Depending on the map, I do run some settings such as mesh quality and anti-aliasing (I don't remember which one) on medium/high. But scope glint is usually still very clear at the ranges where texture blend-in can occur, so it's not usually an issue. Given the choice, I would still run on low to get 120 fps rather than run higher-quality textures.
July 13, 2012 6:37:57 AM

akamrcrack said:
E.G. get 4x 670s and call it a day.


LOL.

Help me out with it? I'll pm my paypal :bounce: 
July 13, 2012 6:52:13 AM

devjeetroy said:
LOL.

Help me out with it? I'll pm my paypal :bounce: 


:lol:  :non:  ;)  one must first become the card before he can own the card. Ebb n flow.

But seriously, I think (based on what I bought these cards for) that two 670s should run 120 Hz with vsync no problem at lower settings, which was my goal as well :) 

99.99% of the time in first-person shooters, I would rather have lower settings, because I get higher frames (>100) and can still easily see the opposition. While I wait to get a screen capable of doing that on my setup (vsync master race), I've found that running at 60 fps with everything important maxed out makes the game much more fun to watch (poor optimization on DICE's part; I blame consoles, of course). Anti-aliasing can't fix what's wrong with BF3, lol, but 4x MSAA helps make it less apparent that everything isn't uber amazing looking. However, find me a system that can run >1440p with 4x MSAA and I'll be a dog in a park.

*Crawls back in his potcave* gnite :hello: 
July 13, 2012 7:46:49 AM

devjeetroy said:
I play on low for a variety of reasons. Most importantly for maximum frames. I like to get atleast 120 frames, and then limit it to that. I also found it to be a bit easier to spot enemies with some post processing turned off, although that's just me; could be a placebo effect. Keep in mind in tournaments, you don't have 3d- spotting(you probably know what this is if you have played any bfbc game). So, basically anything that helps me get the advantage on the enemy.

I am still leaning towards the gtx cards.



I used to be a pro league player in Counter-Strike, and now in COD MW2 & 3, and I totally get the settings you mentioned... To be honest, for FPS games you get the best performance with Nvidia; in this case the GTX 670 is enough. I think in FPS games it's critical to have a constant frame rate and really fast response to movement with no graphical twitching. For me Nvidia is the right choice, because it offers the most consistent frame rate in FPS games!

Based on what I've seen, the 7970 is HUGE compared with the 670 and 680! Do you have enough space in your case?

peace.
July 13, 2012 4:47:02 PM

I can promise you the human eye can't process any information above 65 fps. You can see up to around 50, but after that it's peripheral vision until about 65. No stimulus information will be processed over that, and no reflexes to sound and noise (superior and inferior colliculi) will register any faster. The only reason to run 120 fps is if you need to go 3D or you just need the ego boost. Whatever the case, any single high-end card will require low-medium settings to reach 120 fps.

Below is a test at High settings, with each card only getting about 97 fps. Assuming you overclock, you'll get another 6 fps or so. If you have a larger monitor, prepare for a drop of about 20 fps. Also, it is easier to pick out specific objects on lower settings, because the picture is clearer and more defined, making the information much more salient than playing with all the occlusion and extra contrast from textures. You also don't have to worry about a lot of the weird movement often put in the game for realism.

http://www.tomshardware.com/charts/2012-vga-gpgpu/13-Ba...
July 13, 2012 6:38:30 PM

cepheid said:
I can promise you the human eye can't process any information above 65 fps. You can see up to around 50, but after that it's peripheral vision until about 65. No stimulus information will be processed over that, and no reflexes to sound and noise (superior and inferior colliculi) will register any faster. The only reason to run 120 fps is if you need to go 3D or you just need the ego boost. Whatever the case, any single high-end card will require low-medium settings to reach 120 fps.

Below is a test at High settings, with each card only getting about 97 fps. Assuming you overclock, you'll get another 6 fps or so. If you have a larger monitor, prepare for a drop of about 20 fps. Also, it is easier to pick out specific objects on lower settings, because the picture is clearer and more defined, making the information much more salient than playing with all the occlusion and extra contrast from textures. You also don't have to worry about a lot of the weird movement often put in the game for realism.

http://www.tomshardware.com/charts/2012-vga-gpgpu/13-Ba...

I'm sorry, but you're wrong. Have you ever seen a 120 Hz monitor in person? I just got one, and I can assure you that you can see a difference; 120 frames is so much smoother. I prefer 3D to 120 frames, but 120 frames is still amazing.
July 13, 2012 6:40:32 PM

OK, well, you may think you can, but neuroscience says otherwise. Eat all the placebos you want, hombre.

Best solution

July 13, 2012 6:54:20 PM

It's not what you 'see' at higher frames, but what you 'feel' in the responsiveness of the controls.

BF3 significantly favors the 670 and 680 over the 7970. Even the new 7970 GHz Edition with the 12.7 drivers loses to a 670 at stock settings in BF3. This one's a no-brainer. It looks to me like the 670 will keep up with the 680 at lower settings and is cheaper. As a pro gamer, however, I would recommend you get the most card you can afford (i.e. the 680):


http://www.anandtech.com/show/6025/radeon-hd-7970-ghz-e...
July 13, 2012 6:55:22 PM

The best frame rates are those that match up with the firing of your optic nerve's action potentials. You essentially get on the right wavelength and frequency, and you see better. You might see better at 85 fps or 65 fps than at 120 fps. This is why people get headaches after long hours of gaming. It's also why people buy those stupid glasses designed to mitigate the frequency. Due to the refractory period after one of your neurons fires (rods, cones, bipolar cells, etc.), you cannot retain any information cognitively over approximately 55-65 fps, and anything above that is simply a crapshoot on seeing better, which depends on the environment. So no, the 120 Hz monitor really doesn't make a difference. What you "feel" is better because you have a 120 Hz monitor, but unless it's specifically synced with the lighting and environment, it won't provide a better anything. But that's just how the fun world of science sees it; maybe they'll prove it wrong in the future. Until then, by all means embrace the beauty of the 120 Hz monitor.
July 13, 2012 7:02:09 PM

Dude, I just bought a 120 Hz monitor, and I am telling you that there is a difference. I'll ask again: have you ever experienced one in person? If not, then maybe you should before you say that there is no difference between 60 and 120 frames.

OP, a single 670 should work fine, though I can't guarantee 120 frames 100% of the time. I just got a 120 Hz monitor with the Nvidia 3D Vision kit. I run a 2500K @ 4.4 GHz and a reference GTX 680 at stock clocks. When playing Battlefield 3 on 64-player multiplayer maps at 1080p, I manage to get 60 frames most of the time when running in 3D, which reduces frames by 40-60%. Note, though, that this is at mostly high settings, with mesh on low, motion blur off, no HBAO or SSAO, and no AA. So if you run all settings on low with that EVGA FTW 670, which should be as fast as my 680, you should get 120 frames almost all the time. However, there will always be times when you drop below that, like when there are a ton of explosions or a lot of stuff happening. It's just the way it is; I'm sure you know that with a game like this it's simply not possible to be at 120 frames 100% of the time.

Best of luck with running the game, and playing it competitively!
July 13, 2012 7:06:04 PM

Not only have I seen one in person, I have one in the room adjacent to the one I'm in now. If you say it's better, then it's better for you. I'm just saying that research into human cognition and neuropsychology says otherwise. If you feel a difference, then you have the right monitor for you. *shrug*
July 13, 2012 7:17:01 PM

Okay, well, maybe certain people don't notice it as much as others; perhaps it has to do with how good your vision is, but I don't know. I'm not a neurologist, I just have a lot of experience with technology. I know that with 3D, certain people can't see it. They suffer from something called stereo blindness, which happens when someone has good vision in only one eye and makes viewing 3D content difficult or impossible. Maybe certain conditions interfere with seeing 120 frames effectively.

But I know some people can, for sure. Before I bought my monitor, I read many accounts of them on tech sites. Almost all of them said that after you see 120 frames, you will never want to go back. As soon as I set the refresh rate to 120 Hz, I could see the increased smoothness just from moving icons around on the desktop. I even showed my dad, who is not experienced with technology at all, the difference between 60 Hz and 120 Hz, and he agreed that at 120 Hz things were much smoother.

But regardless of what we think, the OP can tell the difference between 60 and 120 frames and needs that to benefit him when playing competitively. For that, I think a 670 will do the job; a 680 will do it better, but will cost a bit more.
July 13, 2012 7:19:55 PM

trogdor796 said:
Okay, well, maybe certain people don't notice it as much as others; perhaps it has to do with how good your vision is, but I don't know. I'm not a neurologist, I just have a lot of experience with technology. I know that with 3D, certain people can't see it. They suffer from something called stereo blindness, which happens when someone has good vision in only one eye and makes viewing 3D content difficult or impossible. Maybe certain conditions interfere with seeing 120 frames effectively.

But I know some people can, for sure. Before I bought my monitor, I read many accounts of them on tech sites. Almost all of them said that after you see 120 frames, you will never want to go back. As soon as I set the refresh rate to 120 Hz, I could see the increased smoothness just from moving icons around on the desktop. I even showed my dad, who is not experienced with technology at all, the difference between 60 Hz and 120 Hz, and he agreed that at 120 Hz things were much smoother.

But regardless of what we think, the OP can tell the difference between 60 and 120 frames and needs that to benefit him when playing competitively. For that, I think a 670 will do the job; a 680 will do it better, but will cost a bit more.


Yes, the 670 will do the job for what he's looking for, and it's an amazing deal. Best purchase in years for the performance, as far as I'm concerned. Nvidia won this round hands down, imo. I hope they work out the MSI GTX 670 PE voltage unlock for AB.

I'm not saying it won't look smoother for you, as it's possible that 120 Hz is the right number for you, but a majority of people who say there's a huge difference are either a) someone with a vested interest in your purchase, or b) experiencing a placebo. I'm not trying to piss on your purchase, but I do make an attempt to remove the hype from these monitors, because the likelihood of random environmental stimuli, mixed with the monitor and the frequency of the light you're bathed in, hitting exactly your number is very slim, though possible. Thus the blanket statement that the eye can see essentially any FPS is untrue, because the eye doesn't work like that. My apologies for disrupting the conversation; it's just a personal pet peeve, being both a gamer and in the field.
July 13, 2012 7:28:19 PM

Agreed. Makes me a little sad. I bought my reference 680 the day they came out, for $500. Now you can get that FTW 670 for $400, matching the performance I get. Thanks for making early adopters feel great about their purchase, Nvidia!

Not really mad, though. I knew something like this would happen when I bought it. I just didn't expect it to be a product released by Nvidia a month later, competing with their own product that's already out, lol.
July 13, 2012 7:53:08 PM

cepheid said:
The best frame rates are those that match up with the firing of your optic nerve's action potentials. You essentially get on the right wavelength and frequency, and you see better. You might see better at 85 fps or 65 fps than at 120 fps. This is why people get headaches after long hours of gaming. It's also why people buy those stupid glasses designed to mitigate the frequency. Due to the refractory period after one of your neurons fires (rods, cones, bipolar cells, etc.), you cannot retain any information cognitively over approximately 55-65 fps, and anything above that is simply a crapshoot on seeing better, which depends on the environment. So no, the 120 Hz monitor really doesn't make a difference. What you "feel" is better because you have a 120 Hz monitor, but unless it's specifically synced with the lighting and environment, it won't provide a better anything. But that's just how the fun world of science sees it; maybe they'll prove it wrong in the future. Until then, by all means embrace the beauty of the 120 Hz monitor.


You're correct that you can't retain information beyond that rate. However, keep in mind that images rendered by a GPU are sharp and without motion blur. Essentially, when you move beyond 60 fps or so on a monitor, motion seems more fluid and "natural". After all, the motion we see in daily life is not a fast-moving flip book but fluid, connected motion. Also, just because we're stuck seeing 60 fps or so doesn't mean we cannot benefit from a higher frame rate in intense first-person shooters: a higher frame rate still gives you the possibility of seeing the enemy "earlier", especially if you're well practiced and actively exerting yourself. Keep in mind that the absolute refractory period of neurons is significantly shorter than the relative refractory period, and professional players react to motion on the screen closer to pure reflex than to actual cognitive reaction to optical stimuli.
July 13, 2012 8:35:09 PM

vmem said:
You're correct that you can't retain information beyond that rate. However, keep in mind that images rendered by a GPU are sharp and without motion blur. Essentially, when you move beyond 60 fps or so on a monitor, motion seems more fluid and "natural". After all, the motion we see in daily life is not a fast-moving flip book but fluid, connected motion. Also, just because we're stuck seeing 60 fps or so doesn't mean we cannot benefit from a higher frame rate in intense first-person shooters: a higher frame rate still gives you the possibility of seeing the enemy "earlier", especially if you're well practiced and actively exerting yourself. Keep in mind that the absolute refractory period of neurons is significantly shorter than the relative refractory period, and professional players react to motion on the screen closer to pure reflex than to actual cognitive reaction to optical stimuli.


As I've said, the human eye does not see in FPS. It doesn't matter how "trained" you think you are; you have as equal a chance of seeing the enemy "earlier" as the other guy, since it depends on the wave frequency and the light matching your optic potentials. My guess about the popular fallacy is that people assume a higher frequency must be more likely to hit the optic nerve at exactly the right time. That isn't true. Also, the advantage most gamers have regarding reaction time relates mostly to the bipolar cells and contrast: they have an "eye" for setting up their systems to provide high-contrast gaming.

The benefit of gaming above 60 (and I mean just above, 70-80 at best, and you'd have to be playing on a huge ******* screen) is the likelihood that you can catch peripheral information from the sides of your eyes. That certainly helps gamers. But once again, beyond a certain frequency it doesn't matter as much. You catch peripheral information when it hits the cornea and lens because of the refraction, and a gamer with excellent control of his reflexes can make use of that. I'd love to think that 120 Hz is amazing and I'm wrong, but all the information I've seen contrasts with the testimonies and praises of the 120 Hz monitor. You'd also still have environmental factors unless you're wearing goggles. Maybe if I feng shui my office I'll get it. :D 

I'm not saying don't get one. I'm not saying don't enjoy what you have. I'm simply giving you what I know about the human eye. I am also the proud owner of a 120 Hz monitor. I could also be wrong, as I have been many times before.
July 13, 2012 9:08:47 PM

cepheid said:
As I've said, the human eye does not see in FPS. It doesn't matter how "trained" you think you are; you have as equal a chance of seeing the enemy "earlier" as the other guy, since it depends on the wave frequency and the light matching your optic potentials. My guess about the popular fallacy is that people assume a higher frequency must be more likely to hit the optic nerve at exactly the right time. That isn't true. Also, the advantage most gamers have regarding reaction time relates mostly to the bipolar cells and contrast: they have an "eye" for setting up their systems to provide high-contrast gaming.

The benefit of gaming above 60 (and I mean just above, 70-80 at best, and you'd have to be playing on a huge ******* screen) is the likelihood that you can catch peripheral information from the sides of your eyes. That certainly helps gamers. But once again, beyond a certain frequency it doesn't matter as much. You catch peripheral information when it hits the cornea and lens because of the refraction, and a gamer with excellent control of his reflexes can make use of that. I'd love to think that 120 Hz is amazing and I'm wrong, but all the information I've seen contrasts with the testimonies and praises of the 120 Hz monitor. You'd also still have environmental factors unless you're wearing goggles. Maybe if I feng shui my office I'll get it. :D 

I'm not saying don't get one. I'm not saying don't enjoy what you have. I'm simply giving you what I know about the human eye. I am also the proud owner of a 120 Hz monitor. I could also be wrong, as I have been many times before.


As someone with a 120hz monitor, I see 3 benefits from this:
1) Latency. The time between moving my mouse and seeing the response on the screen is reduced. It's not something you see, but something you feel. At 60 FPS and below, I get motion sickness if I play for more than 30 minutes; at 80+ FPS, the motion sickness goes away.
2) Tracking targets is smoother. When you turn, you notice a major difference in smoothness between 60 Hz and 120 Hz if you have the FPS to back it up.
3) 3D gaming, which isn't an issue here.

After getting used to 120 Hz, I also notice that 60 FPS looks stuttery. It's not horrible, but it is something I notice now, even though I didn't feel that way before I got used to 120 Hz. Whenever a game or setting gets messed up and drops my fps into the 60s or below, I notice it and know to go look at my settings. Often it's a result of my CPU losing its OC.
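The tracking-smoothness point is easy to put rough numbers on: during a pan, each refresh moves the image by (pan speed / refresh rate) pixels, and bigger jumps read as judder. The 1200 px/s pan speed below is a made-up figure, purely for illustration:

```python
# Rough illustration of tracking smoothness: at a fixed on-screen pan
# speed, a lower refresh rate means larger position jumps per refresh.

def step_px(pan_px_per_s: float, hz: float) -> float:
    """On-screen displacement between consecutive refreshes, in pixels."""
    return pan_px_per_s / hz

pan = 1200.0  # px/s, an illustrative turn speed
for hz in (60, 120):
    print(f"{hz} Hz: {step_px(pan, hz):.1f} px jump per refresh")

# Doubling the refresh rate halves the per-refresh jump, which is why
# a tracked target looks noticeably smoother at 120 Hz.
```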
July 13, 2012 9:24:18 PM

My take from these articles (below) is that higher FPS is better, particularly for someone whose perception has been honed and acclimated by high-FPS first-person shooters. The human eye can perceive much more than we give it credit for, and the issue is complex. There is no standard "here is the most frames per second we can perceive".

In addition to the issue of "seeing" 120 frames per second, the video card and the PC itself run at a higher level of responsiveness at 120+ FPS than at 60 FPS. Regardless of our ability to visually perceive it, games play faster and controls respond more quickly when FPS is maxed out.
http://www.tweakguides.com/Graphics_5.html
http://amo.net/NT/02-21-01FPS.html
http://www.100fps.com/how_many_frames_can_humans_see.ht...
July 13, 2012 9:53:20 PM

17seconds said:
My take from these articles (below) is that higher FPS is better, particularly for someone whose perception has been honed and acclimated by high-FPS first-person shooters. The human eye can perceive much more than we give it credit for, and the issue is complex. There is no standard "here is the most frames per second we can perceive".

In addition to the issue of "seeing" 120 frames per second, the video card and the PC itself run at a higher level of responsiveness at 120+ FPS than at 60 FPS. Regardless of our ability to visually perceive it, games play faster and controls respond more quickly when FPS is maxed out.
http://www.tweakguides.com/Graphics_5.html
http://amo.net/NT/02-21-01FPS.html
http://www.100fps.com/how_many_frames_can_humans_see.ht...


There used to be engines that ran faster based on FPS (Unreal), but those days are long gone. How does higher FPS correlate with a faster game? The amo.net guy seems to sort of get it, but he in no way explained how 120 fps was possible; he simply gave some information on cones and rods and explained what frequency is. Yes, please ask a neuropsychologist, or go on Google Scholar and hit up the journals.

"Even though single eye cells (rods and cones) may have their limitations due to their chemical reaction times and due to the distance to the brain, you cannot be sure how they interact or complement or synchronize. If 1 cell is able to perceive 10fps, 2 cells may be able to perceive 20fps by complementing one another. So don't confuse "The human eye" with "The cell".

Except we do have a good idea of how they work, as we can map the activity at a cellular level as well as at a macro level; the rest is math. We also know that as you use a cell, the amount of information you can retain diminishes as you lose surface area, and the eye has a finite number of cells. We do know that specific wavelengths stimulate parvocellular cells, but as I said, that is as likely to happen at 70 Hz as at 120 Hz. This is why some people get headaches from certain refresh rates (lower than 50, it's bound to happen). There has been no blind test in which there was a reported significant difference between 75 Hz and 120 Hz. I believe the last round of cognition tests gave approximately 50 fps for information retention. Whether or not we know everything about the actual cell, we do know the rate of the action potentials and the refractory periods, and the fact that light is being focused. You say there's a difference, and I say there's not enough significant evidence of it. These articles essentially explain exactly what I typed earlier, without any conclusive or even rational conclusions other than the fact that they don't know. You can argue that because a theory hasn't been disproved it's still possible, and you'd be right, but that doesn't mean people haven't already researched it and gathered data that pretty much points to the contrary. I agree, though: go ask an optometrist or a cognitive psychologist/neuroscientist.


When the guy from (I think) amo.net says we don't know how the brain processes visual information in V1, the optic tectum, or any other part of the brain, he's pretty much wrong. While there are many things not known about the human brain, quite a bit of work has been done on visual stimuli.
July 13, 2012 9:59:34 PM

cepheid said:
There used to be engines that ran faster based on FPS (Unreal), but those days are long gone. How does higher FPS correlate with a faster game? The amo.net guy seems to sort of get it, but he in no way explained how 120 fps was possible; he simply gave some information on cones and rods and told us what frequency was. Yes, please ask a neuropsychologist, or go on Google Scholar and hit up the journals.

"Even though single eye cells (rods and cones) may have their limitations due to their chemical reaction times and due to the distance to the brain, you cannot be sure how they interact or complement or synchronize. If 1 cell is able to perceive 10fps, 2 cells may be able to perceive 20fps by complementing one another. So don't confuse "The human eye" with "The cell".

Except we do have a good idea of how they work, as we can map the activity on a cellular level. We also know that as you use a cell, the amount of information you can retain diminishes as you lose surface area. The eye has a finite number of cells. We do know that specific wavelengths stimulate parvocellular cells, but as I said, the threshold is as likely to be at 70 Hz as 120 Hz. This is why some people get headaches from certain refresh rates (lower than 50 Hz, it's bound to happen). There's been no blind test in which there was a reported significant difference between 75 Hz and 120 Hz. Whether or not we know everything about the individual cell, we do know the rate of the action potentials and refractory periods, and the fact that light is being focused. You say there's a difference, and I say there's not enough significant evidence of it. These articles essentially explain exactly what I typed earlier without any conclusive or even rational conclusions, other than the fact that they don't know. You can argue that because a theory hasn't been disproved it's possible, and you'd be right, but that doesn't mean people haven't already started researching it and gathered data that pretty much points to the contrary. I agree though: go ask an optometrist or a cognitive psychologist/neuroscientist.


Here is something for you to consider. When you watch a movie, or even a cut-scene in a game that has no motion blur, you'll find that 30 FPS feels smooth and will give no one a headache, as far as I know. However, playing a game at 30 FPS will give many people simulator sickness, which takes the form of nausea or headaches. Visually there is no difference, but the interactive aspect of gaming makes 30 FPS feel unplayable. This is due to latency: your body/mind will notice that your movements of the mouse have a slight delay with what it sees on the screen, and that will cause issues for many people.

This is something that is not considered when it comes to 120 Hz gaming, and it has transformed gaming for me. I used to get sick gaming; I do not now. Others get headaches, I get nausea, but it comes from the same issue: latency.

Edit: and don't forget, the mind doesn't make images; it monitors changes. When there isn't a lot of fast motion, low FPS is enough to feel smooth, but even at 60 Hz, if you turn 360 degrees fast, you will see a choppy scene, whereas at 120 FPS and 120 Hz it looks smoother. Try your favorite game, turn around 360 degrees, and tell me whether you notice individual images rather than smooth panning.
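To put rough numbers on the 360-degree-turn example above, here is a sketch with assumed values: the half-second turn duration is illustrative, not a measurement.

```python
# How far the view jumps between consecutive frames during a fast
# 360-degree turn. The turn duration (0.5 s) is an assumed example value.

TURN_DEGREES = 360.0
TURN_SECONDS = 0.5  # a quick flick, chosen purely for illustration

for fps in (30, 60, 120):
    frames_in_turn = fps * TURN_SECONDS
    degrees_per_frame = TURN_DEGREES / frames_in_turn
    print(f"{fps:>3} fps: {degrees_per_frame:.1f} degrees between frames")
```

At 60 fps the scene jumps 12 degrees per frame during such a turn, while at 120 fps it jumps only 6, which is one plausible reason fast panning looks choppier at lower frame rates.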
July 13, 2012 10:09:07 PM

bystander said:
Here is something for you to consider. When you watch a movie, or even a cut-scene in a game that has no motion blur, you'll find that 30 FPS feels smooth and will give no one a headache, as far as I know. However, playing a game at 30 FPS will give many people simulator sickness, which takes the form of nausea or headaches. Visually there is no difference, but the interactive aspect of gaming makes 30 FPS feel unplayable. This is due to latency: your body/mind will notice that your movements of the mouse have a slight delay with what it sees on the screen, and that will cause issues for many people.

This is something that is not considered when it comes to 120 Hz gaming, and it has transformed gaming for me. I used to get sick gaming; I do not now. Others get headaches, I get nausea, but it comes from the same issue: latency.


A 24 fps movie is fine, but 50 Hz is likely to make you ill? Yes, that's what I've been trying to explain: it's not about FPS.

Yes, some people react poorly to certain wavelengths, especially at low levels. It happens more at low levels because it's very noticeable on both a cognitive level and a visual level. Now, outside of that, even above 60 fps people have to find the right frequency for themselves on their monitor. Some people don't like 75 Hz; some people get sick at 120 Hz. The point is that it's not the frame rate, it's the frequency that makes the difference. Your eyes may prefer 110-120 Hz, and maybe they would also prefer a ratio of [K] down to 80 Hz. If that matches with biology, then you would be fine at either of those frame rates. Now, because essentially anything over 60 fps is non-cognitive, you shouldn't notice a difference so long as you've found something you like. I'm not saying some people don't love 120 Hz; I'm saying they don't love it for the right reasons. It's not about the FPS, and the sooner people realize that, the better they can design a system for themselves.

If it works for you, go with it. I'm not against better technology and sweet gameplay; I'm just against people not understanding. I'm saying 120 Hz is not a magical number that suddenly makes your gaming better.
July 13, 2012 10:14:30 PM

cepheid said:
Yes, some people react poorly to certain wavelengths, especially at low levels. It happens more at low levels because it's very noticeable on both a cognitive level and a visual level. Now, outside of that, even above 60 fps people have to find the right frequency for themselves. Some people don't like 75 Hz; some people get sick at 120 Hz. The point is that it's not the frame rate, it's the frequency that makes the difference. Your eyes may prefer 110-120 Hz, and maybe they would also prefer a ratio of [K] down to 80 fps. If that matches with biology, then you would be fine at either of those frame rates. Now, because essentially anything over 60 fps is non-cognitive, you shouldn't notice a difference so long as you've found something you like. I'm not saying some people don't love 120 Hz; I'm saying they don't love it for the right reasons. It's not about the FPS, and the sooner people realize that, the better they can design a system for themselves.

If it works for you, go with it. I'm not against better technology and sweet gameplay; I'm just against people not understanding.


Now I call BS on this. Every article I've read on the subject has made it very clear that we do not see in frames; we just notice changes.

Now try to explain this in your wavelength theory.

Watch a cut-scene in a game that runs at 30 FPS (some run at 50, some at 60, so you can try different games for different experiences). I personally will never feel discomfort from any cut-scene or movie unless it drops below 24 fps.

The moment you add mouse controls, all bets are off. The same fluid scene, when I control the movement, becomes a nightmare to my stomach. It's the same "wavelength", yet it's very different when you control it. This is latency between movement and how fast it gets drawn to the screen, not wavelengths.
July 13, 2012 10:21:14 PM

bystander said:
Now I call BS on this. Every article I've read on the subject has made it very clear that we do not see in frames; we just notice changes.

Now try to explain this in your wavelength theory.

Watch a cut-scene in a game that runs at 30 FPS (some run at 50, some at 60, so you can try different games for different experiences). I personally will never feel discomfort from any cut-scene or movie unless it drops below 24 fps.

The moment you add mouse controls, all bets are off. The same fluid scene, when I control the movement, becomes a nightmare to my stomach. It's the same "wavelength", yet it's very different when you control it. This is latency between movement and how fast it gets drawn to the screen.


OK, now go back and read what I wrote. We DO NOT see in FPS; we see wavelengths of light. Once again, you're talking below the 60-65 fps mark. FPS really only relates to retention of information on a conscious level. What I'm talking about is sensory information. Your monitor may get 120 fps and only be 85 Hz. The FPS is completely irrelevant to the salience of sensory information and doesn't matter above the approximately 60 fps mark. The frequency is what affects your sensory information, and it is not based on fps. So the 120 fps figure is irrelevant, as tests have shown the human eye doesn't retain cognitive information above approximately 60 fps. Sensory information can retain stimulation, but it's based solely on the frequency, not the fps.

Not all information is alike, not all nerves/neurons are alike.
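The fps-versus-refresh distinction being argued here can be sketched with simple arithmetic. The 120 fps / 85 Hz figures come from the post above; the model is an assumption-laden simplification that ignores vsync and frame pacing:

```python
# With vsync off, a panel can show at most one full frame per refresh;
# frames rendered beyond that are only ever partially displayed (tearing).

REFRESH_HZ = 85    # panel refresh rate from the example above
RENDER_FPS = 120   # frames the GPU produces per second

fully_shown = min(RENDER_FPS, REFRESH_HZ)
partial_only = max(0, RENDER_FPS - REFRESH_HZ)
print(f"{RENDER_FPS} fps on an {REFRESH_HZ} Hz panel: "
      f"at most {fully_shown} distinct frames shown per second, "
      f"~{partial_only} frames/s appear only as partial (torn) updates")
```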
July 13, 2012 10:29:36 PM

cepheid said:
OK, now go back and read what I wrote. We DO NOT see in FPS; we see wavelengths of light. Once again, you're talking below the 60-65 fps mark. FPS really only relates to retention of information on a conscious level. What I'm talking about is sensory information. Your monitor may get 120 fps and only be 85 Hz. The FPS is completely irrelevant to the salience of sensory information and doesn't matter above the approximately 60 fps mark.


For starters, these wavelengths you are referring to have no correlation to FPS. They determine what spectrum of light we can see. Many animals see different spectra of light because of the wavelengths their eyes pick up. This has nothing to do with the motion sickness issues people have. Most people who have these issues have them due to latency: the time between moving and how fast it is displayed. If it is too slow, our bodies will react with headaches or nausea.

Now again, the number of frames we notice has more to do with how much motion is happening. In many cases you are right, 60 FPS may be all most people can perceive, but fast movement changes that. Tests have shown we can see differences even at 1000 FPS if there is contrast and fast motion. It is all relative.

If you used a 120 Hz monitor, you may understand.
July 13, 2012 10:33:10 PM

bystander said:
For starters, these wavelengths you are referring to have no correlation to FPS. They determine what spectrum of light we can see. Many animals see different spectra of light because of the wavelengths their eyes pick up. This has nothing to do with the motion sickness issues people have. Most people who have these issues have them due to latency: the time between moving and how fast it is displayed. If it is too slow, our bodies will react with headaches or nausea.

Now again, the number of frames we notice has more to do with how much motion is happening. In many cases you are right, 60 FPS may be all most people can perceive, but fast movement changes that. Tests have shown we can see differences even at 1000 FPS if there is contrast and fast motion. It is all relative.

If you used a 120 Hz monitor, you may understand.


OK, if you say so, but I believe you're wrong, and I have read numerous journals on visual information retention. It's not all relative. The body is not an infinite machine that can do anything. Any community-college cognitive psychology teacher could tell you that the research does not support your statement above. Hit up Google Scholar or your school library and I'm sure you can read up on it.
July 13, 2012 10:45:51 PM

It's funny, people try to argue "you can't see 120 fps, why buy it?"

Once you reach 120 frames or greater, it's not about what your eye sees but about what the computer detects.

Example: you clone yourself, your PC, your resolution, and your play style. Everything is exactly identical. The only variable is that you are running at 120 fps and your clone is running at 60 fps. Why do you have a clear advantage over the 60 fps clone? Because your eyes can't see the extra fps? Like that makes sense.

No one who buys 120 fps for FPS gaming expects to "see" an improvement. The only detectable difference is what the CPU/GPU see at 120 fps vs 60 fps.

The same thing can be said for 200+ fps vs 120 fps. Clear advantages.

Example time again: Black Ops at 60 fps vs Black Ops at 120 fps. Black Ops at 120 fps is much more fun to play because I feel like I have an easier time killing people, because I do. Simple. Even without a 120 Hz monitor, running the game at 120 fps makes a difference. That's something I have noted for years now with many FPS games.

People will argue "you don't see it, so it's not there" or "you're just going off what you feel, so it's not true."

Well, I guess then this whole thread was for nothing. You are right, we are wrong; goodbye, move on with your life, while we stay caring about getting an imaginary fps to get imaginary improvements in FPS gaming.

Am I dreaming, or is everyone as sheltered as cepheid?

How is it that only 3 people in this thread think 120 fps gaming makes a difference? It's been at least 7 years since I have known this and I still can't find a large following. Probably because most people can't afford systems that can run that high, and maybe because we have actually had the chance to test it.

Have you had a chance to test any 120 fps vs 60 fps gaming? Or, to a further extent, 200+ fps vs 120 fps vs 60 fps vs 30 fps?

I'm sorry, but just because you haven't ever heard of it or seen it for yourself doesn't make you correct and everyone else wrong; in fact, it makes you blind to the truth.

The truth being that 120 fps is the future for professional gamers. 60 fps is more than enough for your casual FPS gamers, but truly dedicated gamers aim for higher than 60 fps, not for the sake of their vision, for the sake of their gaming.
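One way to make the "it's about what the computer detects" argument concrete is latency arithmetic. This is a deliberately simplified sketch: it counts only the render interval plus a worst-case wait for the next refresh, and ignores engine, driver, and display delays, so the numbers are illustrative rather than measured:

```python
# Simplified worst-case "input age" of the frame on screen: one render
# interval (the input was sampled at the start of the frame) plus up to
# one refresh interval of waiting before scan-out. Illustrative only.

REFRESH_HZ = 60  # assumed 60 Hz monitor, as in the 120-fps-on-60-Hz claim

for fps in (30, 60, 120, 240):
    frame_ms = 1000.0 / fps
    worst_case_ms = frame_ms + 1000.0 / REFRESH_HZ
    print(f"{fps:>3} fps on a {REFRESH_HZ} Hz panel: "
          f"worst-case input age ~{worst_case_ms:.1f} ms")
```

Even on a 60 Hz panel, rendering at 120 fps halves the render-interval part of the delay (16.7 ms down to 8.3 ms) in this model, which is consistent with the "feels better even without a 120 Hz monitor" observation, though the model is crude.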
July 13, 2012 10:52:20 PM

akamrcrack said:
It's funny, people try to argue "you can't see 120 fps, why buy it?"

Once you reach 120 frames or greater, it's not about what your eye sees but about what the computer detects.

Example: you clone yourself, your PC, your resolution, and your play style. Everything is exactly identical. The only variable is that you are running at 120 fps and your clone is running at 60 fps. Why do you have a clear advantage over the 60 fps clone? Because your eyes can't see the extra fps? Like that makes sense.

No one who buys 120 fps for FPS gaming expects to "see" an improvement. The only detectable difference is what the CPU/GPU see at 120 fps vs 60 fps.

The same thing can be said for 200+ fps vs 120 fps. Clear advantages.

Example time again: Black Ops at 60 fps vs Black Ops at 120 fps. Black Ops at 120 fps is much more fun to play because I feel like I have an easier time killing people, because I do. Simple. Even without a 120 Hz monitor, running the game at 120 fps makes a difference. That's something I have noted for years now with many FPS games.

People will argue "you don't see it, so it's not there" or "you're just going off what you feel, so it's not true."

Well, I guess then this whole thread was for nothing. You are right, we are wrong; goodbye, move on with your life, while we stay caring about getting an imaginary fps to get imaginary improvements in FPS gaming.

Am I dreaming, or is everyone as sheltered as cepheid?

How is it that only 2 people in this thread (me and OP) think 120 fps gaming makes a difference? Hmm, maybe because we have actually had the chance to test it.

Have you had a chance to test any 120 fps vs 60 fps gaming? Or, to a further extent, 200+ fps vs 120 fps vs 60 fps vs 30 fps?

I'm sorry, but just because you haven't ever heard of it or seen it for yourself doesn't make you correct and everyone else wrong; in fact, it makes you blind to the truth.

The truth being that 120 fps is the future for professional gamers. 60 fps is more than enough for your casual FPS gamers, but truly dedicated gamers aim for higher than 60 fps, not for the sake of their vision, for the sake of their gaming.


I do have a 120 Hz monitor, running at 75 Hz. People think that FPS made a gaming difference because of the first Unreal engine. That's not the case with current games. You're jumping to conclusions that are false. By sheltered, do you mean having participated in hands-on research on the subject? I'm going to assume that you research the way you read threads, meaning not at all. Many people have come onto the forums and expressed their enjoyment of 120 Hz monitors, and I'm sorry you feel the need to try to turn this into a character debate over my upbringing, so I'll be stepping out of the conversation now.
July 13, 2012 11:12:44 PM

cepheid said:
I do have a 120 Hz monitor, running at 85 Hz. People think that FPS made a gaming difference because of the first Unreal engine. That's not the case with current games. You're jumping to conclusions that are false. By sheltered, do you mean having participated in hands-on research on the subject? I'm going to assume that you research the way you read threads, meaning not at all. Many people have come onto the forums and expressed their enjoyment of 120 Hz monitors, and I'm sorry you feel the need to try to turn this into a character debate over my upbringing, so I'll be stepping out of the conversation now.


Running it at 85 Hz doesn't equal 120 fps.

Which is what my argument is about: 120 frames per second.


Yes, 120 Hz monitors are nice, but if you aren't running yours at 120 Hz for first-person-shooter gaming, then you are doing it wrong.
July 13, 2012 11:49:30 PM

cepheid said:
I do have a 120 Hz monitor, running at 85 Hz. People think that FPS made a gaming difference because of the first Unreal engine. That's not the case with current games. You're jumping to conclusions that are false. By sheltered, do you mean having participated in hands-on research on the subject? I'm going to assume that you research the way you read threads, meaning not at all. Many people have come onto the forums and expressed their enjoyment of 120 Hz monitors, and I'm sorry you feel the need to try to turn this into a character debate over my upbringing, so I'll be stepping out of the conversation now.


The fact that at 30 FPS I get motion sickness instantly, at 40 FPS I can handle a few minutes before it sets in, at 50 FPS I can last maybe 20 minutes, and at 60 FPS up to 30 minutes before getting sick, while at 80-90 FPS the motion sickness completely goes away, tells me that there is a difference. I also do not like screen tearing, which means I need a 120 Hz monitor to reach a point where I do not experience motion sickness.

One telling thing is that I experience zero motion sickness if I'm not controlling what is happening on the screen. It only happens when I'm controlling the game. This is why it has to be the latency, and not the actual frequency.

Whether I notice a difference in smoothness or not, the nausea is very apparent. And while I didn't notice a big difference in smoothness when I first got the 120 Hz monitor, I now instantly notice when my monitor goes into 60 Hz mode. I'm assuming your body can get used to one way, and it notices change.

This wavelength stuff you are talking about is about what colors we see. Due to the range of wavelengths we can see, we can't see things like infrared or ultraviolet, while other animals/insects can. It has nothing to do with motion sickness.

And while playing an isometric game you won't notice much difference, and if you are playing a slow-paced first-person game where you don't turn much, you may not notice much difference either, but as soon as you make fast motions, you do.
July 14, 2012 12:43:21 AM

I would think progressive or interlaced framing would be the cause of motion sickness, though I'm not sure if interlacing is still being used. I would think motion sickness has more to do with progressive framing. You could probably run at 60 Hz with interlacing and still be perfectly fine. It's not just color I'm talking about: lateral inhibition is a bipolar-cell function unrelated to color, and rods have little to do with color.
July 14, 2012 12:50:11 AM

cepheid said:
I would think progressive or interlaced framing would be the cause of motion sickness. I would think motion sickness has more to do with progressive framing. You could probably run at 60 Hz with interlacing and still be perfectly fine. It's not just color I'm talking about: lateral inhibition is a bipolar-cell function unrelated to color, and rods have little to do with color.


How on earth could you come to that conclusion? Just to put your mind at ease: interlacing is just horrible, all the time, except in the case of movies on a TV at a distance.

It gets progressively worse the lower the FPS goes. This tells me it is directly related to FPS, and not to a specific frequency. Do you know what else increases when FPS goes down? The latency between your movements and what you see.

And in case you haven't read this bit of information yet: unless I am controlling the action on the screen with a mouse, there is never any motion sickness. It is only when I control the action that motion sickness becomes part of the equation.

There is only one thing in this equation that can cause that: latency.
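The latency claim above is easy to quantify at the most basic level: the minimum delay between an input and the next rendered frame is one frame time, which grows quickly as FPS drops. This sketch counts only the render interval; real input-to-photon latency adds several other stages.

```python
# Minimum render-interval delay at various frame rates: the newest frame
# can reflect an input at best one frame time after it happened.

for fps in (30, 40, 50, 60, 90, 120):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> at least {frame_ms:.1f} ms from input "
          f"to the next rendered frame")
```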
July 14, 2012 12:50:38 AM

cepheid said:
I would think progressive or interlaced framing would be the cause of motion sickness, though I'm not sure if interlacing is still being used. I would think motion sickness has more to do with progressive framing. You could probably run at 60 Hz with interlacing and still be perfectly fine. It's not just color I'm talking about: lateral inhibition is a bipolar-cell function unrelated to color, and rods have little to do with color.


Someday, when we finally replace the optic nerve with optical fibers, we'll be able to "see" in 1000 fps :bounce:  though our brains may not be able to comprehend the full glory of it :D 
July 14, 2012 12:58:29 AM

Just get rid of the eye now; bitches love the eye patch. :sol:  Sadly, bionic eyes with optical fibers will probably be around before my condo gets FiOS.
July 14, 2012 1:44:46 AM

If you want a more powerful setup, then think of SLI. If you SLI a couple of 670s, you are in a fairly future-proof situation.
If you don't like SLI, then go for the 680. They can easily be overclocked with Afterburner.
I have the EVGA GTX 670 FTW. I gave it a slight overclock to make sure it was going to run at 60 FPS. It will do up to 120 FPS in Skyrim with VSync turned off. There was no tearing at all, but the wildlife would bounce and spawn in the sky, so I turned it back on. My monitor only does 60 Hz, so that is good for me. BF3 does about 50 to 60 FPS as well.
July 28, 2012 7:18:57 PM

Best answer selected by devjeetroy.
September 21, 2012 1:41:50 PM

cepheid said:
OK, well, you may think you can, but neuroscience says otherwise. Eat all the placebos you want, hombre.



Hmm, there is no need to start name-calling here. I can also vouch for this, and there is an easy test to prove that you can see more than 60 Hz.

Take any old CRT monitor and set it to 50 Hz. Does it flicker visibly? Yes. Now look slightly above the monitor: is the flicker even more irritating? Yes.

The reason for this is that the eye has more light-sensitive receptors around the edges and more colour receptors in the middle.
If you go out into the dark at night, you can actually see more detail in the corners of your eyes, and when you look straight at something you might no longer see what you had seen from the corner. Even so, the middle of the eye can still see over 65 Hz.

Now switch the monitor to 85 Hz, and the flicker lessens until it is almost imperceptible. To many it is perfect. A trained eye, however, can pick it up, and long-term use of those old CRTs, even at 85 Hz, caused eye strain and headaches. It is quite perceivable if you look above the monitor. Over 100 Hz it becomes very hard to notice.
I can still manage to tell the difference between 100 and 120 Hz, but it becomes tricky over 100.

I used to do this all the time in the old days, when I used large professional monitors for development, to the point that I would immediately notice 85 Hz monitors when I had to use them.

LCDs don't turn black during each frame refresh, and the flicker is normally nonexistent: at 60 Hz there is reasonably smooth motion, but going over 60 will definitely make a difference in smoothness, provided a) your graphics card can handle it, and b) your monitor's pixel response is fast enough. Pixels often draw minuscule trails even these days, and perhaps that would detract from the value of a 120 Hz monitor.

Also, if you did physics at university, you may have been shown an experiment that uses a strobe-light apparatus that lights up the room for a millionth of a second at a time.
Set the frequency to around 50 flashes a second and the room seems completely lit, with a hint of irritating flicker. Change it to over 100 and the flicker becomes practically imperceptible. Your 65 Hz neural-path theory needs citation and revision.
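The CRT flicker test above comes down to the dark gap between phosphor flashes. Here is a rough sketch, assuming an illustrative 1 ms of effective phosphor glow per refresh (real persistence varies by phosphor type):

```python
# Dark interval between flashes on a CRT-style display: each refresh
# lights a spot only briefly, so lower refresh rates leave longer gaps.

GLOW_MS = 1.0  # assumed effective phosphor persistence (illustrative)

for hz in (50, 60, 85, 100, 120):
    period_ms = 1000.0 / hz
    dark_ms = period_ms - GLOW_MS
    print(f"{hz:>3} Hz: {period_ms:.1f} ms per refresh, "
          f"~{dark_ms:.1f} ms dark between flashes")
```

This is also why LCDs behave differently, as noted above: a backlit LCD pixel stays lit between refreshes, so there is no comparable dark interval to flicker.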