DePlane

Honorable
Dec 2, 2012
33
0
10,530
I am getting the EVGA GTX 690, which should play any game at a steady 120 FPS (on Ultra/High settings). But if I'm only using a 60Hz monitor, I won't actually see 120 FPS because the monitor is only 60Hz, right? If that's correct, why buy such an expensive GPU when a GTX 670 can do a steady 60-70?

So, to get the most for my eyes I'll need a 120Hz monitor? But since all of them come in 3D, does that mean that I'm only getting 120Hz while in 3D and only 60 for 2D?

I'm probably totally wrong but feel free to correct me and suggest options best for me. Thanks.
 


Correct. A 60Hz display refreshes only 60 times per second regardless of how many frames the GPU draws internally to the framebuffer. A GTX 670 will handle most games (most, not all) at 1080p while maintaining 50-60 frames per second. At resolutions above 1080p, such as 2560x1440 and 2560x1600, you will need two GPUs or a dual-GPU card to play some very intensive games such as Crysis 2 and Metro 2033 at their maximum detail settings.
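As a rough sketch of why the extra frames never reach your eyes, here's a simplified model (an assumption for illustration, not how any specific driver works) where the display just grabs the most recently completed frame at each refresh tick:

```python
# Sketch: a 60Hz display shows at most 60 distinct frames per second,
# no matter how fast the GPU renders. Simplified model: the display
# samples the newest completed frame at every refresh.

def frames_actually_shown(gpu_fps: float, refresh_hz: float) -> float:
    """Distinct frames per second that actually reach the screen."""
    return min(gpu_fps, refresh_hz)

print(frames_actually_shown(120, 60))   # GTX 690 pushing 120 FPS on a 60Hz panel -> 60
print(frames_actually_shown(120, 120))  # same card on a 120Hz panel -> 120
```

So on a 60Hz panel, half of those 120 rendered frames are simply never displayed.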
 

samuelspark

Distinguished
Sep 12, 2011
2,477
0
19,960


The way most 3D monitors work is that they split the refresh rate in half, showing alternate frames to each eye to create the 3D effect. So a 120Hz panel gives you 60Hz per eye in 3D mode, but in 2D you still get the full 120Hz; you can turn 3D off when you don't need it. The main reasons people get a 690 are to futureproof, run multiple monitors, and play at high resolutions.
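A quick sketch of that split, assuming frame-sequential (active-shutter) 3D, which is the common scheme on 120Hz gaming monitors:

```python
# Sketch: frame-sequential 3D alternates frames between the left and
# right eye, so each eye sees half the panel's refresh rate. In 2D
# mode the panel runs at its full rate.

def per_eye_refresh(panel_hz: float, stereo_3d: bool) -> float:
    """Refresh rate each eye effectively sees."""
    return panel_hz / 2 if stereo_3d else panel_hz

print(per_eye_refresh(120, True))   # 3D mode: 60Hz per eye
print(per_eye_refresh(120, False))  # 2D mode: full 120Hz
```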
 
You're getting frames per second mixed up with screen refreshing. On old CRTs, the Hz number was how many times per second the tube redrew the screen. Some people can see the flicker of a screen, like a light bulb blinking.
http://www.vegasledscreens.com/faq/80-led-screens-and-refresh-rate.html
The faster the screen refreshes, the less flicker those people see.
Frames per second is the number of frames per second a game or movie produces, whether at the theater or on your computer.
http://en.wikipedia.org/wiki/Frame_rate
Most movies run at 24 frames per second, and around 30 is the usual minimum for a game to feel smooth.
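To see how those two numbers interact, here's a simplified sketch of how many display refreshes each source frame occupies, assuming the display just repeats the current frame until a new one arrives (a simplification of real pulldown and vsync behavior):

```python
# Sketch: when the source frame rate is below the refresh rate, each
# frame is held on screen for multiple refreshes. Simplified model:
# the display repeats a frame until the next one is ready.

def refreshes_per_frame(refresh_hz: float, source_fps: float) -> float:
    """Average number of refreshes each source frame stays on screen."""
    return refresh_hz / source_fps

print(refreshes_per_frame(60, 30))  # 30 FPS game on 60Hz: each frame shown twice
print(refreshes_per_frame(60, 24))  # 24 FPS film on 60Hz: 2.5 refreshes on average
```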