Is this REALLY necessary?

Joshua_81

Honorable
May 18, 2016
Is 144 Hz REALLY necessary?

Because I use 66 Hz (max OC) and it looks fine!

I need to know if there is a jaw-dropping difference between 60/66 Hz and 144 Hz,

or if it's really just kind of "eh, not so good".
 

SoraDaXplorer

Commendable
Sep 12, 2016


Do you think a GTX 1060 is worth it for a 144 Hz 1080p monitor?
 

SoraDaXplorer

Commendable
Sep 12, 2016


Well, I already have the 1060, and the 144 Hz monitor I'm getting is only $120, so I guess it's a good deal. I'm planning on upgrading my PC in the future anyway.
 

cilliers

Honorable
Jul 13, 2012
Your question is extremely relevant and perhaps one of the hottest topics lately. It is also a very subjective matter at the moment, because some gamers claim that 30 fps is "enough" for them, while others can't wait to go beyond the current consumer-grade maximum of 144 Hz monitors. Let's discuss some boring theory first:

From a physiological point of view, the theoretical rate at which humans can visually perceive their surroundings is about 1000 Hz. This is the maximum speed at which human nerve impulses can send information to the brain. However, most humans operate at 300-400 Hz on average. So one could argue that there would never be any reason for screens to go beyond 1000 Hz (1 kHz). Having said that, remember that the computer hardware would have to be capable of rendering the graphics at 1000 fps or more for you to benefit from a 1 kHz screen, so these two technologies would have to develop in parallel. Ref: http://www.noteaccess.com/APPROACHES/ArtEd/ChildDev/1cNeurons.htm
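
To make the frame-budget side of that argument concrete, here is a quick sketch (plain Python, with the refresh rates picked only as examples from this thread plus 1 kHz) that converts a refresh rate into the time the GPU gets to render each frame; a 1 kHz screen only pays off if every frame fits its budget:

# Rough sketch: the per-frame render budget implied by a refresh rate.
# The refresh rates listed are just examples, not a claim about hardware.

REFRESH_RATES_HZ = [30, 60, 66, 75, 120, 144, 1000]

def frame_budget_ms(refresh_hz: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in REFRESH_RATES_HZ:
    print(f"{hz:>5} Hz -> {frame_budget_ms(hz):6.2f} ms per frame")

At 144 Hz that budget is about 6.9 ms per frame; at 1 kHz it shrinks to 1 ms, which is why the screen and the GPU would have to advance together.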

Taking a stab at your original question, let's discuss some background: I've been gaming since 1997. Back then we had horrible CRT screens that typically operated at 60 Hz and doubled as a heater in your room. They gave us headaches, because after a CRT finishes firing at every pixel on the screen, it has to blank before it starts again, ergo "flicker". LEDs and LCDs don't have to do this blanking, which is why they are easier on the eyes. Back in 1998, the fps of a game was starting to become an important metric because of releases such as Quake 2 on OpenGL, Direct3D and 3DFX. We were typically gaming on 60 Hz CRTs at 60 fps if you were lucky enough to have a 3D card. This was insanely cool and indeed a technological revolution in gaming. I would say that most gaming PCs were practically limited to 60 fps because of CRT limitations. Ref: http://img.tomshardware.com/us/1998/09/02/voodoo/v2-q2-640.jpg

Later (2002), I bought a Viewsonic E70f+ monitor. It was still a CRT, but it supported higher refresh rates of up to 75 Hz @ 1152 x 768. This was just fine for a couple of years, because at this stage (2002 to 2004) competition between GPU manufacturers was peaking and GPUs were generally 2x overpowered for the games we had to run on them... until the release of Doom 3 and the original Crysis, which melted PCs. So for a long time I was actually still using the E70f+, because of the growing pains that LCDs were going through. Also, the E70f+ had overbright, which actually took a lot of jaggies out of games. The LCDs of the time had horrible pixel density, and you had to take large performance hits by enabling AA and AF to make an LCD look comparable. So, until 2009, I was gaming on the E70f+ for said reasons, and it was awesome, because I had a true 75 fps / 75 Hz, compared to my CRT mates who were mostly stuck at 60 Hz.

So, in my very subjective opinion, 60 Hz LEDs are just fine for my eyes when the game is running at 60 fps. It doesn't make financial sense for me to spend the extra money on a faster-refreshing monitor and a GPU that can support it. I've gamed on 60 Hz, 75 Hz, 120 Hz and 144 Hz. In my honest opinion, the difference between 75 Hz and 144 Hz isn't half as dramatic as the difference between 30 Hz and 60 Hz; however, I must admit that gaming at a true 144 fps / 144 Hz is like a vacation for your eyes.
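
One rough way to put numbers on that diminishing-returns feeling is to look at how many milliseconds of frame time each refresh-rate jump actually shaves off. This is purely a back-of-the-envelope sketch (again plain Python, with example upgrade pairs taken from the rates mentioned in this thread), not a perceptual model:

# How much frame time each refresh-rate upgrade saves, in milliseconds.
# The absolute savings shrink quickly, which loosely matches the sense
# that 75 -> 144 Hz is less dramatic than 30 -> 60 Hz.

UPGRADES = [(30, 60), (60, 75), (66, 144), (75, 144), (60, 144)]

for old_hz, new_hz in UPGRADES:
    saved_ms = 1000.0 / old_hz - 1000.0 / new_hz
    print(f"{old_hz:>3} Hz -> {new_hz:>3} Hz saves {saved_ms:5.2f} ms per frame")

Running it shows the 30 -> 60 Hz jump saves about 16.7 ms per frame, while 75 -> 144 Hz saves only about 6.4 ms, which lines up roughly with the "not half as dramatic" impression above.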