1440p vs 1600p HELP!!!!!!

Dropdead777

Honorable
Apr 28, 2013
Ok, I'm stuck between a YAMAKASI 301 SPARTA 30" S-IPS 2560x1600 and a QNIX QX2710 LED Evolution II 27" 2560x1440. I have 2 GTX 780 Tis in SLI, so I know the QNIX can overclock to at least 96Hz, if not 120Hz... but the 30 inch is 16:10 and the QNIX is 16:9, so which would be better for games? The 30 inch will not overclock at all, but I don't really care about that; I just want the best image quality and smoothness in games.
 
Solution

maxiim

Distinguished
Oct 28, 2009
There are more pixels to render per frame at 1600p and fewer at 1440p (quick math below). I would personally take the 27 inch 1440p, as it actually has the higher PPI (pixels per inch), which makes things look clearer and sharper. As you said, the 27 inch will also run at a higher refresh rate, which will make gaming look smoother with two 780 Tis.
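(For anyone curious, here's a back-of-the-envelope sketch in Python; my own illustration, using nothing but the two panels' native resolutions from this thread:)

qnix   = 2560 * 1440   # 27" 1440p: 3,686,400 pixels per frame
sparta = 2560 * 1600   # 30" 1600p: 4,096,000 pixels per frame
print(f"1600p is {sparta / qnix - 1:.1%} more pixels per frame")   # -> 11.1%

So the 30 inch asks the GPUs to push about 11% more pixels every frame, which is GPU headroom that could otherwise go toward higher frame rates on the 27 inch.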
 
I would go with the 1600p display.

It's debatable whether humans can see more than 30FPS, and nobody claims we can fully perceive 60FPS or more; we only pick up glimpses of those 60 frames. Anything past that shouldn't be noticeable to you, so I'd go with the basic idea that higher resolution = higher quality.

In truth, the sharpness of the image probably won't change much, because the screen is larger too, so the pixels per inch end up roughly the same; but at least it's a bigger display, which is always nice.
 
It's not debatable at all whether humans can see more than 30fps. People don't see in frames, and can easily distinguish differences in frame rates up to 100fps. If you want to get more detailed, there have been studies showing we can see a flash of light that lasts only 1/200 of a second, and since your brain is taking in a constant stream of visual information, it will perceive even higher frame rates without you consciously noticing.
 


Yes, it is debatable. That is why US TVs run at 24FPS while British and most European TVs run at 30FPS. Those weren't numbers they just picked at random; they attempted to determine the point at which it becomes unnoticeable and set that as the refresh target.

A test showing we can detect a flash of light at 1/200 of a second doesn't prove we can see 100FPS. Sure, we might be able to notice a change in light, but if you use a device to display 200 frames of a film in a single second, a viewer will not be able to tell you what was happening in the vast majority of them. Just because we can detect subtle changes does not prove we can register dynamic changes across a similar set of images at that rate.
 

maxiim

Distinguished
Oct 28, 2009


What's your point exactly? The 27 inch overclocked display will deliver smoother, more fluid gameplay with his GPUs, no doubt about it. Getting the bigger display with lower PPI just increases blockiness and the need for AA.

 


My point was in my original post. There is plenty of doubt as to whether the 27 inch monitor will look more fluid. If we aren't able to see more than 30 frames per second and fully register them, then there is a point past which, regardless of how high the refresh rate goes, we won't be able to notice any difference in appearance. That might not happen until 60, 96, or 120Hz or higher, but it's possible, and that raises doubt.

And you're talking about blockiness on a super high definition display? Unless he is playing Minecraft, he isn't going to see blockiness. Not to mention that going from 27 to 30 inches with only a minor increase in resolution, the pixels per inch are going to be fairly close.

If the increase in refresh rate and the slightly higher pixel density aren't noticeable, then the better display would be the one that lasts longer, with the higher overall resolution and the larger screen.
 

maxiim

Distinguished
Oct 28, 2009


Obviously you've never used a 24 inch 1080p display next to a 27 inch 1080p display, for example; if you had, you'd know what I'm talking about. I personally use a PB278Q, and I've had the opportunity to compare it next to a Dell U3014: pixel density makes a real, visible difference. As for PPI being "fairly close": my 24 inch 1080p secondary display sits at 91 PPI, whereas my 27 inch 1440p display sits at 108 PPI, and it's a major difference. The 30 inch would sit in the middle at about 100 PPI, and that gap would be noticeable. Even setting aside the refresh rate of each display, the 27 inch will still look sharper.
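(Those PPI figures check out. Here's a small Python sketch of the standard diagonal-based formula, using the nominal panel sizes; the one-point differences from the numbers above come down to rounding and the exact panel diagonal:)

import math

def ppi(w, h, diagonal_inches):
    # PPI = pixels along the diagonal divided by inches along the diagonal
    return math.hypot(w, h) / diagonal_inches

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')   # ~92
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')   # ~109
print(f'30" 1600p: {ppi(2560, 1600, 30):.0f} PPI')   # ~101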
 
Solution
You mixed up "when you can notice" with the "max" we can see. Yes, 30 fps has been standardized as the "minimum" before stuttering from too few fps becomes distracting. But this is the MINIMUM, NOT the MAXIMUM, and we can distinguish a 120fps video from a 60fps one. (Actually, 30 became the standard because the first games were mostly made in the US, and we use 30, not 24 fps. So much for determining the actual minimum scientifically.) I really suggest you see the difference yourself, especially with higher-Hz TVs being the norm these days: go to a store and look at a 120Hz display next to a 60Hz one. It isn't a perfect test because of motion interpolation, but the difference is easily noticeable nonetheless.
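(To put rough numbers on it, my own illustration, not from the thread: the time each frame stays on screen shrinks quickly as the refresh rate climbs:)

for hz in (30, 60, 96, 120):
    print(f"{hz:>3}Hz -> {1000 / hz:.1f} ms per frame")
# 30Hz -> 33.3 ms, 60Hz -> 16.7 ms, 96Hz -> 10.4 ms, 120Hz -> 8.3 ms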

You also have TV fps wrong. Those rates were originally tied to the power grid and CRT technology, and we just haven't had a reason to change the standards. US broadcast TV is NTSC at 30 fps (the power grid is 60Hz) and most of Europe is PAL at 25 fps (where the power is 50Hz). 24 fps was the original standard for film, you know, with actual film strips and projectors, and it was actually settled on because of sound quality. 24 is still used today for movies for cinematic effect.

A lot of earlier movies were around 16 fps, which sounds like the kind of number that would have been determined as the "minimum", but it was really 16⅔ fps, i.e. 1,000 frames per minute, and 1,000 was just a nice round number. This minimum fps for what we call the "illusion of motion" is what's actually debatable, because different people have different opinions on how much stutter is too much. How we perceive motion is also affected by how bright the room is, or even by how a person is feeling, so the same person can perceive things differently in different situations. But the main thing to consider is that we see a CONSTANT stream of visual info, not discrete frames.
 


Fair enough, I can see the mistakes in what I said. It was presented to me a few years ago that 30FPS was the upper limit of what we can see, and I never really read into it further, so I had cut some corners in my understanding of refresh rate and perceivable motion.
 

CatsCS

Commendable
Jul 13, 2016


I can tell the difference. I have a dual monitor setup; one is 144Hz and one is 60Hz. When I move my mouse on the 60Hz screen, it's noticeably more jumpy than on the 144Hz one. All it requires is a brain and eyes.
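(That jumpiness is easy to quantify; a tiny sketch, with the cursor speed as a made-up example value:)

speed = 2000   # hypothetical cursor speed in px/s, just for illustration
for hz in (60, 144):
    print(f"{hz}Hz: the cursor jumps ~{speed / hz:.0f} px between frames")
# 60Hz -> ~33 px jumps, 144Hz -> ~14 px jumps

At 60Hz the cursor lands in positions more than twice as far apart each refresh, which is exactly the "jumpy" feel described above.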