1440p or 4K

Guest

Guest
Hello Tomshardware,

I am getting ready to build my first gaming computer. I am choosing my parts and saw two monitors priced very close to each other. I found reviews of the 1440p model, but the 4K one is new. Both are IPS and 60Hz. The 1440p is the Dell U2713HM for $639.00; it has received many reviews, mostly very positive. The newer one is the Dell P2715Q for $700.00, which is 4K. Both of these monitors look very good and seem to satisfy my gaming needs, but which one is superior?
 
For sheer visual quality the 4K display is best, but it needs a high-end SLI/CrossFire setup to run games at a reasonable frame rate. The 1440p display will still have really good image quality and takes much less graphics power to play games at 60 FPS, which is what I personally demand for a game to feel smooth.
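
To put rough numbers on that, here is a quick back-of-the-envelope pixel count (illustrative arithmetic only, not a benchmark; actual frame rates depend entirely on the game and settings):

```python
# Rough pixel-count comparison between the two resolutions being discussed.
# A 4K frame contains 2.25x the pixels of a 1440p frame, which is roughly why
# it takes so much more GPU power to hold 60 FPS.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels per frame")

print(f"4K vs 1440p: {pixels['4K'] / pixels['1440p']:.2f}x the pixels per frame")
```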

So the GPU (or GPUs) in your gaming system will dictate which monitor is the better fit for your needs. That being said, what GPU are you using, and is it in SLI/CrossFire?
 
Guest

Guest


1 GTX 970
 

chenw

Honorable
Actually, since the prices are so close, I would buy the 4K, run it at 1440p until GPUs become powerful enough to handle 4K, and then crank it up.

It also removes the dependency on AMD coming up with their DSR equivalent and on nVidia continuing to support DSR.
 
The issue with doing that is that the picture will not be as clear as it would be on a native 1440p screen. This happens with any monitor run outside its recommended (native) resolution: running the recommended resolution gives the best clarity the monitor can possibly provide, and once the resolution is changed, either higher or lower, the visual quality drops and fuzziness appears.

OK, let me try to explain this a different way. A monitor designed to run at, let's use standard 1080p for this example, 1920 x 1080 is only capable of displaying 1920 physical pixels across by 1080 down. If the resolution is lowered, each image pixel gets spread over a group of physical pixels showing the same color to form a bigger block, and because that grouping rarely lines up perfectly, some surfaces can become slightly fuzzy, albeit nowhere near as bad as going over the native resolution. If you set the resolution higher than 1920 x 1080, the image contains more pixels than the screen can physically show, so pixels have to be dropped or blended together. That causes fuzziness on all surfaces and picture quality suffers tremendously.
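
Applying that same grouping idea to the 4K panel being discussed here, a quick sketch of the per-axis arithmetic (how sharp each case actually looks still depends on the monitor's scaler):

```python
# Per-axis scale factor when a panel displays a non-native resolution.
# An integer factor (1080p on a 4K panel = 2.0x) can map each image pixel onto
# a whole block of physical pixels; a non-integer factor (1440p on a 4K panel
# = 1.5x) forces the scaler to blend neighboring pixels, which is where much
# of the fuzziness comes from.
native_4k = (3840, 2160)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    sx = native_4k[0] / w
    sy = native_4k[1] / h
    clean = sx.is_integer() and sy.is_integer()
    print(f"{name} on a 4K panel: {sx:.2f}x per axis -> "
          f"{'whole pixel blocks' if clean else 'interpolated (some blur)'}")
```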

So if your machine cannot handle 4K because you are only running one GPU, and you drop the resolution, your picture quality will suffer. If you use any sort of DSR, the picture will not be as clear as it would be on a monitor whose native resolution matches.

If you are looking for a nice clear picture, go for the 1440p, or get a second GPU and go for the 4K.
 


This is exactly wrong; the mismatch of pixels will create a fat, blurry picture, as explained by our friend in the post above.
 

chenw

Honorable
OK, maybe I am mistaken on this, but when I use DSR to run 4K on my 1440p monitor, I did not need to use DSR smoothing at all, even though it is an uneven ratio of pixel sizes (unlike going from 1440p to 1080p, for example). I had assumed that the higher PPI was negating that problem, which is why I thought buying the 4K screen would be the better idea: if a 4K image works wonders on a 1440p screen like that, then surely a 4K screen with an even higher PPI would have at least a similar effect.

If that is not actually the case, I stand corrected, but... it would then seem that getting 4K quality out of a 1440p screen is pinned on the DSR feature (so either nVidia has to keep supporting it and/or AMD has to come up with an equivalent), whereas buying a 4K screen and running it at 1440p doesn't depend on that at all, hence why I thought it would be better to remove that kind of dependency.
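
For reference, the downsampling ratios being compared in this thread work out roughly like this (a quick sketch, assuming the DSR factor nVidia exposes is the ratio of total rendered pixels to native pixels):

```python
# DSR renders the game at a higher internal resolution and then downsamples
# the result to the panel's native resolution. The factor below is the ratio
# of total rendered pixels to native panel pixels.
def dsr_factor(render, native):
    rw, rh = render
    nw, nh = native
    return (rw * rh) / (nw * nh)

# 4K rendered onto a 1440p panel (what I described above) -> 2.25x
print(f"4K on 1440p:    {dsr_factor((3840, 2160), (2560, 1440)):.2f}x")
# 1440p rendered onto a 1080p panel -> ~1.78x
print(f"1440p on 1080p: {dsr_factor((2560, 1440), (1920, 1080)):.2f}x")
```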

Now, if it were 1440p 144Hz vs 4K 60Hz, that would be a different story, but it's outside the scope of this thread :p
 

wh3resmycar

Distinguished


I'm using a GTX 970 first-hand and DSR isn't doing what you're saying here; that probably happens when running custom resolutions on AMD hardware. The issue you're describing happened to me when I was using custom resolutions, but with DSR the blurry textures are close to nothing at all (smoothing slider at the default 33%).

On second thought, since the price difference is so small, like chenw said, I think I'd go with the 4K monitor too.
 

chenw

Honorable
The 33% default is a little too high to start out with; I used to use about 15% for 1440p DSR on a 1080p screen, and it was much better (I couldn't see any blurring).

However, with DSR'ed 4K on a 1440p monitor (the Swift), I didn't even need to use DSR smoothing (granted, I didn't take a very close look, but it was definitely nowhere near as blurry as 1440p on 1080p).
 

phatazzbp

Reputable
I had this same question a few weeks ago and opted for the 4K. I bought the Samsung UD590 28-inch 4K monitor that is on a Black Friday sale for $479 right now. I have an EVGA GTX 970 SC and so far it handles 4K very well. Games that aren't too graphically demanding, such as League of Legends, Sims 4, and Tales from the Borderlands, run at 4K just fine on Ultra settings. I tried Watch Dogs on Ultra with AA turned off and was still able to run it, just not at 60 FPS. I went down to 1440p and the picture looks just fine, not blurry at all. I plan on adding another 970 in SLI to fully utilize the 4K, but as of right now the 4K monitor is running beautifully on a single GPU.
 


This happens with a custom res on Nvidia GPUs without DSR handling the scaling, just to reiterate what you said: another process, if not more than one, needs to run to upscale or downscale your image, which causes more CPU usage. This can affect the performance of the PC in situations where the CPU is already heavily loaded, and that in turn can affect GPU performance, so there is always a trade-off. You just need to decide which way will work for you.