How differently does 1600x900 resolution look on native FHD, QHD, or 4K screens?

organsofcorti

Sep 30, 2015
Background: I'm buying a 2-in-1 work laptop (13-14"), and I have to run the primary program at a resolution lower than 1080p. But almost all laptops that meet the minimum RAM/processor specs are at least 1080p - the ideal one is 1800p.

I have to set the laptop to the lower resolution because the "increase font sizes" option in Windows settings doesn't carry over into the remote desktop application. Only the set resolution affects the program's text size, and at 1080p on a 13-14" screen the text is uncomfortably small.

Question: I don't completely understand how running a higher-res screen (like 4K) at a lower resolution (like 1080p) works. To begin with, it will look less clear than a native 1080p screen, right?

So if I need to run a program at approx 1600x900, what is the difference in how it will appear on screens with different native resolutions?

Will 900p look better on a QHD display than on an FHD one (given that both are set in Windows to 1600x900, and all other factors being equal)? Or will they both give the same degree of blurriness?
 
Solution
You cannot do this on the desktop or in applications. NVIDIA and AMD each offer their own "technology" for this, if you will: NVIDIA has DSR (Dynamic Super Resolution) and AMD has VSR (Virtual Super Resolution). Both work best in games; desktop use with them is extremely uncomfortable.

Quick summary of what they do: they increase the resolution at which your GPU renders, so you can select a higher resolution than your native resolution in games. The GPU renders the frame at that higher resolution and then downscales it to match the native resolution. This does introduce a slight blur if you really pay attention, but it most definitely makes games look worlds better. The downside is that the resolution you select is still just as demanding as if you were playing on a monitor with that higher native resolution and picking that resolution in-game, without the use of DSR or VSR.
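For anyone curious what "render high, then downscale" looks like in practice, here is a minimal sketch of the idea in Python (plain 2x2 box averaging, not NVIDIA's or AMD's actual filter): the GPU produces a frame at twice the native width and height, and each 2x2 block of rendered pixels is averaged into one output pixel.

# Minimal sketch of the supersample-then-downscale idea behind DSR/VSR:
# plain 2x2 box averaging, not the actual vendor filter.
def downscale_2x(rendered, width, height):
    """rendered: row-major list of grayscale values at 2*width x 2*height."""
    out = []
    for y in range(height):
        for x in range(width):
            # Average the 2x2 block of supersampled pixels into one pixel.
            total = 0
            for dy in (0, 1):
                for dx in (0, 1):
                    total += rendered[(2 * y + dy) * (2 * width) + (2 * x + dx)]
            out.append(total / 4)
    return out

# Example: a 4x2 "render" collapsed into a 2x1 native frame.
frame = [10, 20, 30, 40,
         50, 60, 70, 80]
print(downscale_2x(frame, 2, 1))   # -> [35.0, 55.0]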

Nothing will look better than matching the native resolution pixel for pixel, because 1) you can't digitally add physical pixels, and 2) if you try to get around it with a different technique, things on your desktop (basically everything) will appear blurry and be hard to read, even if you enable scaling in Windows.
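To see where the blur in point 2 comes from, here is a rough one-dimensional sketch (hypothetical values and simple linear interpolation, not the scaler any real GPU or panel uses): when a lower-resolution row is stretched across more physical columns, most physical pixels land between two source pixels and receive a blended value, which softens sharp edges.

# Rough 1-D illustration of why stretching a lower resolution across more
# physical pixels blurs it: simple linear interpolation, hypothetical values.
def stretch_row(src, out_width):
    src_width = len(src)
    out = []
    for x_out in range(out_width):
        # Where this physical pixel falls in source coordinates.
        pos = x_out * (src_width - 1) / (out_width - 1)
        left = int(pos)
        frac = pos - left
        right = min(left + 1, src_width - 1)
        # Blend the two nearest source pixels.
        out.append(src[left] * (1 - frac) + src[right] * frac)
    return out

# A sharp black/white edge in the source row...
row = [0.0] * 4 + [1.0] * 4            # tiny stand-in for a 1600-pixel row
print(stretch_row(row, 10))            # ...picks up in-between grey values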

There is no way around this on the computer side. However, some TVs (usually higher-end ones) have extremely good scalers and handle the upscaling in a way a PC user could only dream of. 900p will look better on an FHD panel than on a QHD one, because 1600x900 is a lot closer to FHD than it is to QHD, but neither will beat 900p on a native 900p screen. Losing 1:1 pixel mapping results in blur - the big downside of LCD displays, again - unless you have an expensive TV with a good scaler in it.
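As a rough illustration of the 1:1-mapping point (plain arithmetic, no vendor API involved), this is how many physical pixels each 1600x900 logical pixel has to cover on the three panels discussed; none of the ratios is a whole number, so every one of them has to interpolate.

# How many physical pixels each 1600x900 logical pixel must cover per panel.
panels = {
    "FHD (1920x1080)": (1920, 1080),
    "QHD (2560x1440)": (2560, 1440),
    "4K  (3840x2160)": (3840, 2160),
}
logical_w, logical_h = 1600, 900

for name, (w, h) in panels.items():
    print(f"{name}: {w / logical_w:.2f}x wide, {h / logical_h:.2f}x tall")
# FHD: 1.20x, QHD: 1.60x, 4K: 2.40x -- all fractional, so no 1:1 mapping.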

All the best!
 