Does Downsampling have the same performance impact as getting a new monitor?

skychaser30

As the title says, I've been using my 768p monitor since 2010 (it still works like a charm). I've just been downsampling from 1440p to get better visuals and to keep my CPU from bottlenecking my GTX 1070.

Would it have the same performance impact, or would it be better to buy a new 1440p monitor?
 
It would be better to get a new monitor.

Currently, with your downsampling you lose about 2/3 of the resolution on the way to the screen, with the bonus of what I imagine is very little aliasing (rough pixel math below).

With a 1070 you should be able to run the full 1440p resolution and turn anti-aliasing on with a new 1440p monitor.

Assuming you have a decent CPU with at least 8GB of RAM.
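
For perspective, here's a quick back-of-the-envelope pixel count (plain Python, nothing assumed beyond the two resolutions) showing how much of a 1440p render gets thrown away on a 768p panel, and how much heavier 1440p is than native 768p for the GPU:

```python
# Rough pixel math for 1366x768 vs 2560x1440.
native = 1366 * 768    # typical "768p" panel
target = 2560 * 1440   # 1440p render (DSR) or a new 1440p monitor

print(f"768p pixels:   {native:,}")                                # 1,049,088
print(f"1440p pixels:  {target:,}")                                # 3,686,400
print(f"Discarded when downsampling: {1 - native / target:.0%}")   # ~72%
print(f"GPU pixel load vs native 768p: {target / native:.1f}x")    # ~3.5x
```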
 

skychaser30

Here's the rest of my specs:

i5-4690K @ 4.5GHz
16GB 1600MHz DDR3
Gigabyte Z87X-OC
CM B500 VER2
500GB HDD
320GB HDD

 
I would imagine seeing a 1440p monitor in action for the first time would be like the first time someone saw Fantasia, but less scary.

I was trying to think of a good analogy, but watching 2560x1440 downscaled to 1366x768 is hard to beat lol.

I would actually recommend one of those 2560x1440 G-Sync monitors, like an ASUS ROG Swift PG278Q.
 

skychaser30

Yup. I really love my current monitor, and didn't know that the lower the resolution, the higher the CPU usage. I just wondered.

Any cheap ones you can recommend? I'm on a tight budget.

 
At lower resolutions the GPU is overpowered and the bottleneck for frame rates is the CPU.

At higher resolutions the bottleneck shifts to the graphics card.

The CPU is still working its butt off at higher resolutions, but it isn't what is holding you back, assuming it isn't an Atom or a dual core lol.
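
A toy frame-time model makes that shift concrete (a rough sketch with made-up numbers, not a benchmark): CPU time per frame is roughly independent of resolution, GPU time scales with pixel count, and whichever takes longer sets the frame rate.

```python
# Toy bottleneck model: the frame rate is set by whichever of CPU or GPU
# takes longer per frame. The millisecond costs are made-up illustrative numbers.
CPU_MS = 7.0             # per-frame CPU cost, roughly independent of resolution
GPU_MS_AT_768P = 4.0     # per-frame GPU cost when rendering 1366x768

def fps(width, height):
    pixels = width * height
    gpu_ms = GPU_MS_AT_768P * pixels / (1366 * 768)  # GPU cost scales with pixel count
    return 1000.0 / max(CPU_MS, gpu_ms)

print(f"768p:  {fps(1366, 768):.0f} fps (CPU-bound)")
print(f"1440p: {fps(2560, 1440):.0f} fps (GPU-bound)")
```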
 
Why yes, downsampling using nVidia's Dynamic Super Resolution (DSR) should put essentially the same load on your GPU as rendering at that resolution natively, so you can use it to decide whether games would be playable on a new monitor. As has been said, the 1070 is powerful enough that you shouldn't notice much of a performance difference. The problem with actually using DSR is that the antialiasing isn't very good: sadly it's just ordered-grid supersampling (OGSS), which averages the samples within each pixel. That blurs things slightly, but the real problem is that nearly vertical or horizontal lines are poorly antialiased, because the averaging can only produce one middle shade along the edge.

So how do you make it look much better and put that otherwise-wasted GPU headroom to work at low resolution? Rotated-grid supersampling (RGSS) handles those edges far better and is the way the old 3Dfx cards used to do it; unfortunately it's pretty difficult to get on nVidia hardware, which seems to prefer multisampling everything (MSAA) and applying supersampling only to transparencies. You used to be able to use a third-party hack to enable nVidia's Sparse Grid Supersampling (SGSSAA), which looks much better than DSR and has less performance impact than full SSAA. I notice there's also a hack to enable supersampling for VR applications, but I'm not sure which kind that gives you.
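
To make the ordered-grid vs rotated-grid difference concrete, here's a toy sketch (my own illustration in Python; the sample offsets are generic textbook patterns, not any driver's actual filter) of per-pixel coverage along a nearly horizontal edge. The ordered grid's four samples sit at only two distinct heights, so coverage along the edge can only be 0, 0.5, or 1, which is the single middle shade mentioned above; the rotated grid's four distinct heights add the 0.25 and 0.75 steps that smooth the staircase:

```python
# Toy 4x supersampling comparison along a nearly horizontal edge.
# Sample offsets within each pixel are illustrative, not NVIDIA's actual pattern.
OGSS = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]          # 2x2 ordered grid
RGSS = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]  # rotated "4-rook"

def coverage(pattern, px, slope=0.02, offset=0.2):
    """Fraction of the pixel's samples that fall under the edge y = offset + slope*x."""
    return sum(sy < offset + slope * (px + sx) for sx, sy in pattern) / len(pattern)

for px in range(0, 40, 4):  # walk along one row of pixels crossing the edge
    print(f"pixel {px:2d}:  OGSS {coverage(OGSS, px):.2f}   RGSS {coverage(RGSS, px):.2f}")
```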

So crank up those settings until you see 100% load on the GPU, before you decide to get a new monitor.
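
If you'd rather read that load off directly than eyeball an overlay, one quick way (assuming the driver's nvidia-smi tool is installed and on your PATH) is to poll GPU utilization while the game runs:

```python
# Sample GPU utilization once per second for half a minute using nvidia-smi,
# which ships with the NVIDIA driver.
import subprocess
import time

for _ in range(30):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    print(f"GPU load: {out.stdout.strip()}%")
    time.sleep(1)
```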
 
Solution