Why can I set a higher resolution than my monitor's native resolution?

ppdemo

Distinguished
May 15, 2013
Hello, like the title says, I have a monitor with a 1920x1080 native resolution and an R9 290X graphics card, and in the Windows display resolution settings I can go up to 3200x1800.
I don't understand how resolution works. Does it depend on the monitor or on the graphics card?
And what happens when I use the 3200x1800 resolution? Is it possible that my monitor works at this resolution? Because it only supports 1920x1080 natively, and I have never before seen Windows offer a higher resolution than the monitor supports. I hope for your answers. Regards
 
Solution
What the card is doing is drawing the frame out in its memory at the 3200x1800 resolution, and then down-sampling it to 1920x1080 to reduce the jaggies on the screen. It is the fastest way to do that. It just makes the image you see on the screen look smoother.

At least on my system, I noticed some slowdown from this, and set the resolution back to the normal resolution. But many people do like it.
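
If it helps to see the idea spelled out, here is a rough Python sketch of that down-sampling step. Pillow does the scaling here and the file name is just a placeholder; the driver obviously does the equivalent on the GPU, not with this code.

```python
from PIL import Image

NATIVE = (1920, 1080)   # what the monitor can physically show
RENDER = (3200, 1800)   # what the GPU draws internally

def downsample(frame: Image.Image) -> Image.Image:
    """Scale a high-resolution frame down to the native resolution.

    Blending several rendered pixels into each screen pixel is what
    smooths out the jagged edges (supersampling, in effect).
    """
    return frame.resize(NATIVE, resample=Image.LANCZOS)

if __name__ == "__main__":
    # "hi_res_frame.png" is just a placeholder name for a 3200x1800 screenshot.
    hi_res = Image.open("hi_res_frame.png")
    screen_frame = downsample(hi_res)
    print(screen_frame.size)  # (1920, 1080) -- the only size the monitor ever receives
```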

exroofer

Distinguished
The latest AMD driver enabled this feature across a broad range of cards, which MarkW described in action above. In a lot of my games, which don't strain my 290X a whole lot, setting the game resolution above 1080p (my monitor's native resolution) gives me a sharper image without actually having a higher-resolution monitor.

In other games that make the GPU work really hard, like, say, Star Citizen, doing this will kill the fps to below what I want.

It's a nice choice to have, and picking something like HD resolution to run on my 1080p monitor is a noticeable improvement without an excessive fps drop.

This is also called downsampling, I believe.
 

kyllien

Honorable
Jan 22, 2013
Native resolution is the maximum resolution you can set the monitor to. Overriding the native resolution of the display can damage it; that applies if you are using the Windows display properties to set the resolution. If you want to display a higher resolution, you would really need to buy a display that supports that higher resolution.

If AMD has a downsampling option, it would be controlled in the driver or the AMD application and would be labeled differently than the screen resolution.

On the Nvidia side it is DSR, or Dynamic Super Resolution. For me, I can set individual games to 3620x2036 DSR, which is then downsampled to 2560x1440. My screen resolution in Windows is 2560x1440; my screen resolution in the Nvidia driver can also be set to 3620x2036 DSR.
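
For what it's worth, here's the rough arithmetic behind those DSR numbers as I understand it (not an official Nvidia formula, and the real driver rounds some factors to slightly different standard values): a DSR factor multiplies the pixel count, so each dimension scales by the square root of the factor.

```python
import math

NATIVE = (2560, 1440)

def dsr_resolution(factor, native=NATIVE):
    """Render resolution for a given DSR factor (the factor scales the pixel count)."""
    scale = math.sqrt(factor)
    return round(native[0] * scale), round(native[1] * scale)

for factor in (1.20, 1.50, 2.00, 4.00):
    w, h = dsr_resolution(factor)
    print(f"{factor:.2f}x DSR on 2560x1440 -> {w}x{h}")

# 2.00x works out to roughly 3620x2036, which is where the number above comes from.
```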
 

exroofer

Distinguished
Using the graphics card software to do this will not damage your monitor, since the display is still actually running at its native resolution.
Forcing the monitor itself to a higher resolution is a different thing and should be avoided.

It is only in certain games that you will want to do this, so play around with it (with Fraps running for an fps readout if the game does not have a built-in fps counter) and see whether you like the result.
I would not enable it for your actual desktop, since there would be no point in doing so. It's more of an "I like how it makes game x, y, or z look" thing.
 
The only place the higher resolution exists is in your video card's memory. The frame is down-sampled to your monitor's resolution there as well, and then that is sent to the monitor. The monitor never knows this is happening; only your video card does.