A very strange problem - monitor detects any resolution 1280 pixels wide as 1280x1024 and squashes the image with black bars!

griff1

Honorable
Mar 21, 2013
Hello,

I am encountering a very odd problem that driver updates and the like have not fixed.

When my monitor (Dell S2209W, native 1920x1080) receives an input 1280 pixels wide, it always detects it as 1280x1024 (according to the on-screen menu).
This means that if I want to downscale to, say, 720p (1280x720) for performance reasons, the image is squashed with black bars top and bottom!

I have never seen anything like this before and do not have the faintest idea what could be causing it.

Any ideas?

(All drivers are up to date, DVI connection)
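
To put rough numbers on the squash, here's my back-of-envelope model of what the scaler might be doing (pure guesswork on my part): if it assumes 1024 active lines but only 720 actually arrive, the picture should only cover 720/1024 of the panel's height.

[code]
# Back-of-envelope model (an assumption, not confirmed): the scaler stretches
# its assumed 1280x1024 frame across the panel, so the 720 real lines only
# cover 720/1024 of the 1080 panel rows.
panel_w, panel_h = 1920, 1080   # native panel resolution
assumed_lines = 1024            # what the monitor thinks it is receiving
actual_lines = 720              # what the GPU actually sends

image_h = panel_h * actual_lines / assumed_lines
print(image_h)                   # ~759 px tall -> vertically squashed image
print((panel_h - image_h) / 2)   # ~160 px of black bar, top and bottom
[/code]

That matches what I see on screen, for what it's worth.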
 

bv90andy

Distinguished
Apr 2, 2009
[strike]Hi, you can try CCC > My Digital Flat Panels > Properties > Enable GPU scaling so that the GPU does the scaling, and see how that works out.[/strike]

Oh, it's Nvidia... try enabling the same thing in the Nvidia Control Panel, but I don't know the exact steps for that.
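
For what it's worth, here is roughly what GPU scaling should buy you, as I understand it (an illustration only, not anything read out of a driver): the GPU resamples the frame to the panel's native mode before it goes down the cable, so the monitor always receives 1920x1080 and never has to guess the input.

[code]
# With GPU scaling, the card upscales the rendered frame to the panel's
# native mode itself, so the cable always carries 1920x1080.
render_w, render_h = 1280, 720    # what the game renders
native_w, native_h = 1920, 1080   # what the monitor receives
print(native_w / render_w, native_h / render_h)  # 1.5 1.5 -> uniform, no bars
[/code]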
 
Solution

griff1

Honorable
Mar 21, 2013


In the Nvidia Control Panel, if I select the 720p option - 1280x720 @ 59Hz - it displays correctly. If I then go down the list to the PC resolutions and select 1280x720 @ 60Hz, the monitor misdetects the input as 1280x1024 again.

However, if I create a custom resolution of 1280x720 @ 59Hz, it still misdetects it, which indicates the refresh rate is not the culprit.
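
My guess, and it is only a guess, is that the two entries use different mode timings rather than different refresh rates: the 720p option presumably sends the standard CEA/HDTV 720p signal, while the PC entry and my custom mode use computer-style (CVT) timings, and the monitor's scaler only recognises the former. The numbers below are the published standard timings, used purely as an illustration:

[code]
# Refresh rate = pixel clock / total pixels per frame (including blanking).
def refresh(clock_hz, h_total, v_total):
    return clock_hz / (h_total * v_total)

# Standard CEA-861 720p ("TV"-style) timing: 74.25 MHz, 1650 x 750 total.
print(refresh(74.25e6, 1650, 750))   # 60.0

# CVT-style PC timing for 1280x720 (as generated by `cvt 1280 720`):
# 74.50 MHz, 1664 x 748 total.
print(refresh(74.50e6, 1664, 748))   # ~59.86
[/code]

Nearly the same resolution and refresh, but the frames on the wire are shaped differently, which would explain why the monitor treats them as different inputs.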

edit: Manually forcing the scaling on the GPU to 'max screen fill' provides a workaround. The monitor still thinks it's getting a 1280x1024 input, but the 1280x720 output fills the whole screen... so, whatevs.

Cheers!