Input signal out of range.

Mr. Koaliti
Getting this issue and not sure how to fix it. It's a fairly new monitor that my grandfather gave to me, and it works great other than this one issue. I have Windows 10 64-bit with an R9 280X. I updated my drivers and that did not work. For some reason the issue stops when I mirror the display instead of extending it. The monitor says the recommended input is 1400x900 @ 60Hz, but changing to that resolution or lower does not work. Can someone help me fix this issue?
 

Mr. Koaliti


The monitor is VGA; however, I am using a DVI-I adapter to connect it. Also, can you tell me how to disable VSR? I have never used it before.
 

bloc97

It's in CCC (Catalyst Control Center). If you have never used VSR, then it must be GPU scaling: all post-15.4 drivers automatically enable GPU scaling, since VSR needs it to work. In CCC, enable advanced settings and go to the display settings; one of the options there contains GPU scaling.
 
Solution

Mr. Koaliti

Honorable
Nov 13, 2013
108
0
10,690


Thank you so much, it worked.
 

bloc97
No problem! Glad to help!

But just to clarify for anyone else who stumbles upon this thread: what GPU scaling does is upscale any non-native resolution to the monitor's native resolution, then send that through the video cable.
This results in a sharper, cleaner image than what would display if the monitor itself upscaled the image.

_______________________________________________________________

1024x768 signal on a 1920x1080 screen:

Without GPU scaling:
Native 1024x768 -> GPU output -> 1024x768 4:3 signal -> VGA/DVI/HDMI cable -> monitor IC circuit -> stretched into a 1920x1080 16:9 picture -> sent to the screen [Quick, simple processing by the monitor's IC circuit results in a stretched, blurry image.]

With GPU scaling:
Native 1024x768 -> GPU output -> GPU transforms it into a 1920x1080 16:9 signal, keeping the original information and adding black borders to preserve the aspect ratio -> VGA/DVI/HDMI cable -> monitor IC circuit -> sent to the screen [Good-quality postprocessing by the GPU results in a correct, sharp image on the screen; see the sketch below.]
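
To make the border math concrete, here is a minimal Python sketch of the aspect-ratio-preserving fit described above (illustrative logic only, not AMD's actual implementation):

# A minimal sketch (assumed logic, not AMD's actual code) of the
# aspect-ratio-preserving fit that GPU scaling performs: scale the source
# to fit inside the target resolution, then center it with black borders.

def gpu_scale(src_w, src_h, dst_w, dst_h):
    """Return (scaled_w, scaled_h, border_x, border_y) for a letterboxed fit."""
    # The largest factor that fits the source in the target without distortion.
    scale = min(dst_w / src_w, dst_h / src_h)
    scaled_w = round(src_w * scale)
    scaled_h = round(src_h * scale)
    # Black borders, split evenly on each axis, fill the leftover space.
    border_x = (dst_w - scaled_w) // 2
    border_y = (dst_h - scaled_h) // 2
    return scaled_w, scaled_h, border_x, border_y

# The 4:3 example above: a 1024x768 signal on a 1920x1080 screen.
print(gpu_scale(1024, 768, 1920, 1080))  # (1440, 1080, 240, 0)

The 4:3 image fills all 1080 rows and gets a 240-pixel black bar on each side, which is exactly the "black borders added to preserve the aspect ratio" step.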

_______________________________________________________________
BUT if the monitor is old and does not tell the GPU what resolutions it can accept (this information normally comes from the monitor's EDID), the GPU automatically assumes 1080p, as it is the most common resolution. Since the monitor cannot display 1080p, it warns "Input signal out of range".

The same thing will happen if you use an active HDMI/DVI/VGA signal converter of any kind, since many converters do not pass the monitor's EDID through to the GPU!
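
To illustrate that failure mode (the fallback value and function here are assumptions for the example, not a real driver API):

# Illustrative sketch of the failure mode described above; the fallback
# mode and the function name are assumptions, not a real driver API.

def pick_output_mode(edid_preferred_mode):
    FALLBACK_MODE = (1920, 1080)  # assumed default when nothing is reported
    if edid_preferred_mode is not None:
        # The monitor reported its capabilities, so the GPU can match them.
        return edid_preferred_mode
    # No EDID: the GPU guesses, and may exceed what the panel can display.
    return FALLBACK_MODE

print(pick_output_mode((1440, 900)))  # (1440, 900) -- EDID present, mode matches
print(pick_output_mode(None))         # (1920, 1080) -- no EDID: "out of range" risk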


The advantage of GPU scaling is that the GPU can modify the resolution at will. For example, a 1080p monitor can "display" a virtual 4K desktop if the GPU downscales the 4K output to 1080p. The same goes for resolutions of a different aspect ratio, e.g. 4:3, 3:2, or 16:10: the GPU has complete freedom to manipulate resolutions while maintaining sharpness (a quick example follows below).
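
The same fit math covers downscaling too; a quick standalone sketch (again illustrative, not VSR's actual code):

# Standalone sketch: the same fit logic applied to downscaling (roughly
# what VSR does) and to non-16:9 aspect ratios. Illustrative only.

def fit(src_w, src_h, dst_w, dst_h):
    s = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * s), round(src_h * s)

# A virtual 4K desktop downscaled to a native 1080p panel: an exact 2:1 fit.
print(fit(3840, 2160, 1920, 1080))  # (1920, 1080)
# A 16:10 desktop on a 16:9 panel keeps its shape, with thin side borders.
print(fit(1920, 1200, 1920, 1080))  # (1728, 1080)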

TL;DR
GPU scaling outputs a constant resolution, no matter which resolution you choose: the GPU upscales or downscales the image and sends the result to the monitor. If the monitor does not tell the GPU the maximum resolution it can accept, the GPU may guess wrong and output a resolution that is too high.