Input not supported

Christian Sison · Aug 22, 2013
I just put a video card in my rig, a GT 220. I plugged it into our 32" monitor via HDMI and it worked fine. Then I tried dual output, using the HDMI input on the 32" monitor and the VGA output to our old 17" monitor (we bought it way back in 2007).

Here's what happened: in extended view, with both monitors working, I changed the resolution of the 32" monitor to 1366x768 and the 17" monitor to 1280x1024.

Then I rebooted using the 17" monitor as the only output device, and it said INPUT NOT SUPPORTED. Why did this happen? In extended view it was working fine. What can I do to fix it?
 
TVs and monitors do not work on the same frequency range.

If the screen displays INPUT NOT SUPPORTED just before Windows loads, it is because the output resolution is too high or the refresh rate is set wrong in the Windows video driver.

To recover, load Windows in Safe Mode and set the video settings back to the known maximum working resolution of the 17" monitor. After a restart and a boot into normal Windows mode it should be fine.
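If you'd rather script the fix than click through Safe Mode dialogs, here's a minimal sketch using the standard Win32 ChangeDisplaySettings call. The 1280x1024 @ 60 Hz mode is an assumption based on the 17" monitor above; substitute whatever mode your panel is known to accept. Build with MinGW, e.g. gcc fixmode.c -o fixmode -luser32.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Force a mode the old 17" panel is known to accept.
       1280x1024 @ 60 Hz is an assumption; use your monitor's real max mode. */
    dm.dmPelsWidth        = 1280;
    dm.dmPelsHeight       = 1024;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_UPDATEREGISTRY persists the change across reboots, which is
       what un-sticks the INPUT NOT SUPPORTED loop on the next boot. */
    LONG result = ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY);
    if (result == DISP_CHANGE_SUCCESSFUL)
        printf("Display mode changed.\n");
    else
        printf("Mode change failed (code %ld).\n", result);
    return 0;
}
```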

I think it depends on what the first monitor's resolution was set to; the card doesn't do a proper refresh when the display changes.
 

Christian Sison · Aug 22, 2013
This is weird, but I'm kind of answering myself. I hope this helps any passersby who are having the same problem.
I came up with a solution myself; countless hours of tweaking the system led to this.
If you're an Nvidia GPU user, do this:
1. Open the Nvidia Control Panel.
2. Go to "Adjust desktop size and position".
3. Change "Perform scaling on" from Display to GPU, and that made it work.
At least, that's the solution if it's saying INPUT NOT SUPPORTED and the source is the GPU; see the sketch below for a way to double-check what modes the driver actually reports.
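As a quick sanity check before changing anything, you can list every mode the driver exposes for the primary display. This sketch uses the standard Win32 EnumDisplaySettings call (nothing here is NVIDIA-specific); any resolution/refresh pair not in this list is a likely INPUT NOT SUPPORTED candidate on the monitor.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Walk the driver's mode list for the primary display (NULL device).
       EnumDisplaySettings returns FALSE once the index runs past the list. */
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); i++) {
        printf("%lux%lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
    }
    return 0;
}
```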