TV (used as external monitor) displaying "Unsupported Mode" at Native Resolution and Refresh Rate via HDMI

llamanaro

Honorable
Feb 23, 2013
Hello, all! Firstly, my graphics card is a Radeon HD 6850, and my television is an Element Electronics FLX 1911B.

It is SUPPOSED to be a 1440x900 60Hz display; however, when I connect my computer to it via HDMI, I get either a black screen or a blue screen (NOT a BSOD) that says "Unsupported Mode."

The TV works fine as a second monitor at 1024x768, but I would like to use the television to its fullest potential. I have tried all of the modes under "List All Modes" in the display settings, but the native resolution/refresh rate (which is also the "Recommended" setting in Windows) gives either a black screen or "Unsupported Mode." I tested it with a VGA cable on a laptop, and the TV can display 1440x900 at 60Hz over VGA with no problems, but I would prefer to use HDMI, since digital is more precise than analog.

Note that SOMETIMES, instead of "Unsupported Mode," the TV will simply show "No Signal!" Usually that message appears on a BLUE screen; the only time the screen is black is a quick "flash" when it detects a source being plugged in. This leads me to suspect that the TV is receiving a signal but can't make sense of it.

Please help. Thank you. I'm A+ certified, but haven't been able to figure this out.

SPECS: I am running Windows 8 on a custom-built computer. The graphics card is a Radeon HD 6850, and it has four outputs: both variations of DVI, one HDMI (which is what I'm using to connect the second monitor), and a DisplayPort connection. I have an active DisplayPort-to-DVI adapter coming in the mail so I can add a third monitor in the future, and a DVI-to-VGA cable in case I have to resort to VGA.
 
The only thing I can think of is that the overscan/underscan setting is messing with the TV. Over/underscan is not something that happens with VGA or even DVI, but it tends to be used on HDMI, since some TVs overscan the image and this setting lets you correct for that.

The strange thing is that over/underscan should not actually alter the resolution; it just scales the image so you can either remove the black bars around it or shrink the image so it all fits on screen.

1440 x 900 (which is in fact a 16:10 resolution, something I have NEVER seen on a TV; they all tend to be 16:9) is a very strange resolution for a TV, to be honest. I wonder if it actually wants 720p (1366 x 768 for most TVs) or 1080p (1920 x 1080; a long shot, and the scaling would make it look BAD) so it can scale with its built-in scaling engine.
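If you want to check the math, just reduce each resolution by its greatest common divisor. A quick Python sketch, nothing specific to this TV, purely arithmetic:

from math import gcd

# Reduce some common resolutions to their simplest aspect ratios.
for w, h in [(1440, 900), (1280, 720), (1366, 768), (1920, 1080)]:
    d = gcd(w, h)
    print(f"{w}x{h} -> {w // d}:{h // d} (~{w / h:.3f}:1)")

# 1440x900  -> 8:5     (i.e. 16:10, ratio 1.600)
# 1280x720  -> 16:9    (ratio 1.778)
# 1366x768  -> 683:384 (only approximately 16:9, ratio ~1.779)
# 1920x1080 -> 16:9    (ratio 1.778)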

As long as you have your main screen on, you can play with resolution settings and always recover :)
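If you want to see exactly which modes the driver is reporting for that HDMI output (the same list Windows shows under "List All Modes"), something like this rough Python sketch should dump them. It assumes the pywin32 package is installed; treat it as a starting point rather than a polished tool:

import win32api  # pywin32

# Walk the display outputs the adapter exposes and print every mode the
# driver reports for each one -- the same data as "List All Modes".
i = 0
while True:
    try:
        dev = win32api.EnumDisplayDevices(None, i)
    except win32api.error:
        break                      # no more outputs
    if dev is None:
        break
    i += 1
    print(f"{dev.DeviceName}: {dev.DeviceString}")
    mode = 0
    while True:
        try:
            dm = win32api.EnumDisplaySettings(dev.DeviceName, mode)
        except win32api.error:
            break                  # no more modes on this output
        if dm is None:
            break
        print(f"  {dm.PelsWidth}x{dm.PelsHeight} @ {dm.DisplayFrequency} Hz, {dm.BitsPerPel}-bit")
        mode += 1

If 1440x900 @ 60 Hz never shows up for the HDMI output, the driver itself is not offering it, which would point at whatever the TV is reporting back to the card.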

Worst case, on a 19-inch screen, I do not think VGA will actually look that bad at all. It just will not be able to play Blu-ray (no HDCP over analog) :(
 

llamanaro

Honorable
Feb 23, 2013
Somehow I accidentally edited my previous post instead of adding a new post, so I'll just copy it here and delete the previous post:


It does work, but with borders. Before, and even after, being rescaled, the display looks horrible. I don't know how to describe it, but a REALLY loose analogy is to imagine that everything had been painted with oil paint. Still, the TV does see 1280x720. You said that 1440x900 is 16:10, and I did the math and it definitely is, but upon checking the product page (I just googled the model number and looked at the TigerDirect link), the TV is listed as 16:9. I wonder if this means that it will only show 16:9, even though it is physically 16:10?
 
When the TV was made, they may have had 1440 x 900 panels available cheaply and chose to go that route.

Chances are the HDMI input is simply not designed to run that resolution, just the standard HDTV ones. While the scaling may look bad on a computer desktop, it may not look too bad on video, like what a Blu-ray player or set-top box would put out.
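One way to confirm that guess is to look at the EDID block the TV hands to the card over HDMI; Windows caches it in the registry under the standard monitor enumeration key. A rough Python sketch (standard winreg and EDID byte layout, but again a starting point, not a polished tool) that prints the preferred/native timing each cached monitor advertises:

import winreg

# Windows caches each monitor's EDID under the DISPLAY enumeration key.
# The first 18-byte detailed timing descriptor (bytes 54-71) is the
# monitor's preferred/native mode.
ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def preferred_mode(edid):
    if len(edid) < 72:
        return None
    d = edid[54:72]                  # first detailed timing descriptor
    if d[0] == 0 and d[1] == 0:      # pixel clock of 0 = not a timing block
        return None
    h = d[2] | ((d[4] & 0xF0) << 4)  # horizontal active pixels
    v = d[5] | ((d[7] & 0xF0) << 4)  # vertical active lines
    return h, v

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for i in range(winreg.QueryInfoKey(display)[0]):
        model = winreg.EnumKey(display, i)
        with winreg.OpenKey(display, model) as model_key:
            for j in range(winreg.QueryInfoKey(model_key)[0]):
                instance = winreg.EnumKey(model_key, j)
                try:
                    params = winreg.OpenKey(
                        model_key, instance + r"\Device Parameters")
                    edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue         # instance without a cached EDID
                mode = preferred_mode(edid)
                if mode:
                    print(f"{model} ({instance}): native {mode[0]}x{mode[1]}")

If the EDID that shows up for the HDMI connection only advertises an HDTV mode as its preferred timing, the TV really is telling the card not to send 1440x900 on that input.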

Either way, it looks like your best option is VGA for that screen.
 
I do not think VGA looks that bad.

On the screen I am using to type this (a SyncMaster 245T), VGA looks just slightly less sharp compared to DVI. For everything else (games and images) it would be hard to tell the difference. This is with the stock VGA cable and a DVI-to-VGA adapter on the video card.