Hello, all! Firstly, my graphics card is a Radeon HD 6850, and my television is an Element Electronics FLX 1911B.
It is SUPPOSED to be a 1440x900 60Hz display, but when I connect my computer to it via HDMI, I get either a black screen or a blue screen (NOT a BSOD) that says "Unsupported Mode."
The TV works fine as a second monitor at 1024x768, but I would like to use it to its fullest potential. I have tried every entry under "List All Modes" in the display settings, but the native resolution/refresh rate (which is also the "Recommended" setting in Windows) gives either a black screen or "Unsupported Mode." I tested the TV with a VGA cable on a laptop, and it displays 1440x900 60Hz over VGA with no problems, but I would prefer HDMI, since a digital signal is cleaner than an analog one.
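In case it helps anyone dig into the timings: if the TV's HDMI input only accepts modes whose pixel clock it recognizes, a custom-resolution tool would need the mode's total timing numbers, not just 1440x900. A minimal sketch of the arithmetic, assuming approximate CVT reduced-blanking figures (these are my assumptions, not values read from the TV's EDID):

```python
# Rough pixel-clock estimate for 1440x900 @ 60 Hz.
# The blanking values below are approximate CVT reduced-blanking
# figures (assumed, not taken from this TV's EDID).
H_ACTIVE, V_ACTIVE, REFRESH = 1440, 900, 60

h_blank = 160   # CVT-RB uses a fixed 160-pixel horizontal blank
v_blank = 26    # approximate vertical blanking lines

h_total = H_ACTIVE + h_blank
v_total = V_ACTIVE + v_blank

pixel_clock_mhz = h_total * v_total * REFRESH / 1e6
print(f"{h_total}x{v_total} total -> ~{pixel_clock_mhz:.2f} MHz pixel clock")
```

If the clock the card actually sends differs much from what the TV expects, that alone can trigger an "Unsupported Mode" message even though the resolution matches.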
Note that SOMETIMES, instead of "Unsupported Mode," the TV simply says "No Signal!" on a black screen. Usually it shows "No Signal!" on a BLUE screen; the only time the screen is black is a quick "flash" when it detects a source being plugged in. This leads me to suspect that the TV is receiving a signal but can't interpret it.
Please help. Thank you. I'm A+ certified, but haven't been able to figure this out.
SPECS: I am running Windows 8 on a custom-built computer. The graphics card is a Radeon HD 6850 with four outputs: both variations of DVI, one HDMI (which I'm using to connect the TV as the second monitor), and a DisplayPort connection. I have an active DisplayPort-to-DVI adapter coming in the mail to add a third monitor in the future, and a DVI-to-VGA cable to fall back on VGA if needed.