Radeon HD 4870 + Dual Monitor problem

SparkS666

Mar 18, 2009
Hi, I'm having a huge problem. I have this card and two TV/monitors, connected with two DVI>HDMI cables. After installing the ATI drivers and setting the resolution to the TVs' maximum (1680x1050), one of them goes blank; lowering the resolution to 1024x768 brings the image back. I've switched one DVI output to an analog connection, and in that setup both displays work (one over HDMI, the other over the analog cable). I don't know how to solve this and have tried a lot of tutorials. Can anyone help? Thanks.
 

apedeaux

Mar 20, 2009
This may be completely off-base, but try unplugging the power cord from both your monitors for 30 seconds or so. I use dual 22'' Chimei monitors, and just recently one of them stopped getting a signal... it would display the memory check and POST info, then go blank (no signal) when the Windows loading screen would come up...

My monitor settings (I mean the hardware settings, the little menu buttons on the side of the monitor) have a place where you can set the input signal to analog or digital. Apparently my monitor had switched the setting from digital to analog, so nothing would show up through the digital (DVI) connection. Plugging my monitor's VGA port into the GPU (with a DVI-VGA adapter) allowed me to get the monitor to come up with an analog signal, so I could go to the menu and change the setting back to Digital. Unfortunately, the setting would not get saved or something, because if I tried to reboot and plug back into the DVI port, the monitor would still be set to analog...

Turns out the only way to get it to flip itself back to digital was to unplug the monitor completely for 30 seconds or so. Plugged it back in through the DVI, and it worked like a charm. You don't even have to get it to come up with the analog signal first and try to change the setting; just unplug the sucker, plug it back in, and boot it up.

You may have a completely different problem, but give it a shot!
 

wh3resmycar

Can you clear this up: do your TV/monitors both support 1680x1050? Are they identical, or is one a TV and the other a monitor?

Because if the other one doesn't support that resolution (maybe its maximum is lower), that would explain why one of your panels drops back to 1024x768 (probably its native maximum).

What you can do first is make sure the second display is set up as an extension, not a mirror (CCC >> Displays Manager >> Display Properties).
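
If you want to double-check outside of CCC, a rough Win32 sketch like the one below lists the display devices and their state flags; in extended mode both outputs should show up as attached to the desktop, while in clone (or single-display) mode usually only one does. This is just an illustration, compile it with any C compiler that ships the Windows headers:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICEA dd;
    DWORD i = 0;

    dd.cb = sizeof(dd);
    while (EnumDisplayDevicesA(NULL, i, &dd, 0)) {
        printf("%lu: %s (%s)\n", (unsigned long)i, dd.DeviceName, dd.DeviceString);
        if (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
            printf("    part of the desktop (primary or extended)\n");
        if (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
            printf("    primary display\n");
        if (dd.StateFlags & DISPLAY_DEVICE_MIRRORING_DRIVER)
            printf("    pseudo/mirroring driver, not a real output\n");
        ++i;
        dd.cb = sizeof(dd);
    }
    return 0;
}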
 

SparkS666

Mar 18, 2009
Sorry, I didn't write that very clearly.

I have two 22" TVs (both identical), with a native resolution of 1680x1050. CCC detects one of them as DTV (DVI) and the other as DTV (HDMI). The DTV (DVI) one works perfectly at 1680x1050, but the DTV (HDMI) one only offers HDMI resolutions like 1920x1080 (full HD) and other strange modes; when I force 1680x1050, the screen goes blank. I forgot to say that CCC reports the TVs' "maximum reported resolution" as 1920x1080, which is wrong. If I swap the DTV (HDMI) cable for a DVI>VGA cable, 1680x1050 works perfectly. One other thing I noticed: any resolution I set on the DTV (HDMI) connection looks blurry, very ugly. I don't know why it loses so much quality; it's like watching an old movie.
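
For what it's worth, here is a minimal Win32 sketch that dumps every mode the driver actually exposes for a given display device (\\.\DISPLAY2 is only an example name, it may differ on other setups). If 1680x1050 never appears in the list for the HDMI-connected TV, the driver really is building its mode list from that wrong 1920x1080 "maximum reported resolution":

#include <windows.h>
#include <stdio.h>

int main(void)
{
    const char *device = "\\\\.\\DISPLAY2";  /* example name for the second TV */
    DEVMODEA dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* iModeNum = 0, 1, 2, ... walks the driver's mode table */
    while (EnumDisplaySettingsA(device, i, &dm)) {
        printf("%lux%lu @ %lu Hz, %lu bpp\n",
               (unsigned long)dm.dmPelsWidth, (unsigned long)dm.dmPelsHeight,
               (unsigned long)dm.dmDisplayFrequency, (unsigned long)dm.dmBitsPerPel);
        ++i;
    }
    return 0;
}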

Update:
I've now figured out how to make both DVI>HDMI cables work properly on my TVs. First I boot into Windows, which detects one DVI and one HDMI connection. Then, on the HDMI TV, I switch its input source to VGA (so it loses the signal) and force CCC to change that second TV's resolution to 1680x1050. After that, CCC relabels the second display as DTV (DVI), just like the first one. Then I switch the TV's input back to HDMI and the picture comes up perfectly, just like on the first screen. I have no idea why this works, but it took me four days to figure it out.

Is there another way to force the DTV (HDMI) connection to behave as DTV (DVI)? I'd rather not have to do this manually every time I boot the system.
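
In the meantime I'm thinking of trying a small program in the Startup folder that forces the mode with ChangeDisplaySettingsEx, something like the sketch below (\\.\DISPLAY2 is a placeholder for whatever device name the second TV actually gets, and the driver may still refuse the mode over the HDMI path if it isn't in its list):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm;
    LONG r;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth = 1680;                       /* desired width  */
    dm.dmPelsHeight = 1050;                      /* desired height */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT;  /* only touch the resolution */

    /* "\\.\DISPLAY2" is an example device name for the second TV */
    r = ChangeDisplaySettingsExA("\\\\.\\DISPLAY2", &dm, NULL,
                                 CDS_UPDATEREGISTRY, NULL);
    if (r == DISP_CHANGE_SUCCESSFUL)
        printf("Second display switched to 1680x1050.\n");
    else
        printf("Driver refused the mode (code %ld).\n", (long)r);
    return 0;
}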


Thanks for the help.
 
