Okay guys, I have a really weird problem here.
I built my PC around 3 months ago; not a very good one, but it works.
I'm using a 6200 AGP, and my power supply used to be the PowerMan 350W that came with the case. I swapped it for a Modstream 450W to allow for future upgrades, but after that the DVI output stopped working at resolutions under 1024. 1024 and 1280 work fine. I can get 640 and 800 to work occasionally, but most of the time it's just a black screen with a "no signal" message.
Since my LCD has both DVI and VGA inputs, I hooked both cables from the graphics card to it. VGA works fine, but DVI still doesn't. There's also a slight lag and some dropped frames on the DVI connection.
Right now I have the video card running in "clone" mode with DVI as the priority, so whenever I play games the LCD switches over to VGA.
Has anyone had the same problem? Is it the cable, or does it have something to do with the power supply? The power is definitely cleaner now, but could too much amperage be the problem? Or is it just my cable or LCD?