Video Card or Monitor problem??

DavidBoBavid
Mar 30, 2006
Hi,

I have a question regarding my video card or my monitor. I had a weird problem last night, and now I don't see anything on my screen when using my DVI cable. It's kind of a long post, but I'm a newb here so bear with me. :)

So first my machine:
P4 2.4 GHz
2 GB DDR RAM
Asus VT99999999 (6800GT)
Gateway FPD2185W (21" widescreen monitor)

So I've been playing Oblivion with the resolution set down to 1280x1024 to ease the stress on my computer/card, and I have my nvidia drivers set to centre the smaller resolution on the screen, so there are black bars on the sides rather than stretching the image to fill the full screen.

Last night, I went to play, and when I started the game and the initial options menu came up (the one with the play/load data file buttons), I heard the sound Windows makes when you connect a USB device. Seeing as I didn't connect anything, I just hit play and ignored it. When the game screen came up, with the Continue/Load menu, it was stretched to the full screen rather than being centred (I've been playing it centred since I got it last Tuesday). So I figured something was screwy with the nvidia driver, and maybe the scaling option got switched for some reason.

When I left the game, my desktop resolution was messed up, and I figured it was possibly due to the game not exiting properly or something like that. So I went into the desktop properties to change the resolution, but the option for my screen's native resolution (1680x1050) wasn't there; instead there were other weird ones like 1600x1200. Then I heard that USB sound again, and my screen turned itself off.

So I rebooted, thinking it was still a problem with the game or something. When the computer started up again, the screen came on for a moment, then turned itself off, like it would if the computer were still off.

So I freaked out and tried to connect the monitor to my laptop (PowerBook G4 12") using the DVI cable. The laptop detected the monitor correctly, but still nothing would show up.

Finally I tried connecting my monitor back to my PC using the VGA cable instead, and that worked. Hoping the problem was just the cable, I plugged my dad's monitor into his computer using the same DVI cable, and everything worked fine on his machine.

Since it was late, I didn't want to get into moving monitors and PCs around to test them against each other, but I'm concerned about what could be wrong. Could it be that the DVI port on my monitor broke? Or maybe that the DVI output on my video card is broken??

Any advice would be greatly appreciated.

David.
 

bourgeoisdude
Dec 15, 2005
In my experience, and based on what you've tried, it is likely the monitor and not the video card. If it is the video card, look for a video BIOS update (good luck finding one!), but be very careful to use only a BIOS designed for your specific video card.

Actually, try another DVI cable first. If that doesn't work, it's most likely your monitor. Also remember some monitors are quirky, so try turning the monitor off and then on again AFTER the PC is turned on, and see if a picture comes up. Also make sure the monitor is "looking" at the correct source, i.e., it likely has a button that manually switches between VGA and DVI as the input. Sometimes, for reasons unknown to me, digital flat panels get stuck thinking they're always on the VGA input.
 

DavidBoBavid
Mar 30, 2006
Hi bourgeoisdude,

I tried connecting a different monitor to my PC with the DVI cable, and that one worked, so that definitely rules out my video card. I don't have another DVI cable to test with, but since the other monitor worked, it seems the cable is alright as well.

I tried switching the video input on the monitor, and that didn't work either. To be honest, when I first got the DVI cable, I didn't even know I would have had to do that, since everything worked without requiring me to do anything. :oops:

I called Gateway, and they told me to unplug the monitor (the power, that is) and then try it again. I did, and it still didn't work, so they figured the monitor's DVI chip might be broken, and they're going to send me a replacement. :)

Anyway,
Thanks for the tips,
David
 

bourgeoisdude
Dec 15, 2005
To be honest, when I first got the DVI cable, I didn't even know I would have had to do that, since everything worked without requiring me to do anything.

Well, it is supposed to autodetect the connection. SUPPOSED to. Only on rare occasions, possibly like this one, does it not detect it correctly. So you're right, under normal working circumstances it SHOULDN'T require you to do anything :wink: