DVI to VGA error (monitor problem related to EDID)

seriouslydude

So I recently bought a used video card from someone: an HD 3650 AGP, no less. I'm running an A8V Deluxe with an Acer AL1717 monitor and 2 GB of memory. Old system, but you gotta love the classics, right? The plan is to install Windows 7, but for the moment I'm stuck with XP because my old video card has no drivers for anything newer than XP.

So you'd think I could just plug in the new card, try it out in XP, and if it works, install Windows 7. Unfortunately the gods of hardware had different plans, and I got a black screen after rebooting my PC with the new video card.

It wasn't just a black screen, though; the monitor reported "no signal," which suggested the cable could be at fault. The thing is, the new card has two DVI outputs, while the old card has one DVI and one VGA output. My Acer monitor only has a VGA input, so with the old card I had always used a straight VGA cable.

So I had to use a DVI-to-VGA adapter for the new card. I tried this adapter on my old card's DVI output and still couldn't get the screen to work, so it seemed logical that the adapter was to blame. A busted DVI-to-VGA adapter? I guess it happens...

No, of course that doesn't happen! I got a brand-new DVI-to-VGA cable and it still didn't work. So then I started looking for other answers...

Long story short: it turns out my monitor won't accept anything from the video card except a native VGA signal. Something to do with the drivers. I found some articles about it on other sites, but those aren't written for XP and require installing third-party software. So I decided to post about it here; I figured someone might find this interesting and want to help, and that way I'd have a reliable source.

The official Acer website only offers a driver for my monitor that runs on Windows Vista.

The problem with my monitor is that it won't display any refresh rates other than those listed in its EDID data. So I need to override the EDID and force the correct refresh rate; otherwise the monitor will just keep saying it can't pick up a signal. Does anyone know anything about this? There are some other threads here that mention similar problems, but they have no answers yet. It would be nice to fix this once and for all.
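
For anyone who wants to dig into this: Windows caches the EDID it reads from the monitor in the registry, so you can inspect what the panel actually advertises. Here's a rough Python 3 sketch (the module is _winreg on Python 2; I haven't verified this on my own machine, and the registry layout is my assumption from what I've read) that walks the DISPLAY enum tree and prints the refresh-rate limits from the range-limits descriptor:

```python
# Rough sketch: walk HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY and print
# the refresh-rate limits each attached monitor advertises in its EDID.
import winreg

def find_edids():
    """Yield (device_path, edid_bytes) for every monitor instance found."""
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as mkey:
                for j in range(winreg.QueryInfoKey(mkey)[0]):
                    inst = winreg.EnumKey(mkey, j)
                    try:
                        with winreg.OpenKey(mkey, inst + r"\Device Parameters") as p:
                            edid, _ = winreg.QueryValueEx(p, "EDID")
                        yield model + "\\" + inst, bytes(edid)
                    except OSError:
                        continue  # instance with no stored EDID

def print_range_limits(edid):
    """The four 18-byte descriptors sit at bytes 54-125 of the 128-byte
    base block; tag 0xFD marks the monitor range limits descriptor."""
    for off in range(54, 126, 18):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            print("  vertical refresh : %d-%d Hz" % (d[5], d[6]))
            print("  horizontal scan  : %d-%d kHz" % (d[7], d[8]))
            print("  max pixel clock  : %d MHz" % (d[9] * 10))
            return
    print("  no range limits descriptor found")

for path, edid in find_edids():
    print(path)
    print_range_limits(edid)
```

If the limits look sane but the monitor still drops the DVI signal, that would point at the EDID handshake itself rather than the rates the card is driving.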
 
I've never experienced anything like that, and I've combined many old cards, motherboards, and monitors with new ones.

Might be something else, not the monitor.

But it's old, used hardware, so don't expect too much. Most likely the video card can't communicate with the monitor for whatever reason.
 

seriouslydude

It definitely isn't the card. Like I said, the monitor doesn't pick up the DVI signal from my old video card either, and the old card still works; I'm using it right now. The old card has DVI and VGA outputs; the new card only has DVI. I am certain it is a monitor issue. If you google "monitor won't display DVI VGA" or search Tom's Hardware for "EDID" you'll find plenty of info about it. Unfortunately everyone has their own story and their own solution, usually involving third-party software and messing around with factory settings, so I still need a solution that fits my setup.
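
If it does come down to overriding the EDID, the part those third-party tools actually do is small. Assuming you have a 128-byte EDID dump (e.g. saved from the registry sketch in my first post), here's a rough Python sketch that widens the vertical refresh range in the 0xFD descriptor and recomputes the checksum. The file names and the 56-76 Hz range are placeholders for illustration, not values read from my monitor:

```python
# Rough sketch: widen the vertical refresh range in a 128-byte EDID dump
# and recompute the checksum so the block stays valid.
def patch_vertical_range(edid, vmin, vmax):
    edid = bytearray(edid)
    if len(edid) < 128 or edid[0:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID base block")
    for off in range(54, 126, 18):
        if edid[off:off + 3] == b"\x00\x00\x00" and edid[off + 3] == 0xFD:
            edid[off + 5] = vmin   # minimum vertical refresh, Hz
            edid[off + 6] = vmax   # maximum vertical refresh, Hz
            break
    else:
        raise ValueError("no range limits descriptor to patch")
    # Byte 127 is chosen so the whole 128-byte block sums to 0 mod 256.
    edid[127] = (-sum(edid[0:127])) % 256
    return bytes(edid)

with open("al1717_edid.bin", "rb") as f:        # placeholder input file
    original = f.read()
with open("al1717_edid_patched.bin", "wb") as f:
    f.write(patch_vertical_range(original, 56, 76))
```

The patched block would still have to be applied through some override mechanism, e.g. a custom monitor INF on XP or the EDID override registry value on Vista and later; the patching itself is the easy part.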