So I recently bought a used video card from someone. An HD3650 AGP, no less. I'm running an A8V Deluxe with an Acer AL1717 monitor and 2 gigs of memory. Old system, but you gotta love the classics, right? The idea is to install Windows 7, but at the moment I'm stuck on XP because my old video card isn't supported on anything newer than XP.
So you'd think I could just plug in the new card, try it out in XP, and if it works, install Windows 7. Unfortunately the gods of hardware had different plans, and I got a black screen after rebooting my PC with the new card installed.
It wasn't just a black screen, though. It was "no signal," which meant it could have been the cable. The thing is, the new card has two DVI outputs, while the old card has one DVI and one VGA output. And my Acer monitor only has a VGA input, so with my old card I had always used a straight VGA cable.
So I had to use a DVI-to-VGA adapter for the new card. I tried this adapter on my old card and still couldn't get my screen to work, so it seemed logical that the adapter was to blame. A busted DVI-to-VGA adapter? I guess it happens...
No, of course that doesn't happen! I got a brand-new DVI-to-VGA cable and it still didn't work. So then I started looking for other answers.
Long story short: it turns out my monitor can't handle the output from any video card other than a plain VGA one. Something to do with the drivers. I found some articles about it on other sites, but those aren't for XP and require installing third-party software. So I decided to post about it here; I figured maybe someone would find this interesting and want to help. That way I'd have a very reliable source.
The official Acer website only offers a driver for my monitor that runs on Windows Vista.
The problem with my monitor is that it won't display any refresh rates other than those listed in its EDID settings. So I need to override those EDID settings and force the correct refresh rate; otherwise my monitor will just keep saying it can't pick up a signal. Does anyone know anything about this? There are some other threads here that mention similar problems, but they have no answers yet. Would be nice to fix this problem once and for all.
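In case it helps anyone dig into this with me, here's a rough Python sketch of how I understand the EDID data the monitor reports. The bit layout for the "established timings" bytes and the block checksum come from the EDID 1.3 spec; the registry path in the comment is just where I believe Windows caches the blob, and I haven't verified that on XP, so treat it as an assumption.

```python
# Decode the "established timings" bytes (35-37) of a 128-byte EDID block,
# per the EDID 1.3 spec. On Windows the raw EDID is supposedly cached in the
# registry under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\...\Device
# Parameters\EDID (unverified assumption on my part).

# (byte index, bit position) -> advertised mode, per EDID 1.3.
ESTABLISHED_TIMINGS = {
    (35, 7): "720x400@70Hz",  (35, 6): "720x400@88Hz",
    (35, 5): "640x480@60Hz",  (35, 4): "640x480@67Hz",
    (35, 3): "640x480@72Hz",  (35, 2): "640x480@75Hz",
    (35, 1): "800x600@56Hz",  (35, 0): "800x600@60Hz",
    (36, 7): "800x600@72Hz",  (36, 6): "800x600@75Hz",
    (36, 5): "832x624@75Hz",  (36, 4): "1024x768@87Hz (interlaced)",
    (36, 3): "1024x768@60Hz", (36, 2): "1024x768@70Hz",
    (36, 1): "1024x768@75Hz", (36, 0): "1280x1024@75Hz",
    (37, 7): "1152x870@75Hz",
}

def checksum_ok(edid: bytes) -> bool:
    """All 128 bytes of a base EDID block must sum to 0 modulo 256."""
    return len(edid) >= 128 and sum(edid[:128]) % 256 == 0

def established_timings(edid: bytes) -> list:
    """Return the established-timing modes whose bits are set in the EDID."""
    return [name for (byte, bit), name in ESTABLISHED_TIMINGS.items()
            if edid[byte] & (1 << bit)]
```

My (possibly wrong) understanding is that if the mode the card hands out on boot isn't in this advertised list, the monitor refuses it with "no signal", which is why overriding the EDID would be the fix.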