Can't get output signal from DVI port on graphics card.

Sabathmandan

Reputable
Sep 11, 2014
I have a Sapphire R9 270X 4GB graphics card and a TV that I use as a monitor. The card has two DVI outputs (DVI-D and DVI-I, both dual link), a DisplayPort and an HDMI output, and my TV has VGA and DVI-I dual link inputs. I'm running Windows 7 64-bit and my hardware meets all of the card's requirements.

The problem is that when I connect my DVI cable I can see the BIOS and the Windows loading screen, and I can even boot into safe mode, but in normal mode there's no video output after the loading screen. I can hear my PC log in and it runs fine, but it won't output anything to my monitor. However, using the same DVI output port I do get a display through the little DVI-to-VGA converter that came with the card.

This is where I'm pulling my hair out, because I have uninstalled all the drivers and software for the card and reinstalled the latest drivers/software with no luck, and I have tried previous driver versions and even beta drivers with no luck. I have plugged in my old graphics card without installing anything and it works completely normally using the same cable and the same monitor. If I boot Windows in low-resolution mode I get the same problem. I have even remote-logged into my computer and tried adjusting every setting I could find to get this darn thing to work, and I've looked into adjusting the EEPROM settings with no luck!

I have been using this card for about a week over VGA and it runs any game or program fine, even when it's running close to its limit, with no problems. It's just getting the bloody thing to work over DVI, which I'm only trying to do because I want to run games and films in HD. I'm not sure whether it's the card itself or whether I'm missing something here, because despite being a computer technician I cannot figure this one out! Any help would be much appreciated. Thanks in advance.

Dan.
 

Wolfshadw

Titan
Moderator
"HD" or High Definition is only a product of resolution. It doesn't matter if the signal is analog (VGA) or digital (DVI).

As to your issue: when you've remoted into the box, have you tried changing the resolution AND the frequency? I know I had a lot of issues when I first tried to connect my HDTV to my HTPC. My HDTV's native resolution is 1366x768, and I could only get a steady image when I had the graphics card set to 1080i/30 Hz or something like that.
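
If you'd rather script the mode change than click through the driver panel while remoted in, a small Win32 program using ChangeDisplaySettings can force a specific resolution and refresh rate. This is just a minimal sketch; the 1920x1080 @ 30 Hz values are placeholders you'd swap for whatever your TV actually accepts.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Start from the mode the driver is currently using */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        printf("Could not read the current display mode\n");
        return 1;
    }

    /* Placeholder values: 1920x1080 at 30 Hz -- match these to what the TV accepts */
    dm.dmPelsWidth = 1920;
    dm.dmPelsHeight = 1080;
    dm.dmDisplayFrequency = 30;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_TEST asks the driver whether it will accept the mode without switching */
    if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
        printf("Driver rejected 1920x1080 @ 30 Hz\n");
        return 1;
    }

    /* Apply the mode for this session only (nothing is written to the registry) */
    if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL)
        printf("Mode applied\n");
    else
        printf("Mode change failed\n");
    return 0;
}

Because CDS_FULLSCREEN keeps the change temporary, a reboot (or running it again with the old values) gets you back to the previous mode if the TV can't sync.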

-Wolf sends
 

Sabathmandan

Reputable
Sep 11, 2014


Unfortunately yes, I've tried running it at every frequency and resolution it would let me choose. It wouldn't bother me so much, but my TV allows the DVI input to run at double the resolution the VGA will, and being a typical gamer nerd I would love to run it at its full potential. I also get quite bad interference over VGA, which is another reason I'm trying to go digital; I run my Xbox over HDMI/component and that runs fine, so I think it's just my VGA lead. Thanks for the quick reply though :)
 

Sabathmandan

Reputable
Sep 11, 2014
Update ... I think it's the drivers causing the problem, because I can start up in safe mode with no issue. I reinstalled Windows 7: same problem. Installed Windows 8 (heartbreaking, I know): same problem. Reformatted to Windows 8.1: same problem! The weird thing is that Windows' generic display driver lets me use the DVI-I port, but as soon as I install the drivers for my card, which unfortunately come with AMD's Catalyst Control Centre, that's where I start getting problems.

I'm really stuck now, lol. I can't keep it on the generic driver because of its limitations with hardware acceleration and DirectX, meaning I can't run games at high quality. I always had faith in AMD and hated Intel from the start, but at least their cards and drivers work.
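
One way to narrow it down further might be to dump the mode list the installed driver actually exposes and compare the output under the generic driver, under Catalyst over VGA, and under Catalyst over DVI. The sketch below just loops over EnumDisplaySettings for the primary display; the output format is my own choice.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Walk every mode the active display driver reports for the primary display */
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); i++) {
        printf("%4lu x %4lu  %2lu-bit  %3lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmBitsPerPel, dm.dmDisplayFrequency);
    }
    return 0;
}

If a mode you'd expect over DVI (say 1920x1080 at a sensible refresh rate) only goes missing once Catalyst is installed, that would point at the driver's EDID/mode handling rather than the cable or the TV.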