
Graphics card detected, but no display. Many hours spent trying to solve this problem

Posted in Graphics & Displays
June 24, 2013 10:59:27 AM

Hey Tom's Hardware,

Specs are:

Custom Built
Windows 7 64 bit
i5 3570k
2x 4gb Corsair 1600Mhz ram
Corsair HX650w Modular PSU
Hyper 212
AS Rock z77 Extreme4
Sapphire 2gb 7850HD

The situation:

I recently took apart and re-built my computer into a new case. Before the re-build everything was working fine. After the re-build I couldn't get any display output from my GPU.

I have tried:

GPU and Display drivers:
- Updating/reinstalling GPU drivers following this method:
http://www.overclock.net/t/988215/how-to-properly-unins...
- I have taken the GPU to Memory Express, and it worked in one of their test PCs.

PSU:
- Thought it might be the modular cable running from the PSU to the GPU, so I had Corsair send me a new PCI-E cable (one 6-pin connector on the PSU end, two 6+2-pin connectors on the GPU end).

BIOS:
- Resetting CMOS
- Updating BIOS
- Changing Primary video output to PCI-Express
- Disabling onboard video (apparently this is called disabling "multi display" in this motherboard's BIOS settings, under "North Bridge")

Motherboard:
- Trying a different PCI-E slot
- Currently have the computer built outside of the case in case of any shorts

Details:
If I have the primary video output set to PCI-E, and the monitor plugged into the GPU, there is no display. Monitor goes into power saving mode.

If I have the primary video output set to Onboard, and the monitor plugged into the Motherboard, there is display. During this, if I go into the BIOS, and have the GPU powered by the 6pin from the PSU, there IS a GPU detected in the BIOS system browser. When the computer starts and Windows is running, if I go to Device Manager -> Display Adapters, it does show a Radeon 7850HD, as well as Standard VGA Graphics Adapter.

If I have the same as above, except unplug the power from the GPU, there is no GPU detected in the BIOS, and there is nothing other than Standard VGA Adapter under Display adapters.

Summary:
No display from GPU. Onboard display output works fine. Have tried GPU in different computer, and it works.

I have spent many hours trying to solve this. Have looked through plenty of forums, and still no luck. I am running out of ideas. Any ideas or help would be greatly appreciated!

I will respond to questions quickly.

Thank you.

June 24, 2013 11:35:14 AM

Sounds like your motherboard may have been damaged in the move. Make sure you have all the standoff pegs in place, in case one is causing a short somewhere. That's really all I can think of: you know the card works and you know your monitor works, so it has to be the motherboard in my opinion, unless you can test another card on that board and rule out my theory.
June 24, 2013 11:38:50 AM

burdenbound said:
Sounds like your motherboard may have been damaged in the move. Make sure you have all the standoff pegs in place, in case one is causing a short somewhere. That's really all I can think of: you know the card works and you know your monitor works, so it has to be the motherboard in my opinion, unless you can test another card on that board and rule out my theory.


I currently have the computer built outside of the case, so a short caused by standoffs can't be the problem. I also think it might be the motherboard, but unfortunately have no other card to test with it.

Could it be the motherboard even if the card is technically being detected by the system?
June 24, 2013 11:55:20 AM

Could your monitor's frequency be out of sync with your card?
June 24, 2013 12:27:42 PM

vinhn said:
Could your monitor's frequency be out of sync with your card?


That could be it. I did switch to a 120Hz monitor after previously using a 60Hz one. How do I find out if this is the issue?
June 24, 2013 12:36:08 PM

ijova said:
This could be.. I did switch to a 120Hz monitor after previously using 60Hz. How do I find out if this is the issue?


Plug in your old monitor?
June 24, 2013 1:06:39 PM

burdenbound said:
Plug in your old monitor?


I can't believe I didn't try this earlier. I plugged in another, older monitor (I don't have my old one around anymore), and the display worked!

One problem, though: the monitor I want to use is a Samsung SyncMaster P2270.
It comes with a DVI-A to D-sub cable, which looks like this: http://imgur.com/Elo51FF
My GPU does not have a VGA output, so I connected the monitor with this cable and a VGA to DVI adapter. It worked, but the max resolution is 1680x1050, and the monitor is capable of 1920x1080.

If I use a regular DVI to DVI cable (http://i.imgur.com/ctiMpyq.jpg), I get no display.

Does anyone know why there would be no display with that DVI to DVI cable?

We're making progress! Thanks to everyone so far.

Update: I went to Device Manager -> Display Adapters -> Details tab, and the resolution instantly changed to 1920x1080. I didn't even press anything. Interesting.

Does anyone know if using this DVI-A to D-sub cable along with a VGA to DVI adapter is going to hinder performance in any way, with respect to response time, refresh rate, etc.?
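Side note for anyone hitting the same wall: one likely reason a plain DVI cable gives no picture at 120Hz is bandwidth. Single-link DVI tops out at a 165 MHz pixel clock, so 1920x1080 at 120Hz needs a dual-link DVI-D cable (and a dual-link port on the card). A rough Python sketch of the arithmetic; the 25% blanking overhead here is an assumed ballpark, not an exact CVT timing:

```python
# Rough check of whether a video mode fits within single-link DVI's
# 165 MHz pixel-clock limit. The 1.25x blanking overhead is an
# assumed approximation; real timings (CVT, CVT-RB) differ somewhat.
SINGLE_LINK_LIMIT_HZ = 165_000_000

def pixel_clock(width, height, refresh_hz, blanking=1.25):
    """Approximate pixel clock in Hz for a given display mode."""
    return width * height * refresh_hz * blanking

def fits_single_link(width, height, refresh_hz):
    """True if the mode fits within single-link DVI bandwidth."""
    return pixel_clock(width, height, refresh_hz) <= SINGLE_LINK_LIMIT_HZ

for w, h, hz in [(1920, 1080, 60), (1920, 1080, 120)]:
    verdict = "single-link OK" if fits_single_link(w, h, hz) else "needs dual-link"
    print(f"{w}x{h} @ {hz}Hz: {verdict}")
```

So 1080p at 60Hz squeaks under the single-link limit, but 120Hz roughly doubles the required pixel clock, which is why the cable and port both need to be dual-link DVI-D for this monitor to run at full speed digitally.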

June 24, 2013 2:14:04 PM

ijova said:
If I use a regular DVI to DVI cable (http://i.imgur.com/ctiMpyq.jpg), I get no display.

Does anyone know why there would be no display with that DVI to DVI cable?

Does anyone know if using this DVI-A to D-sub cable along with a VGA to DVI adapter is going to hinder performance in any way? With respect to response time, refresh rate, etc?

Have you tried a different DVI cable? The one you are using might be faulty. Also, make sure it's the correct type for that monitor.