New HDTV won't detect any signal from my PC's DVI port

mh321

Hello, I just bought a Sony 55W800C TV, and I cannot get it to recognize the signal from the computer I use for movies and videos, which I'll call "Computer A" from now on.

Computer A's specs: GeForce FX 5500 PCI (VGA, S-Video, and DVI connectors on the back), Pentium 4 @ 3 GHz, Windows XP.

My old TV, which broke, had a VGA connector on the back, and worked perfectly with Computer A. Computer A has a DVI connector, and is connected to the new TV's HDMI jack using the appropriate cable.

When I select the HDMI input that Computer A is connected to, the TV says "No signal". The computer correctly identifies the TV as a Sony (visible using another VGA monitor in dual-screen mode), along with the correct resolution the TV accepts (1080p, 1920x1080). The DVI output appears to be active and outputting 1080p to the TV.

I have tried almost every resolution and monitor configuration in the Nvidia settings, and I cannot get the TV to recognize any signal from Computer A, not even the BIOS screen at startup. As mentioned earlier, Computer A recognizes the TV.
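For reference, one way to see what Windows and the driver actually report for each attached output is to enumerate the display devices and their modes. The sketch below is only a rough example, assuming Python with pywin32 is installed on the machine; the device names it prints will differ from system to system, and it only shows what the driver thinks it is sending, not whether the TV accepts it.

# Rough sketch: list the display devices Windows knows about and the
# modes the driver reports for each one (needs Python with pywin32).
import win32api
import win32con

dev_num = 0
while True:
    try:
        device = win32api.EnumDisplayDevices(None, dev_num)
    except win32api.error:
        break  # no more display adapters
    print("%s: %s" % (device.DeviceName, device.DeviceString))

    # Current mode, if this output is active at all.
    try:
        cur = win32api.EnumDisplaySettings(device.DeviceName,
                                           win32con.ENUM_CURRENT_SETTINGS)
        print("  current: %dx%d @ %d Hz"
              % (cur.PelsWidth, cur.PelsHeight, cur.DisplayFrequency))
    except win32api.error:
        print("  (not active)")

    # Every mode the driver claims this output supports
    # (entries repeat for different color depths).
    mode_num = 0
    while True:
        try:
            mode = win32api.EnumDisplaySettings(device.DeviceName, mode_num)
        except win32api.error:
            break
        print("  %dx%d @ %d Hz"
              % (mode.PelsWidth, mode.PelsHeight, mode.DisplayFrequency))
        mode_num += 1

    dev_num += 1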

I have tried a few other computers with the TV:

Computer B:
Pentium 4, Windows XP, Radeon graphics card with DVI and VGA out.
The TV will not take any signal from Computer B's DVI out. I connected Computer B to a 1080p computer monitor with an HDMI input, and the monitor detected and displayed Computer B's output perfectly, even while booting.

Computer C:
Dell Inspiron Laptop with DVI and VGA out, running Windows 7.
TV detects and displays the DVI output of Computer C.

Computer D:
FX-8350 running Windows 7, AMD HD 7900 series GPU.
TV detects and displays the DVI output of Computer D.


I cannot understand why the TV won't recognize the DVI outputs of Computers A and B when both are outputting the right resolution to the TV, or why a computer monitor accepts Computer B's DVI output when the TV doesn't.

Any ideas?
Thanks
 
The older GPUs use older display tech. The card may not be putting the right signals on the right pins, the output may be weaker than the TV can detect, or the cable you have may be blocking the signal to the TV. If you can, try a better cable and see if that works; if not, see if someone has a newer GPU you can use for testing.
 

mh321

I used the exact same cable for all tests; it confuses me how it would block one signal but not another. I do not have another cable, or a newer PCI GPU than the FX 5500, at home. Computer A has no PCIe or AGP slot; it is PCI only. I think Computer B might have one PCIe slot, but its CPU and PSU may be too weak for the PCIe cards I have.
Computer C is from about the same era as A and B (2004-2006).
Is it the old GPUs that are the problem then?
Thanks
 

mh321

Is it true that HDCP only applies when using a source with HDCP? I thought HDCP only activates if the source content is protected and you try to connect it to a destination that doesn't support HDCP.
 
The HDCP handshake may end with an "unprotected content" result, but if the source (your graphics card) doesn't respond at all, the TV may reject the connection. That's not how it should work, but some devices behave this way. Look for a firmware update for the TV; it might solve this and other issues.
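Purely as an illustration of that handshake behaviour (this is not real HDCP code; the function name, inputs, and return strings are invented), the sink-side logic being described is roughly:

# Illustrative sketch of the handshake fallback described above.
# NOT a real HDCP implementation; names and return values are invented
# purely to show the decision logic.

def sink_reaction(source_reply):
    """What a TV might do based on the source's reply to an HDCP query."""
    if source_reply == "hdcp_ok":
        return "display protected content"     # normal HDCP handshake
    if source_reply == "hdcp_unsupported":
        return "display unprotected content"   # the fallback that should happen
    # Some sinks treat no reply at all (e.g. an old DVI card that never
    # answers the query) as a failed handshake instead of falling back.
    return "reject connection"

print(sink_reaction(None))  # silent source -> "reject connection"

A firmware update can change how the TV handles that last case, which is presumably why it is worth checking for one.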