DVI problem

mac master

I have a problem when I plug the DVI cable from the computer into the monitor: the screen comes up with a 'no signal' message. When I connect it through a DVI-to-VGA converter (from the computer) and then via VGA to the monitor, it works fine. This all started when I left the computer on for about 8 hours straight (which wasn't the first time I had left it on); when I came home and switched on the monitor, the screen did not come up. Below are my computer specs:

TFT: 19” widescreen, with DVI and analogue (VGA) connectors
Hard disk: 500 GB
Memory: 2 GB
OS: Windows Vista
Graphics card: Nvidia 8800 GTX (which has 2 DVI outputs)
Motherboard: Foxconn
PSU: 550 W

I spoke to the guy I bought the computer from and he tested every part for me (TFT, motherboard, graphics card, PSU and the DVI cable itself) and did not find anything wrong with any of the parts themselves. I am really stuck with this one! I also checked whether I had a silly setting on the monitor that made it only accept input from the VGA connector. But even after I set it to DVI mode and plugged in the DVI cable, it still didn't come up, and when I put the VGA cable in it came up by itself, so the monitor picks up whichever input is active and displays it.

If any of you guys have encountered this problem, please let me know, because this computer is only about 2 months old and I hate to see it in this state!

Thanks in advance.
 

cyberjock

A little background on DVI. DVI is a digital signal transferred over a wire (your cable) to an endpoint (your monitor). When a signal is sent over a wire you have inductive losses that play a part in distorting your signal. This is regardless of VGA, DVI, whatever. All signals degrade over cables.

So what makes DVI better? Simple. The endpoint device knows these losses occur, and it uses the signal it receives to 'reconstruct' what the intended signal was and outputs that. This works if your signal is good (good cabling and shielding) and/or your monitor has a good controller for reconstructing the original signal. Unfortunately, if neither is present, the end result is no output.

Judging from what you wrote, it sounds like you've tested everything and everything seems to work, yet when you put it all together it doesn't. That sounds like the kind of problem DVI brings to the table. I'd recommend you try a spare monitor (using all of your other components, over DVI) to see if you can get an output on another monitor. If you do, then you are a victim of signal degradation, and the only way to fix it is to get another DVI cable. I had a DVI cable that suddenly started swallowing the signal to my monitor after working fine for months. The computer was rebooted, and poof, when it was supposed to start Windows all I got was 'no signal'. Replacing the DVI cable fixed it. Even now I find it hard to believe, yet I saw the results with my own eyes.

So try another monitor with your equipment and if possible try another DVI cable. Quality does matter for DVI cables.
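
To illustrate the all-or-nothing behaviour described here, a toy sketch (purely illustrative, not a model of real TMDS signalling): a receiver that recovers each bit by thresholding a noisy two-level signal recovers the stream perfectly until the noise approaches the level spacing, after which the error rate climbs steeply.

```
# Toy illustration of the "perfect until a threshold, then nothing" behaviour
# of a digital link (not a model of real TMDS signalling): bits are sent as
# two voltage levels, noise is added, and the receiver recovers each bit by
# thresholding. Recovery stays essentially error-free until the noise becomes
# comparable to the level spacing, then the error rate climbs steeply.
import random

def bit_error_rate(noise_sigma, n_bits=20000):
    errors = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)
        sent = 1.0 if bit else -1.0                 # two-level signal
        received = sent + random.gauss(0, noise_sigma)
        recovered = 1 if received > 0 else 0        # threshold detector
        errors += (recovered != bit)
    return errors / n_bits

for sigma in (0.1, 0.2, 0.3, 0.5, 0.7, 1.0):
    print(f"noise sigma {sigma:.1f}: bit error rate ~{bit_error_rate(sigma):.4f}")
```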
 

mac master

I really thank you for your fine reply, and for the background on DVI (I didn't know that :)). But as I said above, I have already tried another TFT (a different make as well) and another DVI cable. I really am lost with this problem :(
 
Do you have another graphics card, or a friend with one that has a DVI port? I'm beginning to wonder if the port on the card is bad. But then again, that wouldn't make a whole lot of sense if the converter (DVI-VGA) works. My, you have found yourself a strange problem. Sorry I'm not much help; I thought I was on to something and then I re-read your post more carefully.
 


Yeah, like I said, I re-read your post and realized my theory was debunked as I was typing it. I know how frustrating this must be, though.
 

firemist

Connecting the monitor to the computer with a DVI-VGA converter is using the analog connection. Connecting the monitor to the computer with DVI could be either analog or digital depending on graphics board, cable, and monitor. The DVI spec also supports dual link.

A DVI connection can be one of three types: DVI-I, DVI-D or DVI-A.

DVI-I:
DVI-I contains both the digital and analog connections (DVI-D + DVI-A); it's essentially a combination of DVI-D and DVI-A cables within one cable.

DVI-D:
DVI-D (like DFP or P&D-D (EVC)) is a digital-only connection. If both devices being connected support a digital DVI connection (DVI-I or DVI-D compatible) and are compatible in resolution, refresh rate and sync, using a DVI-D cable will ensure that you are using a digital connection rather than an analog one, without having to play around with settings to make sure of it.

DVI-A:
DVI-A is really rare: why use a DVI connector when you can use a cheaper VGA connector? (See DVI-I.) P&D-A (EVC) is more common with projectors, and you should go to your projector manufacturer for recommendations.
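
As a quick summary of the three connector types, a small sketch of the compatibility logic (the helper names are made up for illustration, not any real API):

```
# A small sketch of the compatibility logic above (helper names are mine, not
# any real API): a digital link needs digital pins at the card, in the cable,
# and at the monitor; an analogue link needs the analogue pins all the way
# through (which is what a DVI-to-VGA adapter taps into).

CARRIES = {
    "DVI-I": {"digital", "analog"},
    "DVI-D": {"digital"},
    "DVI-A": {"analog"},
}

def possible_links(source, cable, sink):
    """Which kinds of link (digital/analog) can this combination establish?"""
    return sorted(CARRIES[source] & CARRIES[cable] & CARRIES[sink])

# The failing path in this thread: a DVI-I card output, a DVI cable, a DVI-I
# monitor input, running digitally; the working path is the analogue one.
print(possible_links("DVI-I", "DVI-D", "DVI-I"))  # ['digital']
print(possible_links("DVI-I", "DVI-I", "DVI-I"))  # ['analog', 'digital']
```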

Dual Link: two TMDS (transition-minimized differential signaling) links. DVI can have up to two TMDS links; each link has three data channels for RGB information with a maximum bandwidth of 165 MHz, which is equal to 165 million pixels per second. Dual-link connections provide bandwidth for resolutions up to 2048 x 1536.
Single Link: one TMDS link, with the same three data channels for RGB information and the same 165 MHz maximum per link. A single-link connection supports resolutions over 1920 x 1080 at 60 Hz (HDTV).
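
To put the 165 MHz per-link figure in perspective, a rough back-of-the-envelope check of which common 60 Hz modes fit in one link (the ~20% blanking overhead is an assumption; exact VESA timings vary by mode):

```
# Rough check of which 60 Hz modes fit in a single 165 MHz TMDS link.
# The 20% blanking overhead is an approximation; exact VESA timings differ.

SINGLE_LINK_MHZ = 165.0  # maximum pixel clock per TMDS link

def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking=0.20):
    # total pixels per frame (active area plus assumed blanking) times refresh
    return h_active * v_active * (1 + blanking) * refresh_hz / 1e6

for w, h in [(1280, 1024), (1920, 1080), (2048, 1536)]:
    clk = pixel_clock_mhz(w, h, 60)
    need = "single link" if clk <= SINGLE_LINK_MHZ else "dual link"
    print(f"{w}x{h} @ 60 Hz -> ~{clk:.0f} MHz pixel clock ({need})")
```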

The information from cyberjock is misleading. DVI is an improvement because it includes digital video in the specification (a much better signal-to-noise ratio than analog). The statement "The endpoint device realizes these losses, and it takes the signal it receives to 'reconstruct' what the intended signal was and outputs that" is just wrong: the signal is either readable or not, and the device has no idea what the intended signal was.

 

mac master

OK, thanks for the information, but how does that help with my post? In my post I said the computer had been working with the DVI cable for about a month and then it just stopped working, i.e. compatibility issues are not the problem.
 

firemist

Did the guy who tested the components for you test both the analog and digital paths? There is not much involved here, especially since it was working and now it is not: either the graphics card, cable, or monitor does not work on the connection you are using. Now that you know there is more than one signal path to the monitor, you need to identify which one it is using and test that one.

Were you there when the testing was done, and did he test all the paths?
 

mac master

I know it sounds simple, but it's so frustrating not knowing the cause! What he told me is that he tested from DVI (computer) to DVI (monitor) and, with a converter, VGA (computer) to VGA (monitor). He tested two monitors, with both DVI and analogue connections on each. The cable was changed as well, and even the graphics card.

Another thing is that I also tested it with my Mac (which has a DVI connection) and it worked! So the monitor and the cable are fine, and the graphics card was also changed. Am I that unlucky that the tested graphics card was also faulty?? :s
 

firemist

Were all of your own parts used in the test: the graphics card, cable, and monitor? If so, were the components connected the same way you have them?

From your first post it sounds like the analogue path worked (VGA) but the digital path did not (DVI to DVI), and that path is worth rechecking. Do both DVI ports on the graphics card behave the same? Since the DVI connector has both digital and analogue signals present, it can be confusing as to which set of pins is being used.
 

cyberjock

/cough

http://forum.ecoustics.com/bbs/messages/34579/125179.html

To quote the article:

When Signals Go Bad

One of the interesting distinctions between digital and analog signals is that they degrade in rather different ways. Both are electrical signals, carried by a stream of electrons in a wire, and so both are subject to alteration by the electrical characteristics of the cable and by the intrusion of outside electrical noise. But while the alteration of an analog waveform is progressive and continuous--the more noise is introduced, the more noise will come out of our speaker along with the tone--the digital signal suffers alteration quite differently.

First, a digital signal, because of its sharp transitions, is highly subject to degradation in its waveform; those sharp transitions are equivalent to a long--indeed, an infinite--series of harmonics of the base frequency, and the higher the frequency of the signal, the more transmission line effects, such as the characteristic impedance of the cable, and resulting signal reflections ("return loss") come into play. This means that while the signal may originate as a square wave, it never quite arrives as one. Depending on the characteristic impedance of the cable, the capacitance of the cable, and the impedance match between the source and load devices, the corners of the square wave will round off to a greater or lesser degree, and the "flat" portions of the wave will become uneven as well. This makes it harder for the receiving circuit to accurately identify the transitions and thereby clock the incoming signal (causing the phenomenon known as "jitter"). The more degradation in the signal, the harder it is for the receiving device to accurately measure the content of the bitstream.

Second, a digital signal, because of the way its information is contained, can be quite robust. While the signal will always degrade to some degree in the cable, if the receiving circuit can actually reconstitute the original bitstream, reception of the signal will be, in the end analysis, perfect. No matter how much jitter, how much rounding of the shoulders of the square wave, or how much noise, if the bitstream is accurately reconstituted at the receiving end, the result is as though there'd been no degradation of signal at all.

The result is that digital signals can be quite robust; they can exhibit no functional degradation at all up to a point. But the difference between perfect rendering of a digital signal and total loss of signal can be surprisingly small; one can reach a threshold where the digital signal begins to fall apart, and not long after that threshold, find that there is no signal at all. The signal which gets through flawlessly over several hundred feet may be unable to get through at all, even in a damaged condition, when the cable run is lengthened by another fifty feet.

How soon this threshold is reached depends a great deal upon the signal, and upon the tolerances of the cable in which it is run. The higher the bitrate, the more difficult it is to maintain reliable digital communication; the problem is that as the bitrate increases, the frequencies a cable must carry increase, and as frequency increases, the wavelength correspondingly decreases. The shorter the wavelength, the more likely it is that a cable of any given length, especially one close to a large fraction (1/4 wavelength is often considered a benchmark) of the wavelength will start to play a significant role in signal degradation. As this happens, the characteristic impedance of the cable becomes increasingly important. The degradation of the digital waveform will depend directly upon the impedance match between the source, the cable, and the load.
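
Putting rough numbers on that quarter-wavelength benchmark for single-link DVI (TMDS carries 10 bits per pixel per channel, so a 165 MHz pixel clock means roughly 1.65 Gbit/s per channel; the 0.66 cable velocity factor below is an assumed typical value):

```
# Rough numbers behind the quarter-wavelength benchmark at DVI speeds.
# Assumptions: 165 MHz single-link pixel clock, 10 bits per pixel per TMDS
# channel (so ~1.65 Gbit/s per channel), fundamental of an alternating bit
# pattern = half the bit rate, cable velocity factor ~0.66 (assumed typical).
C = 3.0e8                # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66   # assumed for a typical cable dielectric

pixel_clock_hz = 165e6
bit_rate = pixel_clock_hz * 10        # bits per second per channel
fundamental_hz = bit_rate / 2         # ~825 MHz

wavelength = C * VELOCITY_FACTOR / fundamental_hz
print(f"wavelength in cable ~{wavelength * 100:.0f} cm, "
      f"quarter wave ~{wavelength * 25:.0f} cm")
# Around 24 cm and 6 cm respectively: even a 1-2 m DVI cable is several
# wavelengths long, so impedance and cable quality genuinely matter here.
```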

And just to clarify DVI-A/D/I: I've always ended up, coincidentally, buying video cards with DVI-I. Monitors will choose DVI-D first. As for cabling, that's a different ballgame, because I've seen different cables. I'm not sure you could even buy a monitor with DVI-A; I'm not sure I've ever seen one.

The readability of the signal is whether the signal received IS able to be reconstructed.

readable = able to be reconstructed
not readable = not able to be reconstructed

They are one and the same, not two different things, as I think you are assuming.