I know that for a computer connection you can only convert the signal one way without the use of a box: if the signal leaving the computer is digital, it can be converted to analog without a problem. But I'm writing because I'm wondering why...
What I learned in school was that an analog signal is a wave, and a digital signal is basically just a stream of yes-and-no values, denoted by either a "1" (meaning yes) or a "0" (meaning no).
They said it would be really easy for an analog signal to be converted into a digital one, because you can just "round the wave signal over a specified time interval to create either a 1 or a 0," which makes it digital. BUT, with that in mind, they said that to create an analog signal from a digital one, you would have to synthetically create analog waves from the 1s and 0s the digital signal provides. They said that would be extremely difficult and would require a box to do.
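Just to make sure I'm describing that "rounding" idea correctly, here's a minimal sketch of it in Python. This is only my own illustration of what I think they meant (the sine wave, the 1-bit threshold, and the function names are my assumptions, not anything from class): sample the analog wave at fixed intervals, then round each sample against a threshold to get a stream of 1s and 0s.

```python
import math

def sample_wave(freq_hz, sample_rate_hz, n):
    """Sample a sine wave (standing in for the 'analog' signal) at fixed intervals."""
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate_hz) for i in range(n)]

def adc_1bit(samples):
    """The 'rounding' step: threshold each sample to a 1 block or a 0 block."""
    return [1 if s >= 0 else 0 for s in samples]

# One cycle of a 1 Hz wave, sampled 8 times per second.
wave = sample_wave(1.0, 8.0, 8)
bits = adc_1bit(wave)
print(bits)  # the first half-cycle rounds to 1s, the second to 0s
```

Going the other direction (digital back to analog) would mean reconstructing a smooth wave from just that bit stream, which is the part they said needs the converter box.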
All of this checks out if you look at local broadcast TV, for example. When you try to get the now-digital TV broadcast on your old analog TV, you have to get a converter box to take the digitally aired signal and create a synthetic analog signal for your old piece of equipment.
BUT with computers it's the exact opposite for some reason! With a computer you can take the digitally created signal from your laptop's video card and make it into an analog monitor signal with a $4 cable and no converter box! I just don't get it, but any help would be greatly appreciated.
Sorry for the lengthy and complicated post, but it's been driving me nuts for years and I can't ask my professor anymore :/
Thanks again!