Comp to displays with TV coax out - quality over distance?
I want to run the coax TV-out from a card in a computer and have it split to two monitors, one 125 feet away and the other 50 feet away. One person told me they thought I would need an amplifier that would cost over $1,000.00. Any suggestions?
Thanks in advance!
Just to be clear, I am going to be running a PowerPoint presentation or some other scheduling software (this will be showing the schedule of events in each of our arenas along with some still advertising) that I want to output to two 42" LCD TVs, one 50-75 feet away and the other around 150 feet away. It will not be just a TV signal. I was told by Best Buy that I will lose too much signal over that length and my picture could be compromised. I have found coax amps, but they do not specify whether they will work with the same dB signal level that the computer puts out.
I don't mean to question you; I just wanted to be clear about what it is I am trying to do. I am making a presentation to our board of directors (not-for-profit) and want to be sure I have all my ducks in a row.
Just what is the nature of the signal you have available coming out of the computer? Is it a modulated TV channel signal - like, on channel 3 or 4 - that is fed through typical Cable TV coax to the tuner input of a TV? That would be easy because the equipment is widely available. To do this I would suggest the simplest system would be three very short pieces of TV coax, one 2-from-1 signal splitter, two Cable TV amplifiers - one 10 dB, one 20 dB - and two long Cable TV coax pieces - one about 75 feet, one about 150 feet. For all those cable pieces, don't go with the cheaper RG59 cable type; if you can, get the better RG6 cable, which has much lower signal loss and better noise shielding. The splitter and amps, though, do NOT need to be high-quality units rated for satellite (bandwidth over 1 GHz), because all you're sending is a signal in the 60 to 70 MHz region.
From the video card's TV Output connector, run one short piece of cable to the splitter input. From each of its outputs, run a short piece to the input of one of the two amplifiers. Now run the shorter (75 foot) cable from the 10 dB amp's output to the closer TV; run the longer cable from the output of the more powerful 20 dB amp. What we're doing here is anticipating that the signal loss in the long cables will be around 10 dB per 100 feet. The splitter "costs" you 3 dB. So, for example, on the shorter run you have the video card's output reduced (at the splitter) by 3 dB, then boosted by 10 dB, and then reduced along the 75 feet by somewhere in the 5 to 10 dB range. It arrives at the closer TV not much different from the video card's output. A similar argument applies on the longer run. The key is that the amp on each leg is at the INPUT end of the cable - the signal is boosted before it degrades in the long cable run.
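If it helps for the board presentation, that dB budget can be written out as a quick calculation. This is only a sketch using the figures I assumed above (3 dB splitter loss, roughly 10 dB of cable loss per 100 feet at this frequency); real splitters, amps, and cable will vary, so test the actual hookup.

```python
# Rough link budget for the two runs, relative to the video card's output.
# These loss figures are assumptions from the discussion above, not measurements.

SPLITTER_LOSS_DB = 3.0          # typical 2-way splitter insertion loss
CABLE_LOSS_DB_PER_100FT = 10.0  # assumed worst-case loss for the coax run

def net_gain_db(amp_gain_db, cable_len_ft):
    """Net signal change (dB) through splitter -> amp -> long cable."""
    cable_loss = CABLE_LOSS_DB_PER_100FT * cable_len_ft / 100.0
    return -SPLITTER_LOSS_DB + amp_gain_db - cable_loss

short_run = net_gain_db(amp_gain_db=10, cable_len_ft=75)
long_run = net_gain_db(amp_gain_db=20, cable_len_ft=150)

print(f"Short run: {short_run:+.1f} dB")  # Short run: -0.5 dB
print(f"Long run:  {long_run:+.1f} dB")   # Long run:  +2.0 dB
```

Both legs land within a couple of dB of the original output level, which is the whole point of putting the amp at the input end of each long run.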
By all means, set this up ahead of time and verify that you get good results at each TV. Be VERY aware, however, that a standard broadcast TV signal does NOT have the resolution of a computer monitor. It is, at best, comparable to the old VGA 640 x 480 resolution. That is just the nature of broadcast TV with a bandwidth of about 4 MHz. So do not plan to show any graphics with fine details or small text. TV is great, though, for things that move and don't need stationary fine detail.
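To put a rough number on why a ~4 MHz broadcast signal tops out near VGA detail, here is a back-of-envelope estimate. The timing numbers (52.6 microseconds of visible scan line, 480 visible lines) are standard NTSC figures I'm assuming, not something from this thread, and the "2 pixels per cycle" rule is only an approximation.

```python
# Estimate of horizontal detail a bandwidth-limited analog TV signal can carry.
# Assumed NTSC timing: ~52.6 us of active (visible) time per scan line,
# 480 visible lines per frame. One cycle of bandwidth resolves ~2 pixels.

BANDWIDTH_HZ = 4.2e6      # roughly the ~4 MHz broadcast video bandwidth
ACTIVE_LINE_S = 52.6e-6   # visible portion of one scan line
VISIBLE_LINES = 480       # visible scan lines per frame

h_pixels = 2 * BANDWIDTH_HZ * ACTIVE_LINE_S
print(f"~{h_pixels:.0f} x {VISIBLE_LINES}")  # ~442 x 480
```

That works out to roughly 440 x 480, i.e. VGA-class resolution at best, which is why fine text will not survive the trip.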
If this is your setup, when you test it out, you COULD run into signals that are too strong if I've over-estimated the amplifier power needed. A weak signal at a TV will have random spots of wrong colors or white "snow" in it. A too-strong signal will have almost none of this, but may show distortions on part of the picture. In severe cases, I've seen a REALLY too strong signal actually show up as a terribly snowy picture. If you suspect this is a problem, simply remove the amplifier and its short input cable on the shorter line, and feed that TV by just plugging its cable directly into the splitter. If you get a better picture that way, leave it. Then look at the other TV. If it has a similar problem, try substituting the 10 dB amp in its feed instead of the 20 dB amp. Do whichever gives you the best result.
OK, but suppose you want much better resolution than that on your TVs. THAT is when you get into more expensive (and much harder to find) cables and amps. To get computer monitor resolutions like 1280 x 960, or a current wide-screen resolution like 1680 x 1050, you have to start with signals like that from the video card - in this case, two signals. So you either need a video card with dual digital outputs, or some fancy digital signal amplifier / duplicator. Then somehow you need to amplify those signals and send them down a very long, expensive multi-conductor cable. This is definitely the territory of computer graphics specialists and conference facility organizers. I expect you do NOT plan to do this; just realize you won't get that high-quality performance with TV cables.
The intermediate possibility is to use the Composite Video output of some video cards along with the Stereo Audio output. This is the familiar cable with three lines and RCA connectors on the ends - Yellow for Composite Video, White for Left Audio, and Red for Right Audio. Each of these three cable pieces is also a type of coaxial cable, but different from the RG6 75-ohm cable used for Cable TV. This system is NOT designed for very long cable runs, so you'll never find a 3-cable setup 150 feet long! I only mention it because Composite Video has better bandwidth than broadcast TV, so the resolution you get on a decent TV this way is more like XGA's 1024 x 768, maybe slightly better. If you judge that the TV Cable signal won't be good enough, ask around for advice on whether you could make long cable runs with Composite Video and Stereo Audio. Maybe some group specializing in conferences or concerts has a way to help, but it won't be cheap. However, using their expertise and renting equipment would be better than buying it and trying to go this route all by yourself.