Joedirt1389

Oct 14, 2010
Hey everybody! Last night I put together my new desktop. Every fan is running, all the lights are lighting up, and as far as I can tell everything on the desktop itself is working fine. I plugged in my mouse, keyboard, and monitor in preparation for turning it on, but when I powered it up, my monitor said "No input signal, check video cables." I checked all the cables running from the monitor into the back of the case and they all seemed fine, so I reseated a few and tightened them a little and tried again, and got the same exact message. So I called Hanns-G, the maker of the monitor, and they said I needed to change my resolution and my issue would be solved. I got another monitor and tried that, and nothing showed up on it either, so that's not it. As far as I know all the cables are connected securely and tightly. So what could my issue(s) be? And can I use my TV in place of my monitor to switch the resolution?
 

Paperdoc

Generally you can NOT use a TV, unless it is a newer model that has inputs for computer-generated signals like VGA, HDMI, or DVI.

We need more details here. First of all, does your motherboard have built-in video? To help, tell us exactly the make and model numbers for:
motherboard
video (card if any)
monitor

IF your setup is a mobo with built-in video PLUS a video card inserted into a PCIe (or PCI) slot, the normal default setting of the mobo BIOS is to use the built-in video and ignore the added video card. In that case you MUST plug the monitor into the video output connector on the mobo's back panel, and NOT into the video card's output connector. Then your monitor can display the signals being generated and allow you to enter the BIOS Setup screens to change which video output device is used.

No matter which signal source you are feeding to your monitor, read the monitor's manual carefully. Some monitors have two or three possible inputs: there could be VGA, HDMI, and/or DVI connectors and cables. The monitor probably has a menu system from which you select which input is in use. (Some will auto-detect which input port is receiving a signal, and some will not.) Then you need to know which type of signal the computer is feeding it. A lot of video display systems (built-in or add-on card) default to a VGA signal, and you can't get a DVI signal until your OS is installed and a driver for your video hardware has been installed in Windows and configured for that signal type.

So make sure you are using the correct output port for the initial default video signal being provided, and that you connect this to the monitor and set it to use that input port.