Connecting PC to LCD TV

September 4, 2007 7:04:22 PM

Hi Guys,

I'm having a bit of trouble connecting my PC to my LCD TV.
I'm pretty new to this, so bear with me.

My TV has an HDMI connection, for which I've got an HDMI-to-HDMI cable, and the graphics card in my PC has a DVI output, for which I've got a DVI-to-HDMI adapter. I've read somewhere to be careful about what resolution you send down in case you break the TV, and I was unsure how to set up the connection (resolution, frequency, etc.) and whether there are any settings on the PC I might need to change, in Control Panel or elsewhere, for the picture to show up on the TV and not on my monitor.

Samsung LER74
Nvidia GeForce 7300 LE

Thanks

September 4, 2007 7:44:08 PM

Your graphics card and TV should work together at the TV's native resolution, but that's not always the case. Set it to any resolution below the TV's native resolution and you won't hurt a thing; then connect the TV and try bringing it up to the native resolution.
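
If you want to check which modes the card will actually offer before you experiment, Windows exposes the mode list through the Win32 call EnumDisplaySettingsW. Here is a rough sketch in Python with ctypes (illustrative only; the truncated DEVMODEW below is a deliberate shortcut that defines just the display-related fields):

```python
import ctypes
from ctypes import wintypes

# Truncated DEVMODEW: only the fields up to dmDisplayFrequency,
# which is all EnumDisplaySettingsW needs for display modes.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName",         wintypes.WCHAR * 32),
        ("dmSpecVersion",        wintypes.WORD),
        ("dmDriverVersion",      wintypes.WORD),
        ("dmSize",               wintypes.WORD),
        ("dmDriverExtra",        wintypes.WORD),
        ("dmFields",             wintypes.DWORD),
        ("dmPositionX",          ctypes.c_long),
        ("dmPositionY",          ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor",              ctypes.c_short),
        ("dmDuplex",             ctypes.c_short),
        ("dmYResolution",        ctypes.c_short),
        ("dmTTOption",           ctypes.c_short),
        ("dmCollate",            ctypes.c_short),
        ("dmFormName",           wintypes.WCHAR * 32),
        ("dmLogPixels",          wintypes.WORD),
        ("dmBitsPerPel",         wintypes.DWORD),
        ("dmPelsWidth",          wintypes.DWORD),
        ("dmPelsHeight",         wintypes.DWORD),
        ("dmDisplayFlags",       wintypes.DWORD),
        ("dmDisplayFrequency",   wintypes.DWORD),
    ]

def list_modes():
    """Print every width x height @ refresh rate the driver offers."""
    user32 = ctypes.windll.user32
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i, modes = 0, set()
    # Passing None queries the primary display device.
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
        i += 1
    for w, h, hz in sorted(modes):
        print(f"{w}x{h} @ {hz} Hz")

if __name__ == "__main__":
    list_modes()
```

Anything in that list at or below the TV's native resolution is a safe starting point.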
September 14, 2007 1:12:53 PM


No luck... I can't seem to access the PC, HDMI, composite, or S-Video options in my TV's menu; they're all greyed out, supposedly because it can't find a source for them.
Not sure what else to do...
I've tried using a straight VGA connection, but that didn't work either. Should the TV automatically pick up the source?
I tried it at the native resolution (1360x768, 60 Hz) but nothing was picked up. Do you know if there is some 'add or detect a new display' sort of option, or shouldn't that be necessary?


September 14, 2007 2:03:05 PM

You don't have to worry about hurting your TV when it is connected to your computer via the HDMI connection. The TV sends EDID data to your computer telling it which resolutions it will accept, and the computer won't let you send anything else. Note that you have to make the connections, turn on your TV, and then boot the computer, in that order. If your TV doesn't show an image, I would suspect that your DVI-to-HDMI adapter might be bad.
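
For the curious: EDID is just a 128-byte block the display returns over the cable's DDC lines, and the native ('preferred') mode lives in the first 18-byte detailed timing descriptor at offset 54. A rough sketch of pulling the native resolution out of a raw dump, assuming you already have the bytes from an EDID reader utility (reading them is OS-specific):

```python
def native_resolution(edid: bytes):
    """Extract the preferred mode from the first detailed timing
    descriptor of a 128-byte EDID base block."""
    assert len(edid) >= 128, "need the full 128-byte base block"
    assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "bad EDID header"

    dtd = edid[54:72]  # first 18-byte detailed timing descriptor
    # A pixel clock of 0 would mean this slot holds text, not a
    # timing; the preferred mode always has a real clock.
    pixel_clock_10khz = dtd[0] | (dtd[1] << 8)
    assert pixel_clock_10khz != 0, "first descriptor is not a timing"

    # Active pixel counts are split: the low 8 bits sit in one byte,
    # the high 4 bits are packed into a shared byte.
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

# Example with a dump read from a file:
# edid = open("tv_edid.bin", "rb").read()
# print(native_resolution(edid))  # e.g. (1360, 768)
```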
September 14, 2007 2:25:30 PM

Thanks... I've got it working now.
Yay!

Just one thing, though...
Even though I've set it to the native resolution, the taskbar and some of the icons are missing, and the screen looks a bit fuzzy across certain lines... is there anything I can do to improve this?

Also, since I'm using a DVI-to-HDMI connector, will I be able to receive sound through the TV?
Cheers
September 14, 2007 2:36:18 PM

Congrats on getting your TV working. Your TV is overscanning, which is normal when you are outputting HDTV; the cropped edges are where your taskbar and icons went. What I would try is sending it a resolution that exactly matches the TV's native resolution: the fuzzy look is because your TV is scan-converting from the incoming resolution to its native one.
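
To put numbers on the missing taskbar: consumer sets typically overscan a few percent per edge (5% below is an assumption for illustration; the real amount varies by model). A quick back-of-the-envelope in Python:

```python
def overscan_crop(width, height, per_edge=0.05):
    """Pixels of the input that fall off each edge when a TV
    overscans by `per_edge` (fraction of a dimension, per side)."""
    lost_x = round(width * per_edge)   # cropped from left and right
    lost_y = round(height * per_edge)  # cropped from top and bottom
    visible = (width - 2 * lost_x, height - 2 * lost_y)
    return lost_x, lost_y, visible

lost_x, lost_y, visible = overscan_crop(1360, 768)
print(f"lost per side: {lost_x} px horizontal, {lost_y} px vertical")
print(f"visible desktop region: {visible[0]}x{visible[1]}")
# lost per side: 68 px horizontal, 38 px vertical
# visible desktop region: 1224x692
# A standard ~30 px taskbar sits entirely inside that bottom
# 38 px band, so it is pushed off the panel and never drawn.
```

The fuzziness comes from the same operation: the surviving region is stretched back over the full panel at a non-integer scale, so fine detail like text gets resampled.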
September 14, 2007 2:39:02 PM

No sound is available from your video card; DVI doesn't carry audio.
September 14, 2007 4:54:25 PM

A friend of mine switched from an nVidia graphics card to an ATI card to resolve an overscanning issue, and it worked for him. Overscanning is a bug.
September 14, 2007 9:34:46 PM

When an HDTV monitor sees a signal it considers a true HDTV signal, it cannot tell the difference between output from a computer and an HDTV source such as an HDTV cable box or a Blu-ray player, so it overscans the image. For more information on overscanning, see http://en.wikipedia.org/wiki/Overscan. Overscanning by an HDTV monitor is normal and needed for an HDTV source, but not for a computer source, where you want to see every pixel. If you can resolve the issue with an ATI card, that's fine, but it is not a bug. Some monitors can be set to not overscan, but most consumer TVs can't.
September 14, 2007 10:03:14 PM

No, it's a bug. Overscan is only needed for analog sources and should never be applied when receiving digital sources, because digital signals are pixel-for-pixel while analog sources are not.

Overscan on an HDMI or DVI signal is a bug.
September 14, 2007 10:22:15 PM

TVs have overscan designed into them because the originating video might have come from an analog source. The other reason is that the first few lines of video carry information such as closed captioning, beginning on line 21; if those lines weren't overscanned, people would be complaining about ragged edges on some shows and about the moving white noise at the top of the image. Remember, we are talking about consumer HDTVs here, not computer monitors, which do not overscan.
If this were a computer monitor it would be a bug, but since it's not, it is normal: not a bug but a necessary feature.