I recently inherited an old LG Plasma (DU-42PX12X, to be precise) and I'm in the process of building an HTPC for it. In the meantime, I hooked up my main PC to the TV to get a feel for what it was capable of (as far as Win7 MCE is concerned).
I can't seem to wrap my head around how the TV (a 1080i set) and GPU interact, and thought maybe you all could help. The manual suggests setting the PC resolution to 1024x768 - my head already hurts, because this is a widescreen TV and that's a 4:3 resolution - I obviously don't want distortion. Furthermore, from what little I can comprehend of the manual, it looks like it says I should hook up both VGA and DVI simultaneously, if possible. The video card is an old EVGA GeForce 7600GT with two DVI outputs.
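Just to convince myself the distortion worry is real, here's a quick back-of-the-envelope check (pure ratio math, nothing specific to this TV assumed beyond it being 16:9):

```python
# Sanity check: how badly would 1024x768 (4:3) distort on a 16:9 panel
# if the TV stretches it to fill the full screen width?
def aspect(w, h):
    return w / h

tv = aspect(16, 9)        # widescreen panel
pc = aspect(1024, 768)    # the manual's recommended PC mode (4:3)
stretch = tv / pc         # horizontal stretch factor if scaled to full width

print(f"TV  16:9   = {tv:.3f}")
print(f"1024x768   = {pc:.3f}")
print(f"stretch    = {stretch:.3f}")  # ~1.33x wider: circles become ovals
```

So unless the TV letterboxes 4:3 sources with bars on the sides, everything would come out a third wider than it should be.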
So I guess I'm wondering about the cabling and the resolution, mostly. I've managed to get it to display some stuff on the TV, but it always looks pretty bad, and is cut off around all the edges. I'm also wondering if I even have the right DVI cable, as I understand there are several varieties (DVI-D is digital-only, DVI-A is analog-only, and DVI-I carries both).
The manual is here, and I'll try to transcribe some important parts that I just don't understand:
This TV provides Plug and Play capability, meaning that the PC adjusts automatically to the TV's settings.
The TV perceives 640x480, 60Hz as DTV 480p based on the PC graphic card. Change the screen scanning rate for the graphic card accordingly.
To get the best picture quality, adjust the PC graphics card to 1024x768, 60Hz.
Use the TV's RGB INPUT or DVI INPUT port for video connections, depending on your PC connector.
If the graphic card on the PC does not output analog and digital RGB simultaneously, connect only one of either RGB INPUT or DVI INPUT to display the PC on the TV.
If the graphic card on the PC does output analog and digital RGB simultaneously, set the TV to either RGB or DVI; (the other mode is set to Plug and Play automatically by the TV.)
I know this is noobish - I'm just beginning my home theater adventure, and if you'd rather recommend me to a place to read up and educate myself, that's fine...
Oh, shameless plug for my other thread, in case you have any ideas for a full 1080p GeForce 9300-based HTPC...
Well, I obviously didn't get too many bites on my first post, but maybe this follow-up will:
I discovered that the GPU resolution that worked best for the TV was 1280x720 over the TV's DVI port. Crystal clear, yet cut off on the edges. My TV manual said it has an "adjust" feature (to help with cut-off edges), but it only works on the RGB input. However, when I use the RGB input, my GPU will only let the resolution go up to 1024x768 (GRRR)! I can't even use a custom resolution!
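For anyone curious how much picture the cut-off edges are actually costing me, here's a rough estimate. The ~5% crop per edge is just a guess on my part - a common figure for CRT/plasma overscan, not something from this TV's manual:

```python
# Rough estimate of pixels lost to overscan on a 1280x720 signal,
# assuming a typical ~5% crop on each edge (guesswork, varies by set).
w, h = 1280, 720
crop = 0.05  # fraction trimmed from each edge

visible_w = int(w * (1 - 2 * crop))
visible_h = int(h * (1 - 2 * crop))
lost = w * h - visible_w * visible_h

print(f"visible area: {visible_w}x{visible_h}")
print(f"pixels lost:  {lost} of {w * h} ({lost / (w * h):.0%})")
```

If that guess is anywhere close, nearly a fifth of the frame is off-screen, which matches how much of the Windows taskbar I'm losing.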
Which brings me back to the original question: am I supposed to be using the DVI and VGA inputs simultaneously in some sort of "tandem" for the TV?
Also, I used a little 15" monitor to get the TV up and running...so the TV is set as screen 1 (DVI, 1280x720) and the little monitor is screen 2 (VGA, 1024x768). Of course, now I'd like to disconnect the monitor, but every time I do, the signal to my TV stops cold. So I have to have the monitor present, even though it's screen 2, just to run the TV? Ridiculous!
I have the latest nVidia drivers. Thanks for any suggestions...