I just bought a 55" LCD projection 720p television. I have connected the computer to the television through my BFG 7800GT video card using component cables. I was expecting a little better quality. The picture is very dark and the screen is pixelated. I have made sure that the resolution is at 720p, and adjusting the brightness and contrast just washes it out. The picture is great on the TV when it is not connected to the computer. Please help.
I would think you would get a half-decent picture on a 720p set, though nothing nearly as sharp as a 1080p set or a PC monitor. Not in Windows, at least; viewing videos should be excellent. I would expect some hard-to-read text, for example. How bad is the picture?
Perhaps the 7800 is not outputting properly. They can be a little flaky when driving a secondary monitor such as an HDTV. Are you using the TV as the primary monitor or secondary? I use my 1080i set as a secondary. Movies and videos are spectacular, but small text is unreadable. My nvidia card is quite flaky when driving the TV; sometimes I have to go through the dual-monitor setup wizard several times to get it to work.
You are using the nvidia control panel to set it up, right? I'd trust that over the Windows 'settings' menu.
You might try reinstalling the nvidia drivers, and download the latest version while you are at it. Hook up the TV and then try to get the nvidia control panel to detect it.
I use a 720p-class HDTV, but I don't use component or HDMI; my card (ATI X800 Pro) didn't like the TV-style feeds. Use the VGA port if the set has one. If it doesn't, consider getting another TV (I had to, just to get an image that actually used the screen's 1360x768 capabilities). If the card sees the display as a TV, it will try to fake a standard TV feed and you will get the image problems you describe, including a lower resolution.
Well, I am using the TV as the primary monitor and I am using the nvidia control panel. The TV only has HDMI and component for HD. I thought about using a DVI-to-HDMI cable, but I am reading on the net that it won't make a difference. I am able to read text just fine, I am typing this on the TV, but the picture is just dark and it looks like I am limited to 16-bit color or even lower, although the settings say 32-bit. Here is a list of my specs.
AMD 64 3200+
Corsair XMS 1GB
2 BFG 7800GT OC in SLI
Sound Blaster Audigy 2 ZS
Asus A8N-SLI Deluxe
200GB hard drive
Hitachi 55" LCD Projection 720p 16:9
I'm guessing that the TV has a setting in its menu that chooses between standard video and PC monitor; my Viewsonic 37" LCD had a setting like that, except it had weird names for those settings, but that's what they amounted to.
Edit: I've used DVI to HDMI cables and the picture was perfect, it may be worth giving one a try.
I am set at 1280x720. I reinstalled the drivers. When it boots up it is a perfect picture, even through the Windows "loading" screen. As soon as Windows starts, it is dark. Oh, I am running Windows Vista Ultimate.
The problem you are seeing with the image going dark is a black-level clamping problem. The TV clamps to a voltage and sets it as the reference black level. If the clamping is done in the wrong place, such as during active video, the TV will set that as black and make the whole image dark. The way to fix this is to change the timing of the video signal so that clamping is done at the correct position in the signal. Since I don't have the same video card as you, I can't tell you exactly what to change, but the first thing to try is the adjustment in your video card's control panel that moves the image left and right. Try both ways and see if that fixes the problem. It may leave the image off center, but that should be correctable in the TV itself. If you go with the DVI-to-HDMI adapter, the problem should resolve itself, because the TV will be able to send its EDID information to your computer and allow it to send the correct timings to your TV.
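If it helps, the darkening effect can be sketched in a few lines of Python. This is a toy model, not the actual TV circuitry; the voltages and sample positions are made up purely for illustration:

```python
# Toy sketch of black-level clamping (illustrative only, not real TV circuitry).
# The TV samples the incoming signal at one fixed point and treats that voltage
# as reference black; everything else is measured relative to it.

def clamp(signal, sample_index):
    black = signal[sample_index]                  # voltage sampled at the clamp point
    return [max(0.0, v - black) for v in signal]

# A scanline: 0.3 V of active video with a 0.0 V blanking interval in the middle.
scanline = [0.3] * 10 + [0.0] * 2 + [0.3] * 10

good = clamp(scanline, 11)  # clamped during blanking: black = 0.0, image unchanged
bad = clamp(scanline, 5)    # clamped during active video: black = 0.3, image goes dark
```

When the clamp lands in the blanking interval the image passes through unchanged, but when it lands on active video the whole scanline gets pulled down toward black, which is exactly the symptom described.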
HDMI is a digital signal. So is the video signal your video card produces, and the video signal your LCD RPTV displays.
Component video cables cannot transmit a digital signal, only analog. So the digital signal your video card produces has to be converted to analog, and when it reaches the TV it must be converted back to a digital signal for display.
This does not seem like the cause of your problem, but I do think it's advisable to upgrade your connection method.
Definitely try the DVI-D to HDMI connection. Because the component (analog) connection determines intensity based on voltage levels, it's possible for your card to put out levels that differ from those the TV expects. This won't happen with DVI-D / HDMI. Also, your card needs to add horizontal and vertical retrace intervals to the component output signal, and if those are slightly off, your TV might not interpret the signal properly. HDMI will not have this problem, because the graphics card tells the TV exactly what information it is going to get.
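As a hypothetical illustration of that level mismatch: one common convention is "video levels" (black at 16, white at 235) versus full-range "PC levels" (0 to 255). Whether this particular set uses those numbers is an assumption on my part, but the effect is easy to show:

```python
# Hypothetical level mismatch: the TV expects video levels (black = 16,
# white = 235) but the card sends full-range PC levels (0-255). Dark grays
# fall near or below the TV's black point, so shadows come out darker.

def tv_interpret(pc_value, black=16, white=235):
    # Map the TV's expected range onto display intensity 0.0 - 1.0
    return max(0.0, min(1.0, (pc_value - black) / (white - black)))

intended = 30 / 255        # how a dark gray of 30 should look
shown = tv_interpret(30)   # how the TV actually displays it (darker than intended)
```

Anything the PC sends below 16 is crushed to pure black, which would match the "very dark" picture described above.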
Thanks for the help. I just ordered a DVI-to-HDMI cable from Newegg. I am using the nvidia control panel but I can't seem to find the settings to change the timing. The user manual says it is under display settings, but it is not there. Thanks
After taking a look at the latest drivers for your video card, and given that you are running Vista, you most likely don't have the control over timings that you did in the Windows XP drivers.
I would go with the DVI to HDMI converter cable as your best bet for resolving your problem.
With the nvidia control panel there is no way to adjust timing values. However, there is a program called RivaTuner (http://www.guru3d.com/index.php?page=rivatuner) that should allow you to adjust timings. It is a powerful program and should be used with caution. Since you are already getting the adapter, that is the best way to go. There may be one problem with the HDMI adapter, and that is overscan. Your TV may decide that it now has a real HDTV signal and overscan the image. HDTV overscan is normal; however, it is not desirable on a computer source, as you will lose information at all edges.
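To put a rough number on what overscan costs you: the 5% figure below is just a typical overscan amount, not this particular Hitachi's spec, but the arithmetic shows how much of a 1280x720 desktop would be cropped away:

```python
# Rough overscan arithmetic. The ~5% figure is a typical amount cropped per
# dimension (split between the two edges), not this specific set's spec.

def overscan_visible(width, height, pct=0.05):
    # pct of each dimension is cropped away in total
    return int(width * (1 - pct)), int(height * (1 - pct))

print(overscan_visible(1280, 720))  # (1216, 684): 64 columns and 36 rows lost
```

That is enough to hide the taskbar edge or the first characters of each line, which is why overscan matters so much more for a PC desktop than for broadcast video.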