Your question

GTX460 OC + 32" LCD HD TV

Last response: in Graphics & Displays
April 13, 2011 9:33:12 AM

Hi guys :) 

I recently put a new GPU in my PC: a PNY GeForce GTX460 OC Edition, which has 1x mini-HDMI out and 2x DVI outs.
I'm using a 32" LCD HD TV as my monitor, which measures roughly 15.5 inches vertically and 27.5 inches horizontally.
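For what it's worth, those tape-measure numbers check out as a roughly 16:9, 32-inch panel (quick back-of-the-envelope calculation; the inputs are just my measurements above):

```python
import math

# Rough measurements from above: 27.5 in wide, 15.5 in tall
width_in, height_in = 27.5, 15.5

diagonal = math.hypot(width_in, height_in)  # ~31.6", consistent with a "32-inch" TV
aspect = width_in / height_in               # ~1.77, i.e. roughly 16:9

print(round(diagonal, 1), round(aspect, 2))
```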

When I use DVI - DVI the picture and color quality is great but literally NO resolution works, not even the native or recommended ones.

When I use miniHDMI - HDMI (an adaptor came with the GPU), the resolutions fit a lot better, but the color and picture quality is bad...

Any advice? Thanks :) 

Additional info:

OS - Windows XP PRO 64bit edition (please don't comment) :p 
Here's a link to the TV:

And here's a link to see what the display is with the miniHDMI - HDMI:

(Y) Much love


April 13, 2011 10:19:16 AM

Have you tried 1280 x 720?
April 13, 2011 1:59:27 PM

*There are TWO options when selecting video outputs on PC graphics cards:

1) PC mode: This is the normal PC video output, which allows you to change resolutions. On an HDTV this requires a special PC input, which is often only VGA, sometimes DVI and occasionally HDMI. This PC input passes into the same video scaling chip used for computer monitors, enabling 640x480, 800x600, etc., up to 1920x1080. (My older 32" Sony can only play video games using a VGA cable for video plus a 3.5mm audio cable. It works quite well, though. I could use a normal HDMI cable instead, but then there's no audio output for games.)

2) VIDEO mode: This is the normal video input for HDTVs. In this mode you don't get choices for different resolutions: you select one and stick to it. If your 32" television has a maximum resolution of 1920x1080, choose 1080p. If it has a maximum resolution of 1366x768, choose 720p if you intend to play video games; otherwise choose 1080p. (720p is 1280x720; since the panel is actually 1366x768, about 13% more pixels, you'll get a SMALL increase in quality by choosing 1080p. However, running games at 1080p requires much more processing, only for the image to be scaled back down again. So it's better to enable higher quality at 720p than to use lower quality at 1080p.)
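A quick sketch of the arithmetic behind that ~13% figure (pixel counts only; the resolutions are the standard ones discussed above):

```python
# Pixel counts for the standard resolutions discussed above
panel_1366 = 1366 * 768    # typical "720p-class" HDTV panel: 1,049,088 pixels
signal_720 = 1280 * 720    # a 720p signal:                     921,600 pixels
signal_1080 = 1920 * 1080  # a 1080p signal:                  2,073,600 pixels

# The panel has ~13.8% more pixels than a 720p signal -- the "13%" above
extra = panel_1366 / signal_720 - 1
print(f"{extra:.1%}")  # -> 13.8%

# But rendering a game at 1080p pushes 2.25x the pixels of 720p,
# only for the TV to scale the image back down to 1366x768
print(signal_1080 / signal_720)  # -> 2.25
```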

In "VIDEO" mode you will need to use your NVIDIA control panel to ensure your screen fits properly to avoid blurring.

Again, once you have chosen your resolution and have scaled things properly, you will never change the resolution. Games will offer ONLY that one resolution. If you wanted to use 1280x720 instead of 1920x1080 in a game, you would need to change it from the NVIDIA Control Panel, as the game ONLY supports the single resolution chosen. Typically you'd never choose anything other than 720p or 1080p (sometimes 1080i for older HDTVs with no 1080p option).

My dad's laptop has an NVIDIA graphics chip and I hooked it up to his HDTV via an HDMI cable. I went into the proper video setup section (not the normal monitor display section), chose 1080p60 (NTSC) and then adjusted the scaling to fit properly (not sure what it's called, there were coloured lines to indicate when the screen fit properly especially at the corners).

*I noticed in my AMD setup that it had the option to add the video settings such as "1080p60 (NTSC)" to the Display Manager which normally only shows PC-Video choices for monitors.

**If your HDTV has a PC-Input that supports your screen's full resolution you may wish to use that. Unless you play video games it's best to use a normal HDMI cable and set it to 1080p. Most video games now are widescreen anyway so 1080p is still a good option.

***AUDIO: Laptops tend to support HDMI audio as they are designed for it. I think most desktop graphics cards with HDMI output only support a limited number of audio codecs such as Dolby Digital (AC3). It's my understanding that gaming sounds and normal Windows sounds are not supported. If you want audio for games you may need to run the audio output (analog or digital) from your onboard audio or sound card into a RECEIVER, which sends audio to external speakers. (The exception, as mentioned, is to use the PC video and audio input if available; DVI + 3.5mm audio is common now.)
April 13, 2011 4:24:47 PM

Cheers for the replies guys!

I'll give them both a go and report back