Setting up my HDTV as computer screen

FrenchAffair

Distinguished
Apr 29, 2008
24
0
18,510
I'm having some issues setting up my 32in 1080p Samsung HDTV as my computer screen. I've got it all plugged in and it works, but I've been playing around with the settings for a week now and it still won't look right. Text is still showing up a little blurry, and some things aren't fitting on the screen, etc.

Are there any guides on how to properly set up an HDTV as a computer screen?
 

Crashman

Polypheme
Former Staff


It's a scaling issue. I had to set my TV to "Unscaled" but I'm not sure if YOUR TV has that option.
 

IzzyCraft

Distinguished
Nov 20, 2008
1,438
0
19,290
If you are using a composite setup, you want to put it on the "unscaled/just scan/overscan" setting.

If you are using RGB/DVI/HDMI and you don't get one of those settings on the TV, you want to match the output signal to the one the TV is trying to use (1920x1080 or 1920x1088 as the screen size); just mess around in your TV settings. Sometimes whether you output 59Hz or 60Hz can matter too, so just mess around to get the best picture. :)
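
If you want a quick way to sanity-check what the PC side is actually sending before digging through TV menus, here's a rough sketch (Windows only, via ctypes; the 1920x1080 "native" value is just an assumption for a 1080p set):

Code:
# Rough sketch, Windows only: check what the desktop is actually set to before
# blaming the TV's scaler. Assumption: a 1080p panel, so native = 1920x1080.
import ctypes

SM_CXSCREEN = 0  # primary display width, in pixels
SM_CYSCREEN = 1  # primary display height, in pixels

user32 = ctypes.windll.user32
width = user32.GetSystemMetrics(SM_CXSCREEN)
height = user32.GetSystemMetrics(SM_CYSCREEN)

native = (1920, 1080)  # assumption: what the TV panel really is
if (width, height) == native:
    print(f"Desktop already matches the panel: {width}x{height}")
else:
    print(f"Desktop is {width}x{height}; set the graphics driver to "
          f"{native[0]}x{native[1]} so the TV gets a signal it doesn't have to rescale.")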
 

Crashman

Polypheme
Former Staff


Right, I should be more specific.

OK, there's a problem with HDTVs: They're designed WRONG. ALL of them.

The problem was underscan/overscan on ANALOG signals for old CRT TVs. Some idiot decided to incorporate overscan into HDTVs rather than address the problem through scaling correction, even though digital signals are capable of doing 1:1 pixel-by-pixel links. That is, an HDTV could have been made like a computer screen, but it wasn't, because somebody decided to mix analog's problems with digital's solutions.

End result: A 720p TV is really something like 13xx x 768 rather than 1280x720. For some graphics cards, if you set the native resolution of the TV, it doesn't work due to overscan correction that shouldn't even be there, and you must instead set 1280x720 and force the graphics card to overscan.

Now, if you have something that does work natively, such as 1920x1080 on a true 1920x1080 screen, there's STILL an overscan issue, so you still need to adjust that down to zero in the graphics driver. But because the TV is trying to use something other than "real" 1080p (say, 1900x1000 with overscan), it screws up things like text.

And that's where setting the TV to NO SCALING comes into play.

And everything I just said applies only to HDMI or DVI. Things get even more screwed up when using an analog connection.
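
To put rough numbers on why that blurs text, here's a throwaway sketch (the 5% overscan figure is just an assumption; real sets vary):

Code:
# Throwaway arithmetic: what ~5% overscan does to a 1080p signal.
# The 5% figure is an assumption; real sets vary.
signal_w, signal_h = 1920, 1080   # what the PC sends
overscan = 0.05                   # fraction of each axis pushed off-screen

visible_w = round(signal_w * (1 - overscan))   # ~1824
visible_h = round(signal_h * (1 - overscan))   # ~1026

print(f"The panel only shows about {visible_w}x{visible_h} of the "
      f"{signal_w}x{signal_h} signal, then stretches that back out to fill "
      f"all {signal_w}x{signal_h} physical pixels.")
print(f"Each source pixel gets smeared across roughly {signal_w / visible_w:.2f} "
      "panel pixels, which is why text stays blurry until overscan is zeroed "
      "out or the TV is set to no scaling / 1:1.")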
 
Quick answer:
Use the PC input if possible.

Long answer:
HDTVs usually have a PC input or a video input like HDMI. Some recent TVs have a PC-HDMI input, which means the screen is being used as a monitor.

HDMI is normally a VIDEO signal, which means you have to set your computer to ONLY a signal of 480p/i, 720p, or 1080p/i. This is not what you usually want unless you use it only for movies. I plug my dad's laptop into his HDMI input and have the screen set to 1920x1080. Once I corrected for overscan, everything's fine.
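
As a quick illustration of what "only 480p/720p/1080p" means in practice, here's a toy check (the mode list is just the common timings, not tied to any particular TV):

Code:
# Toy check: is the desktop resolution one of the standard video modes a
# TV-style HDMI input will take? Just the common timings; your set may differ.
VIDEO_MODES = {(720, 480), (1280, 720), (1920, 1080)}

def hdmi_video_friendly(width, height):
    return (width, height) in VIDEO_MODES

print(hdmi_video_friendly(1920, 1080))  # True: the TV can take this directly
print(hdmi_video_friendly(1680, 1050))  # False: expect scaling, cropping, or no picture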

Many 1080p HDTVs have a VGA-PC input that only goes up to 1366x768.

Read your TV manual to see what resolution you can get up to on the PC-VGA input, and use it (plus a 3.5mm audio cable) if possible.

HDMI from a computer often means some tricks to get the audio working too, which ranges from not possible to a pain.
 
Aside from the settings in the GPU driver utility, you can also make adjustments to the Windows desktop, for example: icon size and spacing, the size of the desktop text, and text shadowing. All of these settings are found in Display Properties under the Desktop, Appearance, and Settings tabs.

Also, it would help to list your system specs and what video card you are using as well as the driver version. Just a thought...
 


Homeboy2

Distinguished
Mar 21, 2006
736
0
18,990


Somebody forgot to tell the TV, 'cause I just hooked it up and it works. Didn't touch anything. Maybe it has a genius I.Q.
 

Crashman

Polypheme
Former Staff


So what you're saying is that you either have 1) a TV that detects it's connected to a computer rather than an external tuner, or 2) a computer that detects it's connected to a TV rather than a monitor and automatically uses the worse settings that the TV expects. Right?
 

Homeboy2

Distinguished
Mar 21, 2006
736
0
18,990


Nope, reread my post. I said I hooked up my TV to use as a monitor, made no adjustments, and everything works fine.
 

Homeboy2

Distinguished
Mar 21, 2006
736
0
18,990


You've never seen God either, doesn't mean he isn't there :whistle:
 

Homeboy2

Distinguished
Mar 21, 2006
736
0
18,990
Hey, Crashy, I do have a problem with which, with your infinite wisdom, I'm sure you can help. When I try to install the latest drivers for an 8800 GTX, it says I'm using a 32-bit uninstaller on a 64-bit system. A common problem, I understand. What's the easiest fix?
 

Crashman

Polypheme
Former Staff


Use the 64-bit driver if you're installing on a 64-bit OS? I've never seen that message, and I've installed a great many drivers!
 
