Using TV as my display - what do I need to know?

Palidion

Honorable
Jun 19, 2013
86
0
10,630
So I'm planning to use a 40" TV with my gaming PC down the road. I've heard that mixing certain TVs with certain computer components can cause big problems. Basically, I want to know what to avoid and what I'll need to do to still get good picture quality and frame rates on a TV rather than a monitor. Is it even worth bothering with a TV for computer gaming?
 

Neospiral

Honorable
Jun 28, 2013
383
0
10,960
There aren't any big problems connecting computers to TVs, unless you're using really old stuff.

These days as long as your computer has HDMI out and your TV has HDMI in, it's that simple. What TV and graphics card are you planning to use?
 

Palidion

Honorable
Jun 19, 2013
86
0
10,630

A 1080p 120Hz Samsung TV (it'll be brand new, from this year) and a GTX 770. I just don't want to drop $1,000 on a TV and a GPU and fry something because they aren't meant to be mixed for gaming. Also, I read that VRAM matters more when you're running an array of monitors as your display. Is VRAM also important for bigger screens, or not?
 

Neospiral

Honorable
Jun 28, 2013
383
0
10,960


More memory on your GPU is only useful if you're running at high resolutions or with multiple screens. For a single 1920x1080 screen, it's not a concern. That TV and that graphics card will work fine together. Whoever told you that you can damage components by connecting them to the wrong TV is ignorant. It might have been possible to damage a TV about ten years ago, when CRTs were still around and fairly popular, but not now. The worst that will happen is that it just won't work.
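
As a rough illustration of why it's the resolution, not the screen size, that drives memory use, here's some back-of-the-envelope math (my own sketch, not from this thread; real VRAM usage is dominated by textures and anti-aliasing, so this only shows the trend):

```python
# Approximate framebuffer memory for a triple-buffered, 32-bit (4 bytes/pixel) display.
# Illustrative only - games use far more VRAM for textures, AA buffers, etc.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(framebuffer_mb(1920, 1080))      # single 1080p screen:    ~24 MB
print(framebuffer_mb(1920 * 3, 1080))  # triple 1080p surround:  ~71 MB
print(framebuffer_mb(2560, 1600))      # one 2560x1600 monitor:  ~47 MB
```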

One thing to note, though: 120Hz and 240Hz TVs don't accept any refresh rate higher than 60Hz. The 120Hz figure is an internal panel number that enables the 'smooth motion' effects you see on those sets, but the video processor still only accepts a 60Hz input.
 
Solution
120Hz TVs don't actually support 120Hz input the way 120Hz monitors do - a 120Hz signal for a higher effective frame rate, or for 3D with shutter glasses, requires dual-link DVI or DisplayPort, which TVs usually don't have.

The physical size of the monitor or TV makes no difference to the graphics card; only the resolution does.
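
If you want to verify the 60Hz limit on your own setup, here's a rough sketch (my addition, assuming Windows and the pywin32 package) that lists the display modes Windows actually offers for the primary display - a TV that only accepts 60Hz input won't show a 1920x1080 @ 120Hz entry:

```python
# List the display modes Windows exposes for the primary display device.
# Requires pywin32 (pip install pywin32); run with the TV connected and set as primary.
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display device
    except win32api.error:
        break
    if dm is None:  # guard in case the end of the list is signalled by an empty return
        break
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w}x{h} @ {hz}Hz")
```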
 

OcelotRex

Honorable
Mar 4, 2013
190
0
10,760
I use my 42" Toshiba and my 50" samsung both as monitors for my 2 HTPCs. Each one has a video card with HDMI out. That GTX 770 should be compliant with the most recent HDMI standard so you should be fine to use it without damaging either the computer or the TV.

That being said, the one issue I have with HDTVs as monitors is getting the scaling right. Windows 7 and 8 offer scaling of 100, 125, and 150% in the display properties. At the distance I sit from my computer, I prefer 150% scaling, since it makes everything bigger from 12 ft away.
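
For reference, Windows reports that scaling as a DPI value (96 DPI = 100%, 120 = 125%, 144 = 150%). Here's a small, Windows-only ctypes sketch (my addition, not from this thread) that reads the effective DPI:

```python
# Read the current Windows DPI setting and convert it to a scaling percentage.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

user32.SetProcessDPIAware()            # opt out of DPI virtualization so we see the real value
hdc = user32.GetDC(0)                  # device context for the whole screen
LOGPIXELSX = 88                        # GetDeviceCaps index for horizontal DPI
dpi = gdi32.GetDeviceCaps(hdc, LOGPIXELSX)
user32.ReleaseDC(0, hdc)

print(f"{dpi} DPI -> {dpi / 96:.0%} scaling")  # e.g. 144 DPI -> 150% scaling
```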

There are some programs that act finicky at that scaled setting, though. Steam's "Big Picture Mode" does not hide the taskbar, and full-screen videos on YouTube have the same problem. IE works perfectly and is my go-to browser for YouTube and HBOGO on the scaled computers. Windows 8 is nice for an HTPC thanks to native apps like Netflix that work great on a big screen.

There should be no problems gaming at 1080p with that video card. You can always pick up an Xbox 360 wireless controller if you don't want to use a keyboard and mouse.
 

Palidion

Honorable
Jun 19, 2013
86
0
10,630
Thanks for the info and reassurance. I've always liked the power and the game versatility/cheapness of PC, but preferred the comfort of sitting back on a couch and feeling more relaxed while playing games. I've been using a console for gaming for so long that it's what I'm used to. Anyways, thanks again for the help :)
 

V@no

Distinguished
Feb 26, 2011
18
0
18,510
In my experience, the HDMI cable can also play an important role. I got a cheap one and the picture on the TV was overscanned, meaning it didn't fit on the screen and the edges all around were out of frame, losing about 40 pixels on each side. I changed the cable and voila, everything was fine.
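
To put those 40 lost pixels per edge in perspective, a quick bit of arithmetic (assuming a symmetric crop on a 1080p panel):

```python
# How much of a 1920x1080 picture is cropped if ~40 pixels are lost on every edge.
full_w, full_h, crop = 1920, 1080, 40

vis_w, vis_h = full_w - 2 * crop, full_h - 2 * crop          # 1840 x 1000 visible
lost = 1 - (vis_w * vis_h) / (full_w * full_h)

print(f"visible area: {vis_w}x{vis_h}, about {lost:.0%} of the picture cropped")  # ~11%
```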

Another issue I have is that I connect my PC to a home theater receiver for 5.1 sound (the TV is plugged into it too), and the problem with this setup is that I have to turn on the receiver before I turn on the PC; otherwise the PC won't recognize it and has to be restarted. But I'm pretty sure that's a receiver issue - connecting directly to the TV works fine.

As for the pixel size issue mentioned above: unless you're using it as a desktop monitor and sitting 4' away from it, it's crystal clear and perfectly readable from a couch.
 

OcelotRex

Honorable
Mar 4, 2013
190
0
10,760


I wanted to add that you can buy affordable HDMI cables; just make sure they are HDMI 1.4 compliant, like the video card you've chosen. If the new TV, video card, and cable are all 1.4 compliant, you shouldn't have any overscan or underscan issues that can't be solved with the driver software.
 

Neospiral

Honorable
Jun 28, 2013
383
0
10,960
Usually overscan is a setting on the TV, and has nothing to do with the HDMI cable or the GPU drivers. And "cable quality" doesn't affect the picture that comes through an HDMI cable at all, provided the cable conforms to the latest revision, HDMI 1.4a, as OcelotRex said. When the guy at the store tries to tell you that the $50 cable is better than the $15 cable, and they're both 1.4a, laugh at him and buy the $15 one.

There are some drivers which include a "TV mode" of some kind which will attempt to correct picture scaling if it knows you've connected the card to an HDTV, but for all intents and purposes, since the advent of 1920x1080 TVs with all digital inputs, these modes are completely unnecessary and have been removed from current nVidia and AMD drivers as far as I know.

Just connect it to the TV like you would any 1920x1080 monitor, and make sure any overscan settings on the TV are turned off or set to native.
 

OcelotRex

Honorable
Mar 4, 2013
190
0
10,760


Not to argue, but I would like to share my personal experience:

I have an AMD card hooked up to a 2012 model Samsung TV through HDMI. When I first connected it at the supported 720p resolution (per the manual), Catalyst Control Center set the scan setting to 115%. It also resets to that default whenever I update the drivers. A quick trip into CCC to move the slider down to 0% overscan/underscan fixes the issue.

My other HTPC is hooked up to my 2008 model Toshiba LCD TV, also through HDMI. I'm using the integrated Intel HD 4000 graphics on my i5-3570K, and it has never had overscan or underscan issues at 720p or 1080p.

Maybe this is just an issue with the legacy Catalyst drivers, my PC, or my TV, but the problem is always fixed through the AMD software, not on the TV.