Using a Samsung LED TV (1080P) as my PC monitor, 1920x1080 is very fuzzy, harsh, and ugly.

liljon0430

Distinguished
Sep 3, 2011
5
0
18,510
I've been using 1680x1050 on this 46 in. Samsung LED TV for about a year now. I'm using a little HDMI-to-mini-HDMI adapter that goes into my graphics card's mini-HDMI port.
TV: UN46D6050
Graphics Card: (NVIDIA) MSI GTX 560 Ti
OS: Win7 64-bit Enterprise
When I switch to 1920x1080 (remember, this is a 1080p TV), it looks much worse than 1680x1050: the text is kind of fuzzy, and everything seems too bright and just not right.
Does anyone know why? Besides the fact that TVs aren't the most reliable things to use as monitors.
Thanks.
 
It could be the overscan settings or something else. I'm not too familiar with the NVIDIA control panel compared to the AMD one, but when I finally got a 1080p monitor and an HDMI cable, it looked small and crappy through my AMD card until I went through all the settings and got it perfect.
 
It might be the refresh rate too. My old 42 in. 1080p TV/monitor was like that; it needed something like a 28 Hz refresh rate to look good, so I just ran it at 1280x1024 and it looked fine. I guess the bottom line is that many things could be making it look bad.
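
If you want to see which modes and refresh rates Windows actually thinks the TV supports, here's a minimal Win32 sketch (my own illustration, not from this thread; it assumes the TV is the primary display):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm = { .dmSize = sizeof(DEVMODE) };

    /* Walk every display mode the driver reports for the primary display. */
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); i++)
        printf("%lux%lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    return 0;
}
```

If the native resolution only shows up at an odd refresh rate here, that's a hint the TV's EDID (or the cable/adapter) is the problem rather than the video card.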
 

skongipaus

Honorable
Sep 24, 2013
2
0
10,520


Hi,

How did you solve this? I'm having the same issue with my Samsung 55'' LED TV.
 

determinologyz

Honorable
Sep 21, 2012
1,436
0
11,460


Back then I tried to game on a 40" 1080p TV as a PC monitor, and that was a mistake. It's best to get around a 23"-24" screen to get the best bang out of the resolution, and once 4K becomes the new standard, 32" will be the sweet spot. Anything bigger and the PPI drops and things look kind of quirky. People will say that sitting farther back makes a 40" look better, but everything looks big and blocky; it's hard to explain, but have a look at this video: https://www.youtube.com/watch?v=7vt0CJcUi4w
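
For reference, pixel density is just the diagonal pixel count divided by the diagonal screen size in inches. A quick sketch comparing the sizes mentioned here (my numbers, not from the post):

```c
#include <math.h>
#include <stdio.h>

/* Pixels per inch: diagonal resolution over diagonal screen size. */
static double ppi(int w, int h, double inches)
{
    return sqrt((double)w * w + (double)h * h) / inches;
}

int main(void)
{
    printf("24\" 1080p: %.0f PPI\n", ppi(1920, 1080, 24.0)); /* ~92  */
    printf("40\" 1080p: %.0f PPI\n", ppi(1920, 1080, 40.0)); /* ~55  */
    printf("32\" 4K:    %.0f PPI\n", ppi(3840, 2160, 32.0)); /* ~138 */
    return 0;
}
```

At 40 in., 1080p works out to only about 55 PPI, which is why desktop text looks blocky at normal monitor viewing distance.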
 

prs-vox-guy

Distinguished
Sep 22, 2013
13
0
18,510
So is there a general consensus on which cards should be used to optimize HDTV performance?

I've been asking for weeks. I have a new build coming up and want to focus on getting the proper GPU to run on my Sharp 52 in. 1080p TV.
 

skongipaus

Honorable
Sep 24, 2013
2
0
10,520


*UPDATE*
In my case the solution was really quite simple. All I had to do was change the name of the HDMI input I was using to "PC". I guess this enabled "PC mode" on the TV. No problems with the image running at 1920x1080.

On Samsung UE55F8005:
Source button - Highlight the relevant HDMI input - push the up button - Tools - Change name of input. There, I could choose PC from a long list of predefined names.

 
Solution

Reece251

Distinguished
Oct 10, 2013
4
0
18,510
Excellent! That last post fixed the issue I've been trying to solve for ages, thank you. Everything looked so crappy, and after going through every setting on my PC and TV I found nothing until this answer. It's really weird how changing the name of a source can affect its quality.
 

bluscrn

Honorable
Oct 18, 2013
1
0
10,510
+1 for the above solution.

Changing the name of the HDMI input to 'PC' also fixed it for me (on a little 22" Samsung)

I'd never have figured that out... had been messing with all the other settings, too - and while I'd managed to improve it a bit, it was still fairly fuzzy.
 

Jake63

Honorable
May 11, 2014
2
0
10,510


you're a lifesaver, thank you!
 

justintaker

Reputable
Jul 21, 2014
1
0
4,510
I have the same problem too, but I don't understand what exactly you did.
Please explain in more detail: how do you change the name of the HDMI input?
Where can I find it? I'm a beginner.
 

liljon0430

Distinguished
Sep 3, 2011
5
0
18,510


So first of all, make sure you are on the FIRST HDMI port (HDMI1) if you are going to use it with your computer.
I was on HDMI2 and it was still blurry.
After that, go to the HDMI input:
press the Input button to see all of the inputs, make sure the HDMI input is highlighted, press Tools, then Edit Name, then select PC.
 

aL_Quarter

Reputable
Oct 1, 2014
18
0
4,510
I did what all of you guys did (changing the name to PC), but it still looks the same; there's no big difference that I can see, and it's not perfectly clear.
I'm using a 32 in. Samsung 1080p TV, and I'm on HDMI1.
 

HappyUserr

Reputable
Oct 21, 2014
1
0
4,510



I just came here to say I love you, bro.
 

Wind256

Reputable
Nov 9, 2014
1
0
4,510
Thanks for the tip! It's really weird that they've hidden the feature behind a name... I can no longer call the Xbox One's input "VCR" because everything is now "PC". ;)
This seems to make the picture exact to the signal, and it now looks like it does on computer LCDs and laptops: all pixels, red ones included, are totally sharp and in place, and the saturation doesn't seem too high or too low (it can't even be adjusted). The Game Mode setting is grayed out, and input lag actually feels even more nonexistent than it did with Game Mode on under other source names. I have no way to measure that, though, but it's nice anyway.
 
Jan 22, 2015
1
0
4,510


You are a hero.
 

BobCharlie

Distinguished
Sep 2, 2011
221
1
18,710
I know this is old, but I figured I'd weigh in, as years ago I went through the same thing after upgrading from DVI-D to full HDMI and was stumped initially. You guys are actually making a mistake and might as well just continue using the DVI-D connector with the way you are running it. I'll explain.

First off, if you run your "LCD TV" as a "PC Monitor" input, like how it's run through the DVI-D connector, the PC actually treats the TV like it's a monitor and sends a signal for the TV to act accordingly. This is actually bad, as it leaves image quality on the floor and severely LIMITS your TV enhancement settings. I'm running a 47" Vizio, for example, that's a few years old and has a crazy high contrast ratio, but it goes into a different "state" if it's getting the "act like a PC monitor" signal instead of the "act like you're displaying a Blu-ray player" signal. You guys are forcing the "act like a PC monitor" mode by changing the input name back to "PC".

For instance, it always bothered me that I couldn't adjust Sharpness, Color, Smooth Motion, Noise Reduction, MPEG NR, Advanced Adaptive Luma (changes black contrast to reveal better detail in dark scenes, with varying degrees of choices), etc. IF the TV was acting like a "PC monitor" via the DVI-D connection. It only had a couple of really basic adjustments via the remote IF the TV was acting like a PC monitor; it had a really weak sharpness adjustment that didn't do much and, I think, a very basic color offset adjustment. THIS is why you want to run HDMI instead of DVI-D (unless you have a really old video card without HDMI): it lets you make the same adjustments to the TV while running the PC as you could IF you were watching a TV station or a Blu-ray movie. You are basically fooling the TV into thinking it's displaying a Blu-ray player signal or similar. A DVI-D signal vs. HDMI won't be any different quality-wise IF you just use the TV as a "PC monitor", i.e. the TV is getting the "act like a PC monitor" signal. Hope I drilled it in deep at this point with bountiful redundancy, like running an HDMI cable to replace a DVI-D cable only to run the TV as a PC monitor once more.

In my situation, I really wanted the sharpness adjustment from the "TV" settings, as well as the smooth motion option, darkness settings, and color adjustments, which are ABSENT if you run the TV set as a "PC monitor", and I'm guessing the same applies to your sets too, given that they behaved differently just like mine did.

1. What you need to do first in the NVIDIA settings is manually set the TV to its native resolution. OR, if you have the DSR option, you can change it to (I'll use my native resolution as the example) the 1920x1080 "1080p" setting @ 59 Hz under the "UHD" section that gets created with DSR-enabled drivers; it will appear above the "Native" 1920x1080 @ 60 Hz "PC" option (just scroll up). In the NVIDIA settings, the native PC option @ 60 Hz is NOT considered "1080p", but the TV's own indicator thinks it's 1080p. If you want it to truly be progressive, select the "UHD" setting instead of the "Native" setting.


Now, for the VERY important step all of you had missed after upgrading to the HDMI cable. Go into the Windows Control Panel. Select the "Appearance and Personalization" header (that's what it's called in Win7; I don't know what later Windows versions show, but find the heading with "Display" or similar). Look for "Display" or whatever has the "Resolution" setting and the option to "Make text and other items larger or smaller".

2. Manually make SURE Windows' resolution slider is set to the native resolution and matches the NVIDIA setting, then click Apply! (If yours is like mine, it won't say 1080p, just the actual 1920x1080, which is the max resolution for this set.) If you'd rather script this step, see the sketch after these steps.

3. Go to the "Make text and other items larger or smaller" settings. If text and folders are huge and the "Larger - 150%" circle is checked, set it to "Medium - 125%". If it's still too big, set it to "Smaller - 100%". Conversely, if text and folders are too small, set it to Medium and see if that's large enough, or else go larger. The larger your monitor and the higher the resolution, the larger you'll probably want to set this.

4. Congrats!! You can finally CORRECTLY use an actual HDMI input (name it anything you like other than "PC"), and you can now increase sharpness like you would for a TV program; any built-in options that were exclusive to the TV/Blu-ray/PS4/Xbox/etc. settings can now be used with the PC and will look PROPER!!
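
For step 2, if you'd rather set the mode programmatically than trust the slider, here's a minimal Win32 sketch (my own illustration, not from the post; it assumes the TV is the primary display and that 1920x1080 @ 60 Hz is its native mode):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm = { .dmSize = sizeof(DEVMODE) };
    dm.dmPelsWidth        = 1920; /* assumed native width  */
    dm.dmPelsHeight       = 1080; /* assumed native height */
    dm.dmDisplayFrequency = 60;   /* target refresh rate   */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* NULL device name = primary display; pass CDS_TEST as the flags
       argument to validate the mode without actually applying it. */
    if (ChangeDisplaySettingsEx(NULL, &dm, NULL, 0, NULL) == DISP_CHANGE_SUCCESSFUL)
        puts("Mode applied.");
    else
        puts("Mode change failed; check the values against step 1.");
    return 0;
}
```

Note that the mode has to be one the driver already exposes; this only selects among existing modes and won't add new ones.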

I've found that having the sharpness slider from the TV settings available for PC games is like having an extra post-processing sharpness adjustment that's much stronger than when the TV was using its "PC mode" slider. I can also enhance dark shadow detail, smooth motion works, and I think the backlight is fully adjustable whereas it wasn't before (it's been a while, I can't remember exactly), and I have full color control settings plus the noise reduction. Having the TV's sharpness slider unlocked is, in and of itself, more than worth the 5-minute hassle of changing the Windows settings. Game detail can now get super crisp, and changing in-game AA or forcing it in the NVIDIA panel WILL work in conjunction with the TV's sharpness. Hell, even my video noise reduction in the TV's settings WORKS while playing a game. It smooths out overly sharp detail and removes jaggies if present, depending on which noise setting I adjust, but I prefer leaving it off for the crispest image.

So those of you still running your TV as a "PC Monitor" via HDMI are definitely missing out on better picture quality IF you are running it under the PC option and the TV restricts the better settings.

Word of caution: if you select DSR for your desktop, say forcing 4K to be down-sampled to 1920x1080 by selecting the 4K option in the NVIDIA resolution settings, it WILL make everything (text, folders) extremely tiny. Setting the "Larger - 150%" option, for example, will help slightly, but everything will be "off"; the arrow pointer will get huge and be a millimeter or two off from where you are actually pointing. This is a bug with the DSR option and Windows.

Anyhow, hope this info helps you get a better image; don't forget to increase sharpness via the TV settings, along with any other TV menu settings that you now have at your fingertips! When I was reading how you guys "fixed" it by switching back to a "PC"-labeled input, I was cringing, as you effectively bypassed the entire benefit of running the HDMI cable in the first place, making it basically pointless. I personally couldn't tell a difference in image quality going from DVI-D (TV input) to DVI-D (video card output), vs. HDMI (TV input) to a DVI-D adapter (video card output), vs. HDMI (TV input) to mini-HDMI (video card output) with my older cards. However, with HDMI to mini-HDMI on my old GTX 550 Ti, and now HDMI to HDMI on my new GTX 970, there's a HUGE difference visually in being able to run all the TV-based settings with PC games, movies, the 2D desktop, etc. vs. the old bare-bones PC options.
 

slickbutter

Reputable
Jul 14, 2015
1
0
4,510
This is just in case anyone stumbles upon this thread with a similar problem; I've had the blurry-PC-through-HDMI problem for years now, on and off of course. If your TV doesn't have useful enough firmware/menus to fix this problem with, you need a piece of software called CRU, or Custom Resolution Utility.

You open CRU, select your display, and then under "Detailed resolutions" select 1920x1080 or whatever the actual native resolution of the TV is. Click "Edit..." and make sure the refresh rate is actually at a perfect 60 Hz. Press OK, and in the main window go down to the drop-down menu at the bottom that has the extension block options and select "No extension block." Click OK again, then run either restart.exe or restart64.exe (depending on whether you have a 32- or 64-bit OS), and click Exit in the little window that should pop up when it finishes restarting your display driver.

Now you can finally go to your Control Panel, select 1920x1080, and it should work. If it ever displays incorrectly after turning the TV back on, just select a different resolution, apply it, then switch back to 1920x1080.
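
After the driver restart, you can double-check what mode actually stuck; a tiny sketch (my own, assuming the TV is the primary display) that reads the current settings:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm = { .dmSize = sizeof(DEVMODE) };

    /* ENUM_CURRENT_SETTINGS returns the mode in use right now. */
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        printf("Running %lux%lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    return 0;
}
```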
 

sairac

Prominent
Nov 2, 2017
1
0
510
Thank you very much, skongipaus. This solved my issue. I've been using my Samsung LED Full HD 46" (non-smart) TV as a monitor for some years now... and I had never set the HDMI input's name to PC through the remote control's Source button... I also just created an account on Tom's only to say THANK YOU. Best, easiest solution... I had already reinstalled the NVIDIA drivers twice and was about to ditch Windows 10... I'll just enjoy it now. Thank you, man!