HDTV LCD 32" as PC Monitor - Fuzzy Text

Tags:
  • Tuner Cards
  • HDTV
  • LCD
  • Monitors
  • Graphics
  • Product
June 21, 2009 11:27:47 PM

Hi, I recently switched from a 22" Hi-Def PC monitor to a 32" LCD HDTV as my PC monitor. On the desktop, any resolution above 1600x1024 seems to make text "fuzzy" or hard to read, even though it very much sharpens all pictures/video etc. The Nvidia Control Panel supports up to 1920x1080, but at that setting the text is almost unreadable in most cases. This only applies to the desktop; games such as Left 4 Dead and Far Cry 2 look fine at any resolution, although I only play at 1920x1080. Not sure if there is any way around this or if the high pixel count is just shrinking the text too much.

Additional info:

GPU: nVidia GTX 285

Cable used is a DVI to HDMI

All HD Formats supported with the setup

Windows Vista Ultimate


June 22, 2009 12:38:38 AM

The way video and text are displayed in Windows is different. Monitors are made to work with computer input; that's why you can get a 40" TV with speakers for the same price as a 30" monitor. I think it's the pixel pitch or something.

TVs are made to show video, so once you have video running, of course it'll look fine. Text inside video will also show fine, but plain text might be an issue.
June 22, 2009 12:46:35 AM

I have a 1GB Sapphire and a 1GB XFX ATI 4850 set up in Crossfire using ATI's Catalyst Control Center (version 9.6). Like you, I'm able to play games like Crysis at "high" settings at a resolution of 1920x1080 with ease. Additional specs on my rig: 700 W PSU, Intel i7 920 @ 2.67 GHz, 8 GB DDR3 Corsair RAM. My rig is connected via DVI to HDMI to a 32" Sony Bravia XBR flat-panel TV (1080p). I don't have any problems with fuzzy text, though. Maybe it has something to do with the NVidia Control Panel?

You can always adjust your font size no matter what resolution your desktop is at. Right-click anywhere on your desktop and go to "Personalize" instead of your NVidia Control Panel. You'll see "Adjust Font Size (DPI)" in the right column; click it and a DPI Scaling dialog will appear where you can adjust your text size.
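If you want to double-check what that dialog actually set, here is a rough Python sketch of mine (not part of Windows or this thread; it assumes the classic per-user LogPixels value under HKCU\Control Panel\Desktop, where 96 means 100% scaling and 120 means 125%):

    # Read the per-user DPI value the "Adjust Font Size (DPI)" dialog writes.
    # Sketch only; assumes the LogPixels registry value (96 = 100% default).
    import winreg

    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
        try:
            dpi, _ = winreg.QueryValueEx(key, "LogPixels")
        except FileNotFoundError:
            dpi = 96  # value may be absent until DPI has been changed once
    print("Current DPI: %d (%.0f%% scaling)" % (dpi, dpi / 96 * 100))

You usually need to log off and back on (or reboot on Vista) before a DPI change fully takes effect.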
June 22, 2009 2:57:17 AM

I tried the DPI adjustment, but even slightly raising it made the font so big I couldn't fit a normal website on the screen at 1920x1080. I guess I'll just keep the desktop at 1680x1050. Not a real bother, but the higher res does sharpen things up nicely.
June 22, 2009 4:49:04 AM

xaira said:
The way video and text are displayed in Windows is different. Monitors are made to work with computer input; that's why you can get a 40" TV with speakers for the same price as a 30" monitor. I think it's the pixel pitch or something.

TVs are made to show video, so once you have video running, of course it'll look fine. Text inside video will also show fine, but plain text might be an issue.



Bro, I'm planning on getting a Core i7 with at least one GTX 275 and linking it to a 40-55 inch 1080p TV; not sure if it's plasma or LCD yet. So does that mean games and movies will look fine, but other normal computer stuff has a good chance of looking like ***? Because I want to run Windows at the full HD resolution...

Thanks.
June 22, 2009 5:08:33 AM

thegewch said:
Hi, I recently switched from a 22" Hi-Def PC monitor to a 32" LCD HDTV as my PC monitor. On the desktop, any resolution above 1600x1024 seems to make text "fuzzy" or hard to read, even though it very much sharpens all pictures/video etc. The Nvidia Control Panel supports up to 1920x1080, but at that setting the text is almost unreadable in most cases. This only applies to the desktop; games such as Left 4 Dead and Far Cry 2 look fine at any resolution, although I only play at 1920x1080. Not sure if there is any way around this or if the high pixel count is just shrinking the text too much.

Additional info:

GPU: nVidia GTX 285

Cable used is a DVI to HDMI

All HD Formats supported with the setup

Windows Vista Ultimate

By any chance are you using a VGA cable?
June 22, 2009 5:34:49 AM

You want to be using the PC input.

This usually means the VGA (analog) input, but some newer HDTVs now have an HDMI-PC input.

If it's a normal DVI or HDMI video input then you must choose 480i, 480p, 720p, 1080i or 1080p. Assuming you have a "1080p" (1920x1080) set, you would choose 1920x1080 and go into your ATI or NVidia settings to ensure overscan etc. is set up correctly so the screen fits perfectly.

Using the video input (not the PC input) is not ideal; you are stuck with that resolution and widescreen ratio only. In fact, if you had a 1360x768 HDTV there is no video setting that would scale correctly: "720p" (1280x720) scales to give fuzzy text, as does 1080p.

The best setup is HDMI-PC, provided you have full audio out of your HDMI output or a receiver to process the audio. "HDMI-PC" allows all resolutions to be used.

VGA-PC or HDMI-PC on a 1080p HDTV allows:
-800x600 up to 1920x1080 (including 1024x768, 1280x720 etc)
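If you want to confirm which modes your card is actually exposing over whichever input you end up on, here is a rough sketch of mine using the Win32 EnumDisplaySettings call from Python via ctypes (not from photonboy's post; the DEVMODE layout is truncated to just the display fields needed here):

    # Enumerate the display modes the driver reports for the primary display.
    # Rough sketch only; DEVMODE is truncated to the fields used below.
    import ctypes
    from ctypes import wintypes

    class DEVMODE(ctypes.Structure):
        _fields_ = [
            ("dmDeviceName", wintypes.WCHAR * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmPositionX", ctypes.c_long),
            ("dmPositionY", ctypes.c_long),
            ("dmDisplayOrientation", wintypes.DWORD),
            ("dmDisplayFixedOutput", wintypes.DWORD),
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", wintypes.WCHAR * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
        ]

    user32 = ctypes.windll.user32
    mode = DEVMODE()
    mode.dmSize = ctypes.sizeof(DEVMODE)
    seen = set()
    i = 0
    # A NULL device name means "the current primary display".
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
        key = (mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency)
        if key not in seen:
            seen.add(key)
            print("%dx%d @ %d Hz" % key)
        i += 1

Run it with the TV connected and compare the output against the list above; anything missing simply isn't being offered by the driver for that input.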
June 22, 2009 5:43:58 AM

Seerwan,

Look into the following:

1) A SONY LCD HDTV that has HDMI-PC input

2) The NVidia GTX 275 or other card that can take in SPDIF audio and output it via HDMI

3) an Auzentech card (Prelude or Forte X-Fi)

As far as I can tell, the above setup will give you full audio and video output from your HDMI connector, which would go into the HDMI-PC input and in turn allow different screen ratios for games (especially if a game is NOT widescreen).

You should leave the resolution at 1920x1080 always.

Not all games are ideal for HDTVs due to the viewing distance. Half-Life 2 or Crysis, sure, but Diablo III is best close up on a high-res monitor. The best monitor for playing Diablo III until OLED monitors come out is a 19" CRT @ 1600x1200 @ 85Hz. Second best is a 22/23" 1920x1080 or 1920x1200 monitor with 1000:1 contrast and a good viewing angle.
June 22, 2009 6:04:54 AM

photonboy said:
Seerwan,

Look into the following:

1) A SONY LCD HDTV that has HDMI-PC input

2) The NVidia GTX 275 or other card that can take in SPDIF audio and output it via HDMI

3) an Auzentech card (Prelude or Forte X-Fi)

As far as I can tell, the above setup will give you full audio and video output from your HDMI connector, which would go into the HDMI-PC input and in turn allow different screen ratios for games (especially if a game is NOT widescreen).

You should leave the resolution at 1920x1080 always.

Not all games are ideal for HDTVs due to the viewing distance. Half-Life 2 or Crysis, sure, but Diablo III is best close up on a high-res monitor. The best monitor for playing Diablo III until OLED monitors come out is a 19" CRT @ 1600x1200 @ 85Hz. Second best is a 22/23" 1920x1080 or 1920x1200 monitor with 1000:1 contrast and a good viewing angle.


Thanks, photon.
1 - I was thinking of Sony... all their TVs are LCD, which generally pairs better with PCs than plasma...
2 - OK.
3 - Is a dedicated sound card necessary? Won't the onboard sound be enough on what will be an Asus or Gigabyte mobo?

Hmm... great, thanks for the info.
June 22, 2009 2:58:04 PM

Try changing the DPI settings in Windows.
June 22, 2009 6:05:38 PM

To Xaira: yes, as stated above, I did adjust the DPI, but even a small increase made some text much too big while other text was unchanged.

And Stranger: no, I didn't say it looked like sh*t; it's more just a slight annoyance, but not nearly enough to make me want to jump back to the 22" monitor.

No, I'm not using a VGA cable; as I said, I'm using DVI to HDMI. Otherwise I can't see 1080p or other HD formats working over a VGA cable. And yes, as long as your video card has SPDIF support you can run full audio through HDMI; if so, don't forget to enable HDMI in the Playback tab in Sound.
June 22, 2009 7:12:25 PM

Update: After some more research I've found that my TV's specs say "Supports up to 1366x768," which I'm guessing would explain why higher resolutions become fuzzy. Oddly, the nVidia Control Panel claims my native res is 1920x1080 and keeps wanting to default to it. I've read about forcing a resolution, but I'm not sure if forcing 1920x1080 is the same as just setting it in the nVidia Control Panel; I just wasn't sure if the panel itself was already "forcing" the res.

If it is not forcing, would a forced res give text as clear as a lower res? Also, will forcing a res damage the monitor?
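Before forcing anything, it can help to confirm what resolution Windows is actually driving out of the card, as opposed to what the TV's manual claims. A minimal Python sketch of my own for that (just GetSystemMetrics via ctypes, nothing from this thread):

    # Print the desktop resolution Windows is currently outputting.
    # Sketch only; 0 and 1 are the SM_CXSCREEN / SM_CYSCREEN metrics.
    import ctypes

    user32 = ctypes.windll.user32
    width = user32.GetSystemMetrics(0)   # SM_CXSCREEN
    height = user32.GetSystemMetrics(1)  # SM_CYSCREEN
    print("Primary display is being driven at %dx%d" % (width, height))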
June 22, 2009 7:33:25 PM

Force it to 1366x768 and enjoy.

So people whose TVs actually support 1920x1080 should be fine.
June 22, 2009 7:56:44 PM

thegewch said:
Update: After some more research I've found that my TV's specs say "Supports up to 1366x768," which I'm guessing would explain why higher resolutions become fuzzy. Oddly, the nVidia Control Panel claims my native res is 1920x1080 and keeps wanting to default to it. I've read about forcing a resolution, but I'm not sure if forcing 1920x1080 is the same as just setting it in the nVidia Control Panel; I just wasn't sure if the panel itself was already "forcing" the res.

If it is not forcing, would a forced res give text as clear as a lower res? Also, will forcing a res damage the monitor?


I wouldn't use the force option (IMHO). I was playing around with it in the Catalyst Control Center (CCC, version 9.5); when I forced a certain resolution (not sure of the ? x ?), it screwed the CCC up. It took options like overclocking and Crossfire away from the CCC, and I had to uninstall CCC and reinstall it. Luckily I was going to update to version 9.6 anyway.

Maybe it's a problem with ATI drivers or CCC or just my rig, I'm not sure, but I wouldn't try to force anything. If a resolution is not an option, I wouldn't try to force it. BTW, it didn't hurt my Sony in any way, though. Although I'm sure it won't hurt your flat panel, it may screw up your NVidia Control Panel... just guessing.
June 22, 2009 10:54:10 PM

Well, 1920x1080 is an option in the control panel; that's what I have been running since I hooked it up, and that's what the nVidia Control Panel sets as my native res. I wasn't sure if the control panel forces the res, given that 1920x1080 is set as the default. I'm just not experienced with "forcing" any resolution.

Everything is totally readable/visible, no weird lines or anything of the sort. It's just slightly blurry and I didn't know if there was a workaround for that.

Although the TV supports "up to" 1366x768, that res is just much too big, especially on a larger screen; it feels more like 800x600.

June 23, 2009 2:47:49 AM

If you use HDMI and don't use your TV's native resolution, text will always be somewhat fuzzy, so use that res if you are doing anything besides playing games or watching videos. That native res does seem odd, though I don't know much about plasma, only LCD.
June 30, 2010 12:00:58 PM

Use your TV's native resolution. Anything else would make text appear fuzzy. Hope that helps.
June 30, 2010 4:04:17 PM

This is a common problem with HDTVs that have a high refresh rate (i.e. 120 Hz). Go to your TV setup and assign the digital input (HDMI) to PC interface mode. That will fix your blurred text problem.

New HDTVs with high refresh rates re-process the video data to enhance the image. Blu-ray discs or HD TV shows are recorded at 24 Hz 1080p, and the TV's video processor re-processes that data to enhance the video quality. This process is called interpolation.

For a PC interface, the required refresh rate of the HDTV is 60 Hz. No interpolation is required.

I have 3 HDTVs set up with 3 PCs (one on a GTX 280 - 73 inch, one on an 8800GT - 52 inch, one on a CF 4890 - 70 inch). On all 3 the HDTVs are set to PC interface mode, and all the 120 Hz or 240 Hz features of the HDTVs are disabled. That fixes the text blurring. All 3 setups use an HDMI interface to the HDTV.
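Just to put numbers on the interpolation point above (my own illustration of the arithmetic, nothing from this poster):

    # 24 fps film on a 120 Hz panel: each source frame spans 5 refreshes,
    # so the TV either repeats frames or interpolates new ones in between.
    film_fps = 24
    panel_hz = 120
    print(panel_hz // film_fps)      # 5 refreshes per film frame
    print(panel_hz // film_fps - 1)  # up to 4 interpolated frames per source frame

    # A PC desktop already runs at 60 Hz, which divides the panel rate evenly,
    # so PC interface mode can simply repeat frames with no interpolation.
    pc_hz = 60
    print(panel_hz // pc_hz)         # 2 refreshes per PC frame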

