Closed

1080p TV, HDMI to HDMI, connected to PC = bad quality

Last response in Graphics & Displays
July 29, 2009 9:16:57 PM

Hey Guys.

So I just got myself a new 40" Samsung LCD TV and want to plug it into my PC, so I'm able to watch Blu-ray movies in full HD on the big screen. But as usual it comes with problems. As the topic says, I'm getting extremely low quality on the TV.

I'm using an HDMI to HDMI cable. My video card is a GeForce GTX 295, with 2x DVI ports and 1x HDMI port.
My television has 3x HDMI inputs and a DVI.

So I tried pretty much everything. I've been reading several topics where people had the exact same problem, but unfortunately none of those few answers have worked for me yet.

The picture is very distorted and the colors are flattened a lot. The text itself is painful to look at, but readable. Around all the text there is some sort of white outline. No matter what resolution I set, it remains, and so does the bad quality.

My tv is a Samsung LE44B535 Full HD 1080p.

I hope some of you know what could cause this problem, because otherwise it would be a bit of a waste of money :)

Thanks in advance.
July 29, 2009 9:41:21 PM

Are you sure you set the resolution to 1920X1080?
July 29, 2009 9:48:16 PM

dzeric said:
Are you sure you set the resolution to 1920X1080?


I'm 100% sure. Both the Nvidia Control Panel and the Windows display settings report it as 1920x1080.
July 30, 2009 12:03:16 AM

How is your PC connected to your TV: HDMI, DVI, VGA, S-Video?
What playback software are you using for Blu-ray or HD movies?

Detail your hardware and OS so you will get a more definitive response from this forum.

Getting high-quality 1080p video and a PC workspace on a Samsung HDTV or LCD display is almost a sure thing. Something "unique" in your setup is producing the unacceptable results.
July 30, 2009 5:56:11 AM

Maybe you could try another HDMI cable. I have hooked my computers up to both my 47" Philips and my Samsung LN52A750 without any issues, using both my i7 with an HD4670 and my e8400 with both an HD2400 and an HD3870. The main reason I use ATI cards is the on-card sound (to get the passthrough on my i7 I would lose a slot, since it needs a riser card to use the onboard audio, and I have plans for my one empty slot now).
But with all those combinations I never had any problem with a poor picture, and my main PC is connected to my 47" Philips with a 25' HDMI cable in my bedroom.

One thing I will say is that print looks better on the Philips, while everything else looks far better on the Sammy. The picture on the Sammy is just plain great playing 1080p material. I really hope you get it figured out, and once you do I am sure you will be very satisfied.

But other than trying a different HDMI cable, I really can't think of any solution.
July 30, 2009 9:30:45 AM

leon2006 said:
How is your PC connected to your TV: HDMI, DVI, VGA, S-Video?
What playback software are you using for Blu-ray or HD movies?

Detail your hardware and OS so you will get a more definitive response from this forum.

Getting high-quality 1080p video and a PC workspace on a Samsung HDTV or LCD display is almost a sure thing. Something "unique" in your setup is producing the unacceptable results.


Thanks for the answer.

The cable is connected via HDMI to my TV, so it's an HDMI-to-HDMI cable. I also tried all 3 HDMI inputs, but got the same result.

I don't have a Blu-ray player yet, but I was hoping to buy a Blu-ray drive for my computer instead of a standalone Blu-ray player, since the one for the computer is much cheaper. For HD movies I use Windows Media Player; I don't really know any other free players that are just as good or better.

PC spec is:

CPU: Intel i7 920 (Socket 1366), 4.8 GT/s
GPU: 1792MB DDR3 GeForce GTX 295 (x16 PCI Express 2.0)
Motherboard: Gigabyte EX58-UD3 (Intel X58) (x16 PCI Express 2.0)

RAM: 6GB DDR3 1333MHz, triple-channel (3x2GB Kingston)
Optical drive: 22x SATA DVD burner +/- and DVD-RAM
PSU: 750W "gaming" power supply, silent, 17dB
Hard disk: 1000GB (1TB) Samsung, 7200RPM, 32MB cache

and my OS is Windows 7 RC build 7100. I do have Vista Premium on my laptop, but unfortunately no HDMI output from its video card :/ and its screen can only run 1280x880.

I also tried switching between the Dynamic, Standard and Movie picture modes. When using the Movie mode I get a better picture, but then the picture is just way too dark.



To Ancient_1:

I doubt it's the HDMI cable, since I bought it a week ago. Besides, the signal is fine; it's just the picture that's awful. I also tried connecting my previous TV to my computer (though that was also my previous computer) and it worked fine. That was with a DVI output instead of HDMI to HDMI.
July 30, 2009 1:19:00 PM

"My tv is a Samsung LE44B535 Full HD 1080p." I could not find this model on the Samsung website. Can you check again and provide the TV model?

July 30, 2009 1:31:11 PM

As Ancient_1 suggested, there is no harm in trying another cable. How long is your cable? I'm using a 25 ft HDMI 1.3 cable and I don't have any issues with it on multiple cards. I have tried the following video cards on my 70" HDTV (and different LCD displays) and have no issues at all at 1080p (8800GT OC 512, GTX 280 OC, ATI 3870, ATI 4870, ATI 4890 XOC).

Have you tried the DVI ports of the GTX 295 to your TV?

Again, please reconfirm the model of your Samsung HDTV.
July 30, 2009 3:11:24 PM

leon2006 said:
As Ancient_1 suggested, there is no harm in trying another cable. How long is your cable? I'm using a 25 ft HDMI 1.3 cable and I don't have any issues with it on multiple cards. I have tried the following video cards on my 70" HDTV (and different LCD displays) and have no issues at all at 1080p (8800GT OC 512, GTX 280 OC, ATI 3870, ATI 4870, ATI 4890 XOC).

Have you tried the DVI ports of the GTX 295 to your TV?

Again, please reconfirm the model of your Samsung HDTV.


Thanks for the answer :) 

My cable is 5m (approx. 15 ft) long. I could try another cable, but they're rather expensive here in Denmark. I'll try to find one I can borrow :)

I did as you said and tried the DVI ports, and it works like a charm; the picture is perfect. But now the TV is also the only monitor running, so my computer monitor is turned off. It was also automatically set to 1920x1080 the moment I plugged it in.

Regarding my TV, I made a mistake and wrote the wrong number :) The correct model is LE40B535; 40 instead of 44. My bad :)

But as far as I know, HDMI should be a newer and better technology than the old DVI, so isn't it weird that DVI works but not HDMI?
July 30, 2009 4:22:46 PM

Do you have a DVI-HDMI adapter? You stated that your video card has a built-in HDMI port. I just want to see if you can make it work through this connection.
July 30, 2009 4:45:22 PM

leon2006 said:
Do you have a DVI-HDMI adapter? You stated that your video card has a built-in HDMI port. I just want to see if you can make it work through this connection.


Yes, I do have a DVI to HDMI adapter. I tried plugging it in, but still the same problem: the picture is unclear and not good at all.
July 30, 2009 4:58:13 PM

You said your rez is set to 1920x1080, but what refresh rate are you running at? 60Hz, 30Hz?
July 30, 2009 5:27:01 PM

rwpritchett said:
You said your rez is set to 1920x1080, but what refresh rate are you running at? 60Hz, 30Hz?


Running at 60Hz. I also tried different rates (50Hz, 25Hz, 30Hz, etc.) with no change.
July 30, 2009 9:08:37 PM

Try a new HDMI cable. Since you said DVI works but HDMI doesn't, it could be cable-related.
July 31, 2009 8:51:33 AM

leon2006 said:
Try a new HDMI cable. Since you said DVI works but HDMI doesn't, it could be cable-related.


Right, I'll try a new HDMI cable. I'll try to borrow one from someone :) I'll report back to say whether it worked or not.

Thanks for the help leon :) 
August 1, 2009 2:17:32 PM

Okay, so I tried another HDMI cable, this one only 2.5 m long. The same problem still occurs. What could possibly be causing it then? Could the TV itself be faulty?

August 2, 2009 1:57:50 AM

Can you verify which setup works:

PC -> TV:
- DVI -> DVI @ 1080p @ 60Hz / 24Hz / 30Hz
- HDMI -> HDMI @ 1080p @ 60Hz / 24Hz / 30Hz
- DVI-HDMI -> HDMI @ 1080p @ 60Hz / 24Hz / 30Hz

I'm not sure of the scan frequency in Europe. My PC lists multiple settings for NTSC and PAL. The PAL settings available in my drivers are as follows:

1080p40 PAL, 1080p25 PAL... Your TV may be PAL. Can you try this out?

Can you confirm that your TV has a DVI input? I have a hard time understanding the data sheet from the web; it's not in English, but there is no DVI in the datasheet that I found. Please reconfirm.

Do you have another monitor or TV to try your PC on?

What driver version are you using? Have you used nTune to fine-tune your setup?
August 2, 2009 4:33:18 AM

I would suggest that you play with the NVIDIA Control Panel. Some settings to look at:

Digital Color Format: make sure it is RGB and not YCbCr.
Resizing the HDTV desktop: make sure no scaling is being applied.

You might also run through the Windows wizard for the ClearType settings.

These are all things that I had to fix on my ATI card and now my display looks great.

If this doesn't work, I would also play with the TV's color/contrast/brightness settings (change each manually; don't be stuck with the presets). I'm doubtful that it's the cable, because it's a digital signal: if anything is getting through at all, it is getting through perfectly.
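On the RGB-vs-YCbCr point: one plausible reason YCbCr output can look worse for desktop text is chroma subsampling, where the display path stores color at a lower resolution than brightness. A minimal sketch (hypothetical 8-pixel scan line; 2:1 horizontal subsampling assumed, as in 4:2:2) of how that smears a colored text edge:

```python
# Sketch: why chroma subsampling blurs colored text edges.
# One scan line's chroma (Cr) values: "red text" samples (240) on gray (128).
cr = [128, 128, 240, 240, 128, 128, 240, 128]

# 2:1 horizontal chroma subsampling: keep one averaged sample per pixel pair.
sub = [(cr[i] + cr[i + 1]) // 2 for i in range(0, len(cr), 2)]

# The display reconstructs full width by duplicating each stored sample.
up = [v for v in sub for _ in range(2)]

print(cr)   # original line: hard transitions at the glyph edges
print(up)   # reconstructed line: the lone 1-pixel stripe is averaged away
```

The wide two-pixel stripe survives the round trip, but the single-pixel stripe comes back as an intermediate value (184) spread over two pixels, which is exactly the colored fringing people see around small text.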
August 3, 2009 1:59:55 PM

leon2006 said:
Can you verify which setup works:

PC -> TV:
- DVI -> DVI @ 1080p @ 60Hz / 24Hz / 30Hz
- HDMI -> HDMI @ 1080p @ 60Hz / 24Hz / 30Hz
- DVI-HDMI -> HDMI @ 1080p @ 60Hz / 24Hz / 30Hz

I'm not sure of the scan frequency in Europe. My PC lists multiple settings for NTSC and PAL. The PAL settings available in my drivers are as follows:

1080p40 PAL, 1080p25 PAL... Your TV may be PAL. Can you try this out?

Can you confirm that your TV has a DVI input? I have a hard time understanding the data sheet from the web; it's not in English, but there is no DVI in the datasheet that I found. Please reconfirm.

Do you have another monitor or TV to try your PC on?

What driver version are you using? Have you used nTune to fine-tune your setup?


Thanks for the answer.

It seems to work with all the resolutions you wrote down.

Regarding the DVI input on my TV: it's not DVI, it's a VGA input, so the connection is DVI to VGA. But if the signal coming from the DVI output of my GPU is the same as the HDMI one, then it's something I can live with. The sound doesn't matter at all, since I have a surround stereo connected to the computer.

I'm afraid I don't have another monitor or TV to try, but I did try another PC. It was a laptop with an HDMI output, and the same problem occurred: low picture quality at the 1920x1080 resolution.


I'm using the newest Nvidia driver for the GTX 295, but I do not have nTune to fine-tune my setup. Do you think that could do the trick?

To Katsushiro.

The color format has been set to RGB. Did you use the settings on your tv or onl your GPU driver? Because the setup provided by Nvidia isn't much of a help. For instance, when trying to change the color in the settings for my tv, absolutely nothing happens. So those configurations aren't to much help :) 

August 3, 2009 9:10:57 PM

santais said:
Thanks for the answer.

It seems to work with all the resolutions you wrote down.

Regarding the DVI input on my TV: it's not DVI, it's a VGA input, so the connection is DVI to VGA. But if the signal coming from the DVI output of my GPU is the same as the HDMI one, then it's something I can live with.


WTF???

Dude, you're talking in circles. Scratch HDMI and DVI: is your TV using a 15-pin D-sub connector (VGA)? You wouldn't use a DVI cable for that; you'd use a VGA cable. You would put the adapter block (DVI-I to VGA breakout) on the video card side, not the TV side. And you'd have to set your frequency correctly if the PC doesn't detect it correctly.
August 4, 2009 8:04:36 PM

Yes, the input on my TV is a 15-pin D-sub connector. Correct me if I'm wrong (which I most likely am): the signal from the DVI output on my GPU is digital, but since the other end of the cable is VGA, it is converted to an analog signal. Then inside the TV there is a converter which turns the analog signal back into a digital one? So perhaps I could use a DVI to VGA cable?

When I tried using the VGA cable, it automatically set the frequency to 50Hz and the resolution to 1920x1080, which is exactly the configuration it should have. But I get a completely different picture when using the HDMI port.

I also got an answer from Samsung. They told me it could be my GPU, or that the correct signal isn't being sent. As mentioned earlier, I have tried another computer and another HDMI cable, so that can't possibly be the problem. What are the odds that two computers and two HDMI cables are all broken?
August 5, 2009 1:57:15 AM

Scratch the VGA connection altogether. In your original post you said you were using an HDMI cable on both ends; why would you deviate from that?
August 5, 2009 8:24:01 AM

Katsushiro said:
Scratch the VGA connection altogether. In your original post you said you were using an HDMI cable on both ends; why would you deviate from that?


I've tried pretty much everything you guys suggested here and it just doesn't seem to work. So if it works the other way around, with the same picture quality, that would be fine as well. But of course it would make things a lot easier if the HDMI worked :)

August 7, 2009 1:58:11 PM

I have exactly the same problem, but I have an ATI card and the TV is a Samsung LE40A656.
August 8, 2009 10:24:06 PM

I am also experiencing the exact same problem. I have tried it on 3 different HDTVs, all 42" (one Samsung and 2 LGs), and 2 different ATI cards, a new 4850 and a new 4850x2, in 2 completely different computers that were new. I made sure all the settings were at 1920x1080 on both computers, and the TVs are all 1080p. All of the TVs have VGA ports and 3 or 4 HDMI ports.

The HDMI works wonderfully for HD quality when using an Xbox, but when I try to use the DVI-to-HDMI adapter with the cards, it gives me an absolutely horrible image no matter what I change or do. On all 3 TVs and both computers with different cards, the VGA port (using a DVI-to-VGA adapter) works perfectly and the screens are perfectly clear.

The only reason I am interested in using HDMI is that I am hoping for a better image, but I am not sure it can even get any better than the VGA one, because that looks about perfect. I figure there is just some setting I am missing in all of these instances that needs to be changed for it to come out at least as good as the VGA does. I have read about 12 forums now, of people with these problems and people with no problem at all, and still no one has given a definite answer.
August 20, 2009 8:35:48 PM

I had the same kind of problem with an LE32B535. With a DVI-HDMI -> HDMI connection the picture was bad at 1080p and 720p resolutions.

What fixed it for me was going to the Source menu, from there to Tools, and on to Edit Name. There I changed the "name" of the HDMI connection I was using to DVI PC.

Not sure if it does anything with a straight HDMI-HDMI connection, but I hope it helps someone.
August 28, 2009 9:35:22 AM

I just bought a new Samsung P2370HD.
My PC has 2 DVI ports, so I used one of them to connect to the monitor with a DVI-DVI cable. Crystal clear!

Meanwhile, I also have a laptop with an HDMI connection, and when I use an HDMI-HDMI cable to connect to the monitor, the image is horrible!

I don't understand why, but I did what you said, editing the name to DVI PC, and the image quality is much better! Not perfect yet, though: strangely, I get black borders all around the screen, so the image is a little compressed, even though I'm sending 1920x1080 from the laptop and should have full screen.

I made another test to check whether something was wrong with the signal sent by the laptop's HDMI:
using the same HDMI-HDMI cable as before, I put a small HDMI-DVI converter on one side and connected it to the DVI input of the monitor. Again, crystal clear!

Conclusion: there is something wrong with the way the monitor receives the HDMI signal. I will try to talk to Samsung support.
September 25, 2009 11:29:57 PM

Are you sure all the video components are HDCP compliant? I have read that if any part of your video path does not support HDCP, the quality is reduced by 70%.
September 27, 2009 3:51:09 PM

cruzj said:
I don't understand why, but I did what you said, editing the name to DVI PC, and the image quality is much better! Not perfect yet, though: strangely, I get black borders all around the screen, so the image is a little compressed, even though I'm sending 1920x1080 from the laptop and should have full screen.


Try going to ATI's Catalyst Control Center and from there to your TV's settings -> Scaling Options and pushing the slider to 0%.
Anonymous
October 12, 2009 12:58:41 AM

I am getting the same issue...

DVI out to VGA (analog signal): I get good resolution and full screen at 1920x1080.

DVI out to HDMI: much better color and contrast, but 1/2" black borders all around the screen. Because the resolution is being scaled smaller in every direction and not matching the display resolution 1:1, text is painful to read. Photos and video look great because of the contrast, but I can't use it. The TV is displaying 1920x1080 @ 60Hz (though it can do 120).

Video card is an HD 4850 1GB
All cables are monster cable
DVI to HDMI adapter is the one shipped with the video card

Anonymous
October 20, 2009 6:42:02 PM

I'm using a Gateway FX 7805u gaming laptop with HDMI out. Connected HDMI-to-HDMI to a 42" plasma screen it looked crisp and perfect, but when connecting

to a Samsung projection TV's DVI like this (laptop > HDMI-DVI adapter > DVI on projection TV), I get poor-quality video.

I tried every setting on both the PC and the TV and reconfigured everything; I can't read the text unless I zoom in far. Tried 1080/720/480p (progressive); it's horrible.

Sorry I was no help. Please respond to my post if you have the same setup.

October 21, 2009 7:44:55 AM

I have the same problem, but I'm using a Westinghouse W32001 20-Inch LCD. I connected it to my PC and the quality is horrible; the font is hard to read. Please let me know how I can fix this!
I'm currently using an NVIDIA GeForce FX 5200 video card.
Thanks
October 25, 2009 12:40:12 AM

For my ATI 4850 card I had trouble with video and movies. Then I installed the software from the AMD site called Avivo Video Converter (same place you get your CCC and driver; scroll down), and after that it worked really well. Apparently it moves the video processing from the CPU to the GPU.
October 25, 2009 9:43:35 PM

"What fixed it for me was going to the source- menu and from there going to tools and further on to Edit Name. Over there I changed the "name" of the HDMI connection I was using to DVI PC."

This seemed to work for me. Samsung UN-32B6000 with DVI-HDMI cable.
November 6, 2009 1:11:01 PM

I am also having the same problem. I have a Samsung Full HD 40", which runs sweetly at 1920x1080 with my ATI 3800 in games and so on, but as soon as I connect via a DVI adapter from the GPU to HDMI on the TV, my desktop is painful to look at. When gaming I don't seem to notice any loss in picture.

I have also tried everything. I guess Samsung don't want to talk their TVs down, but I think it could be our TVs. My housemate uses HDMI to his 32" Panasonic and at first had bad quality, but he fixed it using options on his TV that I don't have.

I may well be wrong, but until I can find a solution I am going back to using the VGA cable with the DVI adapter on the GPU.


Good luck
November 18, 2009 4:08:32 PM

Quote:
I am getting the same issue...

DVI out to VGA (analog signal): I get good resolution and full screen at 1920x1080.

DVI out to HDMI: much better color and contrast, but 1/2" black borders all around the screen. Because the resolution is being scaled smaller in every direction and not matching the display resolution 1:1, text is painful to read. Photos and video look great because of the contrast, but I can't use it. The TV is displaying 1920x1080 @ 60Hz (though it can do 120).

Video card is an HD 4850 1GB
All cables are monster cable
DVI to HDMI adapter is the one shipped with the video card


SOLUTION:
The black-border problem is solved in the graphics card settings. Look for the overscan setting and play with it; you're more than likely to solve that problem in less than a minute!

// BoJaKa
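For what it's worth, the arithmetic behind that overscan border is simple. A small sketch (the 5% underscan value is a hypothetical example; the actual slider value depends on the driver) of how a 1080p desktop ends up in a smaller centred box with black bars around it:

```python
# Sketch: what a GPU "underscan" compensation setting does to a 1080p desktop.
# With underscan > 0, the card renders the desktop into a smaller centred box
# and the TV shows black borders around it.
def underscan_box(width, height, percent):
    """Return (box_w, box_h, border_x, border_y) for a given underscan %."""
    box_w = round(width * (1 - percent / 100))
    box_h = round(height * (1 - percent / 100))
    return box_w, box_h, (width - box_w) // 2, (height - box_h) // 2

print(underscan_box(1920, 1080, 5))  # roughly 48 px side bars, 27 px top/bottom
print(underscan_box(1920, 1080, 0))  # no borders: 1:1 pixel mapping
```

Setting the slider to 0% restores the 1:1 pixel mapping, which is why text sharpens the moment the border disappears.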
November 18, 2009 8:08:08 PM

santais said:
Hey Guys.

So I just got myself a new 40" Samsung LCD TV and want to plug it into my PC, so I'm able to watch Blu-ray movies in full HD on the big screen. But as usual it comes with problems. As the topic says, I'm getting extremely low quality on the TV.

I'm using an HDMI to HDMI cable. My video card is a GeForce GTX 295, with 2x DVI ports and 1x HDMI port.
My television has 3x HDMI inputs and a DVI.

So I tried pretty much everything. I've been reading several topics where people had the exact same problem, but unfortunately none of those few answers have worked for me yet.

The picture is very distorted and the colors are flattened a lot. The text itself is painful to look at, but readable. Around all the text there is some sort of white outline. No matter what resolution I set, it remains, and so does the bad quality.

My tv is a Samsung LE44B535 Full HD 1080p.

I hope some of you know what could cause this problem, because otherwise it would be a bit of a waste of money :)

Thanks in advance.



PROBLEM 2 SOLVED!!!

HDTV text with white outline... No more...

Set Picture Mode to Dynamic on your TV. Turn Sharpness all the way down to zero and set HDMI Black Level to Normal/Default. Set Backlight to 5 or so, and the white marking is history!!!

Best regards // Kallamamran
December 9, 2009 8:08:51 PM

RE: Kallamamran

He is absolutely correct.

The determining factor was the ability to pass the signal through. On an LG TV this is set in the Picture Aspect Ratio, which we implicitly set to 16:9. However, that still scales the picture and gives you fuzzy, unclear text; it oversamples the picture.

If you are using a PC, use the "Scan Thru"/"Pass Thru" setting on the TV. This will not up/down-convert the incoming HDMI signal.

The Sharpness control is critical for the clarity of text; otherwise you'll see bleeding and over-saturation of the text and mouse cursor. As suggested, turn down Sharpness to meet your clarity needs.


Mark Pahulje
December 14, 2009 2:30:15 AM

I had the exact same problem with my 37" Samsung Series 5 1080p HDTV. Here is how I solved it!

While the TV is on the HDMI input, go into the TV's menu, select "Input" -> "Edit Name", and set the HDMI1/DVI mode to PC. This instantly made my TV screen as good as a PC monitor!

hope this Helps
December 14, 2009 3:03:56 AM

Quote:
I am getting the same issue...

DVI out to VGA (analog signal): I get good resolution and full screen at 1920x1080.

DVI out to HDMI: much better color and contrast, but 1/2" black borders all around the screen. Because the resolution is being scaled smaller in every direction and not matching the display resolution 1:1, text is painful to read. Photos and video look great because of the contrast, but I can't use it. The TV is displaying 1920x1080 @ 60Hz (though it can do 120).

Video card is an HD 4850 1GB
All cables are monster cable
DVI to HDMI adapter is the one shipped with the video card


Hey buddy, have you solved this yet? I have the same problem. Could you please post how you did it?
January 1, 2010 10:33:07 AM

Well, I'm not the only one... Okay, here is some information supporting the claim that this is a problem with the display or monitor (specifically Samsung).

The following is what I used for testing....

Two test monitors
1) Samsung 52-inch LCD (Model # LN-T5265F)
2) Samsung 23-inch monitor/TV (model # P2370HD)

Two video cards
1) XFX ATI Radeon HD 5770 (Two DVI connectors, and one HDMI)
2) EVGA 730i Motherboard with onboard nvidia Video (one DVI, one HDMI, and one VGA)

Three HDMI cables
1) Monster
2) NGEAR
3) The one shipped with my Playstation 3

Both video cards produce exactly the same results on both displays/monitors with each HDMI cable. With a direct HDMI-to-HDMI connection from the video card to the display, the image quality is very poor; in fact, the worst I've ever seen. The VGA connection even looks better. The DVI connection is perfect (you could even say beautiful :p ). Everything seems distorted over HDMI, in all supported display modes.

On my ATI card I was able to get rid of the black border by adjusting the overscan percentage to zero (Catalyst Control Center -> "Graphics" menu -> "Desktops and Displays" -> right-click your display at the bottom of the dialog and click "Configure").

The Nvidia scaling options are disabled when the resolution is set to the native resolution of the display, so there were no black borders. (Actually, they seem to be disabled permanently, but that's another issue...)

Either way, the scaling options do nothing for the image quality. By the way, "Menu" -> "Input" -> "Edit Name" and changing to DVI PC does NOT fix the problem. The brightness and vibrance diminish drastically, and this can NOT be corrected by adjusting the display settings: not Dynamic, Normal, Movie, brightness, contrast, black level, etc. You will also notice that the video you're seeing is not 1920x1080, as you might think according to your display settings. I'll elaborate...

In another experiment, I had the two computers connected simultaneously to the same display, with the resolution set to 1920x1080 32-bit true color on both cards. On the DVI connection, which was perfect, the icons, text, and toolbar were displayed at a smaller physical size than on the HDMI connection. The actual size being displayed is not 1920x1080, even though that's what the display setting says. The size of the text, icons, toolbar, etc. looked like 1680x1050; I say that because when I reduce the resolution on the DVI connection to 1680x1050, the icons, text, and toolbar seem to be the same physical size.

It seems to me (keep in mind I'm no expert) that the monitor/display is taking the 1920x1080 HDMI signal, shrinking it to some lower resolution, and then stretching it back out to fill the screen. I believe this is what's causing the distortion/poor image quality. I know that sounds pretty crazy, but that's what it seems like to me. It seems like there's a bug in the TV/display software; everything is controlled by software, and there are always bugs in software. I think Samsung needs to hire some new programmers or better testers, and retrain their tech support while they're at it. This is insane! It could be the GPUs, but I doubt it.
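That shrink-then-stretch hypothesis is easy to simulate. A rough sketch (nearest-neighbour resampling assumed; one scan line of single-pixel stripes stands in for worst-case text detail; the 1680 intermediate width is only the guess above, not a measured value):

```python
# Sketch of the hypothesis: a 1920 -> 1680 -> 1920 resampling round trip.
def resample(row, new_len):
    """Nearest-neighbour resample a 1-D pixel row to new_len samples."""
    old_len = len(row)
    return [row[i * old_len // new_len] for i in range(new_len)]

# A 1920-wide scan line of 1-pixel-wide alternating stripes.
row = [i % 2 for i in range(1920)]

round_trip = resample(resample(row, 1680), 1920)

# Count pixels the round trip gets wrong; single-pixel detail cannot survive
# being squeezed through a lower intermediate resolution.
diff = sum(1 for a, b in zip(row, round_trip) if a != b)
print(diff)  # most stripe pixels come back wrong
```

A direct 1920 -> 1920 pass is lossless, so any internal rescale of this kind would fully explain sharp DVI output and mangled HDMI output at the "same" resolution.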

The "scan through" or "pass through" setting that Markus_Hooge mentioned sounds like something that might work, but unfortunately neither my Samsung TV nor my monitor seems to have that option :(

I think I read in another forum that this wasn't a problem under Windows XP; can anyone confirm? I haven't had a chance to try XP yet. Not that I would be interested in running XP on my HTPC...

PLEASE, PEOPLE: READ THROUGH THE FORUM BEFORE POSTING GENERIC SUGGESTIONS THAT HAVE ALREADY BEEN MADE. THAT DOESN'T HELP. SORRY IF I SOUND A LITTLE HARSH...

CHEERS!
January 3, 2010 10:01:33 AM

I have the same *%#@&#!!! problem , HDMI > HDMI, DVI > HDMI = poor quality , DVI > VGA = beautiful

TV:
Samsung LN46A650 46-Inch 1080p 120 Hz LCD HDTV

PC:
- EVGA X58 SLI Classified Intel X58 Chipset SLI/CrossFireX DDR3 Mainboard
-12GB (2GBx6) DDR3/1800MHz Triple Channel Memory Module
- ATI Radeon HD 5870 PCI-E 16X 1GB DDR5 Video Card (DirectX 11 Support)
...

... I want to see if HDMI can give better image quality (color, contrast, resolution, etc.), but it seems that's not the way to go with Samsung TVs.

The question is: if anyone has all connections (HDMI>HDMI, DVI>HDMI, DVI>VGA) working perfectly, which one is the best? Then I can either find another HDTV that works fine
or keep the Samsung and the VGA connection.

PS: sorry if my English grammar sucks :)

Thanks!
Anonymous
January 12, 2010 7:16:21 AM

After hours of playing around with ATI/TV settings, I've solved this problem, using an HDMI-to-HDMI lead from the ATI card to the Samsung TV.


The manual states that if you plug a PC into the TV, you MUST use HDMI/DVI slot 3. So I did this.
I then reset all my ATI card settings to default, which detects a 1920x1080 60Hz TV.
I then went to the TV settings and changed the name of HDMI/DVI 3 from "---" to "PC".

That's it. After hours of fiddling... sorted!

Anonymous
January 21, 2010 7:04:01 PM

Hey guys, I had the same issue here on a Samsung 2333HD, but it's nothing but an easy fix:
you just press the Source button, then Tools, and name your HDMI/DVI device PC or DVI PC.
Then your desktop should be just fine and fit perfectly on your screen.
Of course you can always adjust the screen size in the screen options, but whatever ;)
Anyway, hope that helps ;) :D
Anonymous
January 24, 2010 1:58:31 PM

Hi Guys,

Just encountered the same problem here, and wanted to say thanks, because the advice posted here got it sorted for me. I also thought I would share my own experience in case it helps anyone.

I have a 40" Samsung full HD TV, working fine at 1080p with my 360 and producing a great 1920x1080 display when plugged into my Dell laptop using a VGA cable.

Anyway, I got a new Dell laptop with an ATI graphics card and HDMI output. So I set the laptop next to the TV and plugged the HDMI cable into the laptop and then into the HDMI input on the side of the TV (HDMI4 in this instance).

Not what I expected to see: the display was awful and had a black border around it. So I did a bit of googling and came across this page; I tried some suggestions, but no joy until I got to the last few posts.

I unplugged the HDMI cable from HDMI4 and plugged it into HDMI3. Still awful. Then I went to edit the name of the input, and as soon as I scrolled down to "PC", bingo: immediate improvement, although the black bars around the sides became even bigger! I went into the graphics properties of the ATI card and changed the overscan to 0%. It fits perfectly; I just watched an HD video and it looks amazing.

Thanks again for the help !!!!!!!
January 28, 2010 5:11:03 AM

Quote:
went to edit the name of the input


Where do you edit the name of the input? I'm using Catalyst 9.9 and I don't see that option anywhere. Care to upload a screenshot for me if possible? Thanks :)
Anonymous
February 4, 2010 6:24:50 AM

Using the ATI control panel: right-click the bottom icon representing the LCD TV, click Configure, and under Scaling set it to 0% and tick the radio button below it. It should work.
February 26, 2010 12:34:59 AM

chapazzo said:
Quote:
went to edit the name of the input


Where do you edit the name of the input? I'm using catalyst 9.9 and I don't see that option anywhere. Care to upload a screenshot for me if possible? Thanks :) 


The name he is editing is on the monitor/HDTV itself.

FWIW, it solved my issue (combined with turning the scaling off).

Thanks to all!
March 4, 2010 1:30:46 PM

Hey All -

I spent a lot of time VERY frustrated with this same problem. Here is my story and what eventually worked for me.

I started with a Radeon 48xx card, and I could get no good results with anything but DVI -> VGA, using the VGA connection on the TV. With some tuning (which the Samsung menu provides quite a bit of), the screen was perfectly usable: not quite as sharp as a digital connection, but really quite good. I was able to use the monitor for several months at 1920x1080 over VGA.

I recently upgraded to a Radeon 58xx card and decided I would make another run at configuring the screen. I tried pretty much all of the suggestions in this thread and nothing worked. Here is what did work, for me.

I used a DVI -> HDMI cable and connected it to the HDMI1 input. The port used IS important; what clued me in was the default label of the HDMI1 port being "HDMI1/DVI". Once I did this, I adjusted the overscan in the CCC to make sure the picture filled the screen and, like magic, the text is crystal clear. It was a total pain because my screen is mounted on the wall, but necessary: none of the other ports yielded good results.

The only other tuning I did was to turn down the backlight.

My picture is now "perfect".

Good luck all.
Anonymous
March 14, 2010 6:14:30 PM

I didn't read the whole thread, but I had the same problem, and it took me some research to find the fix.

I'm posting it here because this was the first result on Google.

Connect your PC to your screen with HDMI. In the menu where you choose the input (e.g. HDMI 1), select Change Name and choose DVI PC or similar. Now it works!!

Hope this helps someone...

Greetings from the Netherlands!