
PC to TV Connection problem:

Last response: in Graphics & Displays
July 17, 2010 9:28:13 PM

I've been using my laptop, a Linux box and another Windows XP computer for over a year to watch things on my TV and to even just use my Samsung HDTV as a monitor at times. It's worked a treat.

I've recently decided to hook up the TV to my other computer. Here is the problem:

I'm connecting the PC to the TV via a VGA cable, the same one that I have been using to connect the other computers, so the cable is fine.

The TV picks up the PC loading screens (Manufacturer info screen, Windows XP loading screen, etc), however once the PC boots into the desktop, the TV doesn't show anything and it says "Mode not supported".

Now as far as I understand, from my previous issues with another computer, this usually means that the resolution the computer is set to is not supported by the TV.

How can I set the resolution to automatically adjust to the TV's resolution, or at least to a resolution that is suitable for my TV and my needs? The TV is a 40-inch LCD HDTV (1080p). The video card is an ATI Radeon HD 3450.

Any other ideas or advice to solve my problem are greatly appreciated in advance. Thank you.
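A sketch of the matching logic the poster is after, in Python. The mode list here is illustrative, not read from any driver; the rule of thumb is to prefer the TV's native resolution and otherwise the largest mode with the same aspect ratio, so nothing gets stretched.

```python
# Sketch: pick the display mode that best suits a 1080p TV.
# The mode list is illustrative, not queried from any real driver.

def best_mode(modes, native=(1920, 1080)):
    """Prefer the TV's native resolution; otherwise the largest
    mode with the same aspect ratio, to avoid stretching."""
    if native in modes:
        return native
    aspect = native[0] / native[1]
    same_aspect = [m for m in modes if abs(m[0] / m[1] - aspect) < 0.01]
    candidates = same_aspect or modes
    return max(candidates, key=lambda m: m[0] * m[1])

modes = [(800, 600), (1280, 1024), (1360, 768), (1600, 1200), (1920, 1080)]
print(best_mode(modes))       # (1920, 1080): native mode is available
print(best_mode(modes[:-1]))  # (1360, 768): closest 16:9 fallback
```

In practice Windows learns the mode list from the TV's EDID, so if the TV is reporting correctly, picking its native mode in Display Settings does the same thing.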


July 17, 2010 9:28:38 PM

Alright, so I've messed around with Catalyst Control Center and things are working now. However, the available resolutions aren't good. 1600x1200 is nice, but the text is stretched, a bit blurry, and seems out of focus.

The other usual resolutions (1280x1024, 1360x768, etc) work but again, they make everything too big and it feels like I'm on 800 x 600.
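The complaint above has a simple geometric explanation: 1600x1200 and 1280x1024 are 4:3 and 5:4 images being scaled onto a 16:9 panel, so the horizontal and vertical scale factors differ (stretching) and are fractional (blurring). A quick Python check of the ratios:

```python
# Why 4:3 and 5:4 modes look stretched and blurry on a 1920x1080 panel:
# the scaler applies different, non-integer factors to each axis.
from math import gcd

panel = (1920, 1080)  # native pixel grid of a 1080p LCD

def aspect(w, h):
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

for w, h in [(1600, 1200), (1280, 1024), (1360, 768)]:
    sx, sy = panel[0] / w, panel[1] / h
    print(f"{w}x{h} ({aspect(w, h)}): x{sx:.2f} horizontal, x{sy:.2f} vertical")
```

1600x1200, for example, gets stretched 1.20x horizontally but squeezed 0.90x vertically, which matches the "stretched and blurry" description; 1360x768 is nearly 16:9, so it only suffers the non-integer interpolation.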

I've just remembered that I have an HDMI cable which I use with my DVD player and so I am now trying to hook up the computer to the LCD TV via that. The computer has an HDMI port and so does the TV.

I've hooked up both ends to their respective ports and rebooted the computer a few times, but when I go through the "source" options on my TV and get to HDMI, I'm not having any luck. I should mention that the HDMI port on the PC is not on the video card but on the motherboard by itself.

It says "Searching for signals" and after several seconds, "No Signal".

Any help? Please and thank you.
July 17, 2010 9:30:13 PM

EDIT - I see you got it working.

July 17, 2010 9:32:27 PM

I'm still having issues, as I cannot figure out whether or not the TV supports PC input over HDMI.

What I've decided to try is to buy a DVI-to-HDMI adapter and plug it into my PC, then hook up the HDMI cable from the adapter to one of the HDMI ports on the HDTV. I would be able to use the 1920 x 1080 resolution via this method, right? The HDTV supports that resolution as it's a full HDTV, and so does the video card (it has a DVI port).
July 17, 2010 9:41:49 PM

I have a 1080p Samsung TV, and even with a DVI (PC) to VGA (TV) connector I can get 1920x1080 resolution. I don't think switching to an HDMI cable will change anything. Maybe it's just a settings issue, whether in the computer or the TV.
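The claim that VGA can carry 1080p checks out on paper: VGA is analog, so the real limit is the pixel clock the card's DAC and the TV's input will accept. A rough Python estimate, using the 2080x1111 total frame of the commonly published reduced-blanking timing for 1920x1080 @ 60 Hz (those blanking totals are an assumption, not something from this thread):

```python
# Approximate pixel clock = total pixels per frame x refresh rate.
# 2080 x 1111 is the total frame (active + blanking) of the commonly
# used reduced-blanking timing for 1920x1080 @ 60 Hz.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(2080, 1111, 60)
print(f"1920x1080 @ 60 Hz needs roughly {clk:.1f} MHz")
```

That works out to roughly 138-139 MHz, well within what desktop cards of that era could drive over analog, so when 1080p over VGA fails it is usually the TV's EDID or scaler refusing the mode, not the cable.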
July 17, 2010 9:45:05 PM

So it's possible to get 1920 x 1080 resolution via a VGA hookup from the PC to the HDTV? Because I've tried that; it was the first thing I tried, and I couldn't get it to work. Through the CCC it wouldn't allow me, and when I did it via the Display Settings window (right-click on desktop, etc.), I would get the "no signal" error message.
July 17, 2010 9:54:38 PM

Yup, I just checked; my other display is running at 1920 x 1080.
July 17, 2010 10:02:29 PM

And it's not blurry or smudged at some parts of the screen?

Then it should be possible to do it with my Samsung HDTV as well. As I said previously, I tried setting the resolution via 2 different ways, and both ways wouldn't work, resulting in the "Mode Not Supported" error message.

Edit: I also have another, separate issue.

My monitor's max resolution is 1440x900, which I use. On one of the HDDs, which runs Vista, it used to let me set the refresh rate to 75Hz, and it worked fine. Several months ago I installed a 2nd HDD to run XP Pro, and I could do the same there. However, I've just gone back to the 1st HDD to use Vista, and I can no longer set the refresh rate to 75Hz. Both HDDs are still plugged in, and I just boot into either whenever I need to. I can set 75Hz at other, lower resolutions, but not at 1440 x 900. The result is fuzzy parts on the monitor.

Any help on this issue as well?
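For context on the refresh-rate question: raising the refresh rate raises the pixel clock the card must drive, and drivers will hide rates the monitor's EDID doesn't advertise at that resolution. A rough Python comparison (the flat 25% blanking overhead is a simplifying assumption, not a real timing calculation):

```python
# Rough comparison of the pixel clock needed at 60 Hz vs 75 Hz.
# The flat 25% blanking overhead is a simplifying assumption,
# not a real CVT/GTF timing computation.

def approx_clock_mhz(w, h, refresh_hz, blanking_overhead=0.25):
    return w * h * refresh_hz * (1 + blanking_overhead) / 1e6

for hz in (60, 75):
    print(f"1440x900 @ {hz} Hz: ~{approx_clock_mhz(1440, 900, hz):.0f} MHz")
</n```

So 75 Hz needs roughly 25% more bandwidth than 60 Hz at the same resolution, and whether the driver offers it at 1440x900 comes down to what the monitor reports it can accept there.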
July 17, 2010 10:13:41 PM

No, it's very crisp at that resolution.

I'll have to look into the other issue.
July 18, 2010 9:16:29 AM

Most TVs don't go higher than 1366x768 over VGA; some do, a lot do not, and I'd say yours does not. Read the manual for the TV and get a DVI to HDMI cable. The HDMI on your motherboard is probably disabled since you have a dedicated card.
July 18, 2010 2:57:29 PM

get hdmi
July 18, 2010 3:40:31 PM

daship said:
Most TVs don't go higher than 1366x768 over VGA; some do, a lot do not, and I'd say yours does not. Read the manual for the TV and get a DVI to HDMI cable. The HDMI on your motherboard is probably disabled since you have a dedicated card.


The highest I've been able to set the TV resolution to through VGA is 1600x1200. I guess the only option left is to buy a DVI to HDMI cable or the DVI adapter and hook up the HDMI cable to that.

As for the motherboard HDMI port being disabled, is there a way of finding out if that's true, or of activating it somehow? Otherwise it's just a waste sitting there.
July 19, 2010 10:03:26 AM

So I've read the manual, and it covers setting up the TV as a PC monitor via VGA, HDMI, DVI, etc.: how to set the resolution on the PC, which port to plug into, and even a table of the resolutions supported by each port. Thus, before purchasing the DVI to HDMI adapter, I'm going to try one last time: first the VGA cable to get the native 1080p resolution, then the HDMI cable through the separate HDMI port.

After a small Google search, it turns out that quite a few people are having the same issue as me with the HDMI port. Some are suggesting that it is disabled because a dedicated graphics card is installed. Can this really be the case?

I would really like to activate/enable the HDMI port since it would make things a lot easier. Any suggestions?
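A side note on that per-port resolution table in the manual: the PC learns the same information electronically from the TV's EDID block, where each "standard timing" is packed into two bytes. A small Python decoder, assuming the EDID 1.3 encoding (the sample bytes are the well-known encodings of 1920x1080 and 1280x1024 at 60 Hz, not data from this TV):

```python
# Decode EDID "standard timing" descriptors (EDID 1.3: two bytes each).
# Horizontal pixels = (byte1 + 31) * 8; bits 7-6 of byte2 give the
# aspect ratio; bits 5-0 give the refresh rate minus 60.

ASPECTS = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}

def decode_standard_timing(b1, b2):
    h = (b1 + 31) * 8                  # horizontal active pixels
    aw, ah = ASPECTS[(b2 >> 6) & 0x3]  # aspect ratio field
    v = h * ah // aw                   # vertical pixels from the aspect
    refresh = (b2 & 0x3F) + 60         # refresh rate in Hz
    return h, v, refresh

print(decode_standard_timing(0xD1, 0xC0))  # (1920, 1080, 60)
print(decode_standard_timing(0x81, 0x80))  # (1280, 1024, 60)
```

This is why renaming the input to "PC" matters on some Samsung sets: it changes which mode list the TV advertises on that port, so the card offers different resolutions.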
July 19, 2010 11:34:13 AM

Ok, after trying one last time, here are the results; it works. I can get 1920 x 1080 @ 60Hz using my VGA cable. However what's surprising is that it works only if I boot into the HDD that has Vista. It won't work on the HDD with XP Pro SP3. I thought this was possible with XP Pro as well?

Now, I'm wondering whether I should buy the DVI to HDMI adapter as I do plan on watching movies/TV shows with this setup. Will I notice a significant difference?

As for the HDMI route, I can't seem to get it to work. I've done everything listed in the manual (renaming HDMI IN 2 to PC, etc.). I guess that option is out the window, which sucks, since it leaves a useless HDMI port on my motherboard that I can't use.

July 19, 2010 3:19:39 PM

Your on-board HDMI port is probably disabled when a discrete graphics card is installed. Try removing the graphics card to re-enable your on-board HDMI port and connect to the HDTV using that.

-Wolf sends