I'm getting no signal on my screen when connected using a DVI-to-HDMI cable. HDMI to HDMI works fine. I am using a 32" LG 1080p LCD as my monitor, with two HDMI connections and one VGA. I'm running Windows 7 Ultimate 64-bit.
I was using the onboard graphics (VGA) before I received the card, then I brought in my HDMI cable from home until my DVI-to-HDMI cable arrived. Hooked up DVI from the card to HDMI on the TV and got absolutely nothing. The TV didn't even recognize the connection, which it usually does when any new input is plugged in.
Oh, and one more question: CCC doesn't have the option to change screen position/size, and the HDMI connection leaves a black border along the left side and top of the screen. Why is this no longer available? It seems like every version of CCC in the past had this option.
Could be an HDCP issue. Win7 64-bit is going to transmit with HDCP, and if the TV can't decode the content, it can't display it. Maybe something is lost in the DVI-to-HDMI cable you have. The cable you have IS HDCP compatible, but that doesn't mean it will work with the devices you have. That is my best guess.
As far as the borders go, I have the same issue when I connect my PC to my TV via HDMI. For some reason it defaults to a 30Hz refresh rate when I plug it in. Go into the display settings and make sure it is set to 1920x1080 at 60Hz.
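If you'd rather verify what Windows is actually outputting than trust the TV's info screen, here's a rough Python sketch (just my own quick check, nothing from CCC) that reads the primary display's current mode through the Win32 EnumDisplaySettings call. The DEVMODE structure is truncated to only the fields needed for resolution and refresh rate, which should be enough for this purpose.

import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently in use

# Truncated DEVMODEW layout: only the fields up through dmDisplayFrequency,
# which is all we need to read width/height/refresh of the primary display.
class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

mode = DEVMODE()
mode.dmSize = ctypes.sizeof(DEVMODE)
# None = primary display; fills "mode" with the currently active settings
ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(mode))
print("%dx%d @ %dHz" % (mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency))

If that prints something like 1920x1080 @ 30Hz, you've found the culprit and can bump it back to 60Hz in the display settings.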
Generally the HDMI port as well as DVI-2 are used for secondary monitors, as they are not active until the OS loads the drivers.
If you are sure you are using DVI-1 and you are still not getting a display, try rebooting into Safe Mode and see what happens.
Is the 32" LG being used as a secondary display or is it your only display?
It's the primary display. And are you speaking of the cable being DVI-1/-2, or the output on the card? I get so confused with all the different types out there, but it seems like I read that the output is DVI-1. I'll let you know when I get to work.
Yeah, actually I looked at your card again and I don't know where I got the idea it had two DVI ports from... that's my mistake. I just looked again and I am pretty sure your card has only a single DVI port, along with one HDMI and one VGA.
If your card does have two DVI ports though, be sure to try them both.
Like jay was saying, it could be an HDCP issue... something not making it through that adapter cable. I'm pretty sure the DVI port is HDCP compliant, and we know the monitor is, since it works through the HDMI cable. That really only leaves the adapter.
Thanks to both of you for helping. I have an old 2600XT at home with dual monitors (a 22" 1080p and a 42" 1080p LCD as secondary). It has dual DVI, and I'm running DVI to DVI for the monitor and DVI to HDMI for the TV. Everything runs flawlessly on that setup. I was just really surprised to get in here, hook the cable up, and have it just not work. I've used ATI all my life with no fuss, but then again this is my first Gigabyte card. I'm going to try switching the cable from my home setup and see if that does anything. If not, I'll switch my cards out and see if it's the TV. If that doesn't help, I'll probably just return it and go with something else. I'll let you know how the cable switch goes.
Good luck with that. So far I have been pretty lucky when it comes to monitors/graphics cards/DVI/HDMI, etc., so beyond my initial research into my purchases I haven't had to dig too deep into the inner workings.
Guys, I finally switched my cables out, and it was just the cable: the one I bought is bad. So no worries, next time I'll do my research before getting all worked up. Also, the border is an issue because the TV is only 720p native but accepts higher resolutions. Like I said before, thanks for all the input.