I have a 32" Sony Bravia TV and a PC with an ATI Radeon HD 5700 series graphics card. I've connected them using a 12 m HDMI cable, but despite going through all the resolution settings, I can't get the two devices to cooperate. The PC recognises there are multiple displays, but the TV just shows a black screen.
The PC detects the Sony display and the TV display indicates that there is an input into the correct HDMI port. I'm at a loss as to what to do next.
Are you using the Radeon's HDMI port (in case the computer has multiple HDMI ports) and choosing the correct HDMI input on the TV?
In CCC, have you set the TV to duplicate or extend the display? Use duplicate while testing so you don't end up with a blank screen if something else is wrong; this may be a setting you have to apply before it will work.
The resolution you use should be the TV's native one. If you're not sure, Google your TV's manual for the correct resolution and refresh rate (Hz).
Yes, I'm using the HDMI port on the card. I've tried both of them and they both appear to work, insofar as they recognise they are connected to a Sony monitor. I am also choosing the correct input on the TV itself.
I have chosen the native resolution for the TV (which should be 1080 for this model) but have also experimented with different resolutions with no success. As for duplicating the display, I selected this option in the "screen resolution" window in the control panel as I couldn't locate the option in CCC. This too had no effect.
The setup is reporting and acting like there's a 2nd screen attached (for example the mouse can travel far off screen in "extend these displays" mode) but I'm getting zilch on the TV beyond an icon confirming an input in the source selection menu.
Could it be a TV software issue?
This is bizarre. There's really nothing to connecting TVs these days, especially a model like yours.
In Control Panel > Display > Screen resolution > Display: it lists the TV, and you've selected it and clicked Apply?
CCC may help, but I can't see why setting it up the way you've been trying wouldn't work. My two computers, one with a 5870 and this one with a GTX 580, both drive TVs set up exactly the way you're trying; I didn't need to use Nvidia's control panel or Catalyst Control Centre.
Unless, by mistake, the port you're connecting to on the TV is an HDMI out?
Definitely connected to the HDMI IN ports.
You can see from this image (http://imgur.com/QFncV) that the PC is recognising the TV; it even makes the "bo-doop" noise when I disconnect the HDMI cable from the TV.
I'm using Windows 7 and the firmware on the TV is up to date. All my ATI software and drivers are current too.
It's possible, depending on the cable quality. I know of people using a 15 m HDMI cable for their projectors with no problems.
Having a look around the net, the recommendation is to stay within 5 metres with cheaply made cables; beyond that the signal degrades and causes dropouts unless a signal booster is used.
I responded to someone here who was using 30 m worth of HDMI cable, and his picture would intermittently come and go. In his case the length still worked but obviously needed a booster, albeit he still had some sort of picture on the screen.
If you can, borrow or buy another lead to test with, as yours could be faulty; whichever you decide is best for you.
Having a read here, I'm not sure whether the symptoms with your TV are related or not, but it's worth a try.
Do you get sound through the tv if you select the sound playback on your computer to use the tv speakers?
Try turning the volume up too; this seems to have worked for someone in the above forum.
This quote is also from there: "Im not sure if you are doing this correctly: you have to set the TVdrive to HD / HDMI, then quickly switch your TV over to HDMI and using the remote confirm that you can see the picture - if you don’t do this within a certain period of time it will switch back to the SCART output"
Not sure if that will help; have a bit of a read, and it's probably best to check the TV's manual as well if you haven't already.
Thanks for the additional advice boju.
I connected a laptop to the tv using the same long cable and that worked like a dream. No issues, automatic recognition and screen extension. So it seems that the problem must lie with my graphics card (or other PC component).
Is it possible that the HDMI outputs have been disabled in some way? Perhaps there's a jumper that needs to be moved...?
You're welcome. I don't have much experience with all the display ports on a GPU, so I've been doing some reading to better understand it myself and help you more.
The confusion we had was to do with Eyefinity and the display limitation the 5-series cards have when the outputs consist of 2 DVI and 1 HDMI.
The HDMI port and one of the DVI ports share an internal clock (basically it's one connection with two heads on it), so they can't both be used at the same time. Which makes sense now that you've connected via the other DVI.
So if you want to use three monitors on your card, an active DisplayPort adaptor is needed.
Here is a list of AMD's Validated Dongles if you ever decide to buy one.
So I guess that of the two shared ports, DVI takes priority on the circuit it shares with HDMI, which doesn't have (or is denied) enough power to run both. An active adaptor gets its power from USB, and off it goes.