With my Dell XPS 1080p notebook, I'm accustomed to simply connecting an HDMI cable to a 1080p TV (Mitsubishi 46" LCD) and using it as a monitor. When I built my i5-2500K (ASUS P8Z68-V Pro) PC, I expected to be able to use it the same way, but the PC (tried under WinXP Pro SP3, Win7 x86, and Win7 x64) has never detected when the cable is connected.
I'm using the integrated Intel HD Graphics 3000 of the i5-2500K with Intel's HD Graphics Family driver (version 126.96.36.1995, dated 5/23/2011).
Anybody have any ideas of what I might have to do so the PC will detect when it's connected to the TV?
1) Right-click the desktop.
2) Click "Screen resolution".
3) If you see two screens labeled 1 and 2, click the other display and select "Extend these displays" from the pulldown menu.
4) If it does NOT detect the second display, make sure the TV is set to the right input and hit the Detect button.
Sometimes, depending on the TV, you cannot select an input until something is plugged in and active, so you may need to spam the Detect button while switching to the proper input.
Another tip: you can click and drag the 1 and 2 screens to either side of each other, so moving the mouse all the way to the right (or all the way to the left) carries it onto the other screen.
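If clicking through the Screen Resolution dialog gets tedious, Windows 7 also ships a small command-line tool, DisplaySwitch.exe, that forces the same projection modes as the Win+P menu. This is just a sketch of that approach; the flags below are the standard ones on Win7, but whether they help when the display isn't detected at all can vary.

```shell
:: Force Windows 7 to extend the desktop onto the second display
:: (same as choosing "Extend" in the Win+P projection menu)
DisplaySwitch.exe /extend

:: Other modes, for reference:
::   /internal  - primary display only
::   /clone     - duplicate the desktop on both displays
::   /external  - second display (the TV) only
```

Running `/extend` (or `/clone`) right after switching the TV to the correct input is sometimes enough to wake up a connection that the Detect button misses.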
The VGA and DVI connections work as expected, but they are committed to other devices (I have an older PC that has DVI output only), and that is not my objective here. I've counted on using the HDMI connection interchangeably with the XPS notebook, so that either of them could act as an HTPC when needed.
Using Windows 7 on the PC, whenever I go to Screen Resolution and select "Detect", it responds with "Another display not detected". I never needed a monitor driver to use the XPS notebook with the TV, so I wouldn't expect one to be needed for the PC either.
I've tried everything I can think of, including attaching the HDMI to my 1080p notebook and then moving the plug to the new PC as quickly as possible. Even when the HDMI to the TV is the only display connected, the PC acts like there is no display at all.
The only other thing I can think of is that this is some kind of driver issue, but I have no idea what driver would work better than the one I'm using.
I experienced the same problem with my i5-2500K build, using the integrated graphics and an HDMI TV connection as a second display. Even if I keep the HDMI cable connected between my PC and my TV, the PC acts as if the HDMI is not connected after I turn the TV off and on again. The only way I can recover the connection so the PC detects the TV again is to unplug and replug the HDMI cable at the motherboard's HDMI port. I use an MSI Z68A-GD55, if that info helps.
I've been having the very same problem, although with different hardware.
I had just built an HTPC using a Zotac Fusion E-350 mITX motherboard and tried hooking it up to a Sony 46" 1080p LCD over HDMI. The boot screen shows an image, but when Windows loads, the graphics driver kicks in and the screen goes black (no signal). Catalyst cannot detect the TV.

Using a DVI-to-HDMI adapter works, but in the cramped space I often lose a color channel because the cable comes a little loose from being bent hard, which turns the whole image purple. Reseating the connector then makes Windows lose the sound, which only comes back after logging out and back in again. Imagine the pain when that happens during an exciting movie!

I sent the TV to the repairman (Sony), only to get it back a few weeks later with a note saying it works fine with a Sony BD player. Updating the TV's firmware didn't help, and neither did updating Catalyst or Windows (Vista Home Basic, just because I had a copy lying around).
Finally I ended up swapping the TV for a 50" 720p plasma, which worked like a dream. All I miss is the extra resolution, and the plasma does pick up slight burn-in when used as a computer screen.
Solutions are appreciated.