I don't normally post on forums but this is a question which is rather difficult to google an answer for.
I recently built my own computer and it's working beautifully; the only thing I'm having trouble with at the moment is the HDMI output. I've considered a lot of the hardware side, but the funny thing is that I think it may be a driver or Windows issue.
Today I installed my new graphics card (HD7850), which has a built-in HDMI port. Unfortunately my monitor, a Samsung BX2335, only has sockets for DVI, so as I've done in the past I went out and bought an HDMI-to-DVI adapter for the cable.
When I boot up the computer, the motherboard splash screen flashes as normal and the Windows 7 logo loads, but beyond that point the screen goes black and my monitor starts flashing between Digital/Analog in the top-left corner. Before installing the ATI drivers (to get decent resolutions) I could boot into Windows over HDMI with no problem whatsoever, but after installing them I get the same problem I had when I tried to run HDMI from my motherboard's onboard graphics. In Safe Mode I also have no issues running through HDMI.
I've used this same kind of cable and adapter before to hook my old non-HDMI graphics card up to my TV, with no problems. It just seems like Windows or my monitor doesn't like high resolutions through the HDMI port?
Just wondering if I can get any closure on this, I've probably overlooked something obvious.
Well, the problem can't just be the graphics card, because I'm also using an Asus P8Z77-V Pro motherboard, which has its own HDMI port, and I get the same problem there. I actually thought maybe the refresh rate or something was wrong, but I set my resolution to the lowest setting, tried again, and still got a black screen. It seems to be a common problem with the Samsung BX2335 monitor, which I feel ripped off by; I honestly thought a modern-day LED monitor would have a built-in HDMI port by now.
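One angle I've been poking at: the Digital/Analog flashing usually means the monitor is getting a signal it can't lock onto, so it can be worth checking which modes the monitor actually advertises in its EDID when connected through the adapter. Here's a minimal sketch that decodes the legacy "established timings" bitmap from a raw EDID dump (you'd need to dump the EDID to bytes first with a third-party tool; note this only covers the legacy VGA modes, not the detailed timing descriptors where the native 1920x1080 mode lives):

```python
# Decode the "established timings" bytes (35-37) of a 128-byte EDID block,
# per the VESA EDID 1.3 layout. Assumes you already have the raw EDID bytes.

ESTABLISHED_TIMINGS = [
    # byte 35, bit 7 down to bit 0
    "720x400@70", "720x400@88", "640x480@60", "640x480@67",
    "640x480@72", "640x480@75", "800x600@56", "800x600@60",
    # byte 36, bit 7 down to bit 0
    "800x600@72", "800x600@75", "832x624@75", "1024x768@87i",
    "1024x768@60", "1024x768@70", "1024x768@75", "1280x1024@75",
]

def established_modes(edid: bytes) -> list:
    """Return the legacy modes the monitor claims to support."""
    # Every valid EDID block starts with this fixed 8-byte header.
    if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID block (bad header)")
    # Pack bytes 35 and 36 into one 16-bit bitmap, MSB first.
    bits = (edid[35] << 8) | edid[36]
    return [mode for i, mode in enumerate(ESTABLISHED_TIMINGS)
            if bits & (1 << (15 - i))]
```

If the list that comes back over the HDMI-to-DVI path differs from what the monitor reports over plain DVI, that would point at the adapter mangling the EDID rather than at Windows itself.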