HDMI not recognized so no external monitor + (HDCP)

vannoying

May 24, 2013
This is my system. Since I received it on 20 Jan 13, it has been afflicted with problems small and large. I originally contacted Dell to come out and fix it because I could not push a picture to the external HD monitor over HDMI. First it said the device was non-compliant, then it refused to see the monitor as connected at all.
I tried three different HDMI cables, and the monitor has three different HDMI ports, so some combination ought to work, but nada. The onsite tech replaced the graphics cards but still could not make it work. So they replaced the unit.
The second unit was also unable to move a signal out through HDMI, but at least it could see the monitor was connected. Onsite Tech replaced the cards (again) and broke my internal microphone array.
They sent a third unit. It is also incapable of seeing the monitor. But they also built it incorrectly, so the onsite tech must come and replace all the hard drives. I installed the OS to the SSD and went to NVIDIA for graphics drivers. The GTX 675M had an updated driver (320.18) issued the day of my service call, so that is the one I used. Suddenly, it works! I am connected over HDMI, everything registers as HDCP compliant, and I could not be happier.
The only thing I still needed to install was the Blu-ray codec/stupid bloatware, and then test it on the external monitor. But I really didn't want to be disappointed, so I waited until the next day, choosing instead to run around World of Warcraft on a 42" screen and kill stuff. And it was awesome!
Of course, you know what happens next.
I installed CyberLink PowerDVD and watched a Blu-ray movie (on the laptop screen). Then I connected it to the external monitor and the signal wouldn't transmit. Not only that, but the NVIDIA Control Panel now said the monitor was not HDCP compliant (when it had been the night before).
I rolled back the computer to before the PowerDVD install, but I still cannot get the external monitor working over HDMI.
I talked to NVIDIA; they suspect the HDMI-out port on the mobo.
Alienware insists it is the cards, and keeps throwing new cards at the problem.
I just want it to be fixed. Any constructive ideas?
--------------------------------------------------------------------------------
My rig: Alienware Model M18xR2
Total system memory: 32.0 GB RAM
System type: 64-bit operating system
Number of processor cores: 4
--------------------------------------------------------------------------------
Storage:
Total size of hard disk(s): 1164 GB
Disk partition (C:): 93 GB free (233 GB total), 250 GB SSD
Media drive (D:): BD/CD/DVD (no Blu-ray software installed ATM)
Disk partition (G:): 930 GB free (931 GB total), RAID 1
--------------------------------------------------------------------------------
Graphics:
Display adapter type: NVIDIA GeForce GTX 675M (2) in SLI
Total available graphics memory: 4095 MB
Dedicated graphics memory: 2048 MB
Dedicated system memory: 0 MB
Shared system memory: 2047 MB
Display adapter driver version: 9.18.13.2018
Primary monitor resolution: 1920x1080
DirectX version: DirectX 10
--------------------------------------------------------------------------------
Network:
Network adapter: Atheros AR8151 PCI-E Gigabit Ethernet Controller (NDIS 6.20)
Network adapter: Killer Wireless-N 1103 Network Adapter

You made it all the way down here!
Thank you!
I did give you the short version, I promise, and I am grateful for your insight.
--------------------------------------------------------------------------------
That seems strange.

Just a simple (maybe silly) question that might solve your problem.

Did you make sure it is listed as an available additional display?
Okay, easy tutorial:

Right-click on the desktop --> Screen Resolution --> Display --> see if there is a second display listed there. If yes, choose it, then Apply and OK.

Considering it is a Dell system, I think there should be Fn keys there. If so, press Fn + F1 (the display-switch key) and choose the right display. That works on my laptop, and I have not had a single problem connecting to my TV through HDMI.
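One more quick sanity check, not part of the steps above: you can ask Windows directly how many monitors it has enumerated. This is just a hypothetical helper sketch in Python using the Win32 `GetSystemMetrics` call via `ctypes`; it assumes a Windows system and returns None anywhere else.

```python
import ctypes
import sys

def monitor_count():
    """Number of active display monitors Windows reports, or None off-Windows."""
    if sys.platform != "win32":
        return None  # Win32-only API; this sketch assumes a Windows system
    SM_CMONITORS = 80  # GetSystemMetrics index for the active monitor count
    return ctypes.windll.user32.GetSystemMetrics(SM_CMONITORS)

if __name__ == "__main__":
    print(f"Active monitors: {monitor_count()}")
```

If this still reports 1 with the HDMI cable plugged in, Windows never enumerated the external monitor at all, which would point at the port, cable, or card rather than an HDCP handshake problem.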