I don't understand this behaviour with HDCP: I have Windows 7 installed and a Mini DisplayPort to VGA adapter. When I connect my laptop to my LCD through VGA, it only gives me 640x480 resolution. Looking at the Nvidia panel, it tells me that my display doesn't support HDCP. I have two questions about this:
1. Isn't HDCP for protected content and for connections that support it (HDMI, DVI, etc.)? Why won't it let me show even the plain Windows desktop at a normal resolution? And why does it demand HDCP over a connection (VGA) that doesn't support HDCP?
2. The stranger thing is that if I connect the same computer with the same adapter to an old CRT monitor, I get 1280x1024 without problems...
Can somebody explain to me why this happens? Is it possible to solve?
r_manic, that was the first thing I tried. The only resolution available is 640x480. As I said, the Nvidia panel tells me the problem is incompatibility with HDCP, but I don't know why that would be a problem on a VGA connection...
Actually I've been wondering about that myself. I'm assuming your drivers for the graphics card are updated?
Yes, I have an updated driver. The thing that confuses me most is that on a CRT monitor it works normally... only when connected to the LCD (through the same Mini DisplayPort to VGA adapter) does it give me that resolution (and the message in the Nvidia panel saying the display doesn't support HDCP :S).
An HDCP signal has to be digital from start to finish; you are going from digital to analogue, and analogue is not HDCP compliant.
OK, the signal starts out digital from the Mini DisplayPort and is converted to analog by the Mini DisplayPort to VGA adapter, so, as you said, it cannot be an HDCP signal. If that's the case, why does Windows 7 require HDCP to connect the LCD through VGA? And why does that happen only with the LCD and not the CRT monitor?
I have a machine here that is running W7 and is connected to an LCD monitor via VGA, so I'm not really sure that I understand your question.
I think the problem is because of how I connect the LCD through VGA. I assume you have a VGA port on your graphics card, but in my case (a MacBook) I don't have a direct VGA port, only a Mini DisplayPort (which is, like you said, digital). So I use a Mini DisplayPort to VGA adapter to connect monitors or other displays through VGA.
The problem arises only when combining Windows 7, this adapter and the LCD... under Mac OS I don't have this problem, and in Windows 7 with a CRT monitor I don't have it either. As I said, the Nvidia panel tells me that my "display doesn't support HDCP", and I don't know why it says that ONLY with the LCD, even when the LCD is connected through an analog signal :S ...
It's got to be the DP adapter then; chances are you would not have the problem if that lappy had a VGA or DVI port.
I would be unconcerned if it were because of that (not happy, but at least not intrigued). Nevertheless, this only happens with the LCD, not the CRT, and that's what intrigues me most. If it is because of the DP adapter, then it doesn't make sense that it works with one display and not the other.
A CRT is analogue only, and an LCD can be either, depending on the port used.
Yes, but I connected through the VGA port, which is analog as I understand it. Is it possible that my LCD sends back to my laptop, through the VGA connection, some information that lets Windows know my LCD also has digital ports? Even in that case, I don't understand why Windows limits the resolution on the LCD when it is connected through an analog VGA connection.
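For what it's worth, a monitor can indeed report information back over a plain VGA cable: the DDC pins carry an EDID data block describing the panel, and one bit of that block says whether the display declares itself a digital or analog device, which may be part of what the driver is reacting to. Below is a minimal Python sketch that checks that bit, assuming you have dumped the monitor's raw 128-byte EDID block to a file; the edid.bin name is just a placeholder for wherever your dump ends up.

```python
# Minimal sketch: inspect a raw EDID dump and report whether the display
# declares a digital or analog input. The "edid.bin" path is hypothetical --
# dump the 128-byte EDID base block with whatever tool you prefer.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def describe_edid(edid: bytes) -> None:
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        print("This does not look like a valid EDID base block")
        return
    if sum(edid[:128]) % 256 != 0:
        print("Warning: EDID checksum is wrong (block may be corrupted)")
    # Byte 20 is the "video input definition": bit 7 set = digital input,
    # bit 7 clear = analog input. This is part of what the graphics driver
    # reads back from the monitor over the DDC lines, even on a VGA cable.
    if edid[20] & 0x80:
        print("The display declares a DIGITAL input in its EDID")
    else:
        print("The display declares an ANALOG input in its EDID")

with open("edid.bin", "rb") as f:  # hypothetical EDID dump
    describe_edid(f.read())
```

If the LCD's EDID comes back flagged as digital, or doesn't come through the adapter cleanly at all, that would at least go some way towards explaining why Windows treats the LCD and the CRT differently, even though the final cable is analog in both cases.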