So, I have a 2-year-old monitor which does not have any kind of DRM (of course).
Now, if I upgrade my computer to Vista and get an HD DVD or Blu-ray drive, will I be unable to watch HDTV-quality movies on my monitor because it is old?? No matter which program I use to play the content?
Hmmm, I don't see how that should make any difference, the DRM infection is in the OS, not your monitor... N'est-ce pas??
So, are you saying that there is a way to legally play DRM-protected content on Blu-ray or HD DVD through the DVI port of my 3-year-old 51-inch HDTV? In full 1080i glory?
I thought the law required an HDMI port to play that content, so that the TV itself would work with DRM-protected content (what is it called? HDCP?). Are you saying that having a Vista computer can bypass this requirement?
I believe this is the difference - DRM is an umbrella term that covers the whole class of technologies that protect digital data, whether hardware- or software-based. Your monitor is not involved, but the cabling is.
HDCP is one possible component of DRM. Simply supporting HDMI cabling (which carries HDCP) makes a device part of the DRM chain. As far as I know, all electronic devices made in recent years that can play back digital media are supposed to support DRM to keep you from stealing encrypted content.
HDCP is hardware-supported digital protection - in this case over the DVI or HDMI cabling. One example of the problems with DRM is that not all versions of HDMI support HDCP correctly - I believe it is completely supported on HDMI 1.2 and above. That is one of the reasons many older HD televisions have problems with newer HDMI/DVI connections between devices - and why my old PC monitor at home does not work directly with my PVR, even though it accepts the cabling hardware. It doesn't read the signal correctly unless it goes through my PC card interface, which recognizes the newer HDMI version that correctly supports HDCP.
I have had no issues playing back videos I downloaded from the Internet, both encrypted and not, so I am not really sure what the fuss is about yet. There have been rumors that Vista reports all encrypted media back to Microsoft, but I don't believe that yet. However, I'm sure I'll find out soon enough.
HDCP's main goal is to prevent transmission of non-encrypted high-definition content. Three systems were developed to achieve it:
1. An authentication process prevents non-licensed devices from receiving HD content.
2. Encryption of the actual data sent over the DVI or HDMI interface prevents eavesdropping. It also prevents "man in the middle" attacks.
3. Key revocation procedures ensure that devices made by any vendor who violates the license agreement can be relatively easily blocked from receiving HD data.
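Step 3 is the simplest to picture: content ships with a list of revoked device identifiers, and a source must refuse to talk to any sink on that list. A minimal sketch, assuming hypothetical, made-up KSV values (the real list arrives as a System Renewability Message with the content):

```python
# Hypothetical revoked 40-bit KSVs, purely for illustration.
revoked_ksvs = {0x23DE5C43F2, 0x0A1B2C3D4E}

def sink_allowed(sink_ksv: int) -> bool:
    # A source must refuse to send HD content to any sink whose
    # KSV appears on the revocation list.
    return sink_ksv not in revoked_ksvs

sink_allowed(0x23DE5C43F2)   # -> False: this device has been revoked
sink_allowed(0x1122334455)   # -> True: not on the list
```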
Each HDCP capable device model has a unique set of keys; there are 40 keys, each 56 bits long...
...HD DVD and Blu-ray Disc players allow content providers to set an Image Constraint Token (ICT) flag that will only output full-resolution digital signals using HDCP.
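The key scheme from the excerpt above (40 private keys of 56 bits each, selected by the other device's public Key Selection Vector) is a Blom-style key agreement, and the arithmetic can be simulated in a few lines of Python. Note the symmetric master matrix here is just random data standing in for the licensing authority's secret - this is a sketch of the math, not an implementation of HDCP:

```python
import random

MOD = 2 ** 56  # HDCP v1 device keys are 56 bits long

def make_ksv():
    # A valid Key Selection Vector is 40 bits with exactly twenty 1s.
    bits = [1] * 20 + [0] * 20
    random.shuffle(bits)
    return bits

def make_master_matrix():
    # Stand-in for the licensing authority's secret symmetric 40x40 matrix.
    m = [[0] * 40 for _ in range(40)]
    for i in range(40):
        for j in range(i, 40):
            m[i][j] = m[j][i] = random.randrange(MOD)
    return m

def issue_device_keys(master, ksv):
    # Each device receives 40 private 56-bit keys derived from its KSV.
    return [sum(master[i][j] * ksv[j] for j in range(40)) % MOD
            for i in range(40)]

def shared_key(own_keys, peer_ksv):
    # Sum (mod 2^56) of your own private keys, selected by the
    # 1-bits of the other device's public KSV.
    return sum(k for k, bit in zip(own_keys, peer_ksv) if bit) % MOD

master = make_master_matrix()
ksv_a, ksv_b = make_ksv(), make_ksv()
keys_a = issue_device_keys(master, ksv_a)
keys_b = issue_device_keys(master, ksv_b)

# Both sides arrive at the same secret without revealing their private keys;
# symmetry of the master matrix guarantees the two sums are equal.
assert shared_key(keys_a, ksv_b) == shared_key(keys_b, ksv_a)
```

Because each device only ever publishes its KSV, the source and sink can authenticate each other (step 1 above) and derive the session secret used for the link encryption (step 2) in one exchange.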
I am more or less sure that neither my TV nor my monitor supports HDCP. Both are connected through the DVI output of the video card; the monitor then goes through a DVI-to-VGA converter.
So I still do not understand whether I will be able to see 1080 content on my monitor or on my TV, if it has that ICT flag which requires HDCP.
The same issue that affects graphics cards also goes for high-resolution LCD monitors. One of the big news items at CES 2007 was Samsung's 1920x1200 HD-capable 27" LCD monitor, the Syncmaster 275T, released at a time when everyone else was still shipping 24" or 25" monitors as their high-end product [Note F]. The only problem with this amazing HD monitor is that Vista won't display HD content on it because it doesn't consider any of its many input connectors (DVI-D, 15-pin D-Sub, S-Video, and component video) secure enough. So you can do almost anything with this HD monitor except view HD content on it.
So, both my monitor (VGA connector) and my HDTV (DVI-D) will be unable to play HD content through Vista. Crap!
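Putting the thread together, the player's decision comes down to two bits of state: whether the disc sets the Image Constraint Token, and whether the link to the display is HDCP-protected. A rough sketch of that logic, assuming the commonly described ICT behavior of constraining unprotected outputs to quarter resolution (960x540); the function name and exact constrained behavior are illustrative, not taken from any player's actual code:

```python
def output_resolution(ict_set: bool, link_hdcp: bool,
                      native=(1920, 1080)) -> tuple:
    # No ICT flag, or a protected link: full native resolution is allowed.
    if not ict_set or link_hdcp:
        return native
    # ICT set and no HDCP on the link: output is constrained to
    # quarter resolution (half in each dimension).
    return (native[0] // 2, native[1] // 2)

output_resolution(ict_set=True, link_hdcp=False)   # -> (960, 540)
output_resolution(ict_set=False, link_hdcp=False)  # -> (1920, 1080)
```

This matches the complaint above: with a VGA or non-HDCP DVI-D connection, any ICT-flagged disc would refuse to deliver full 1080 resolution, no matter which playback program is used.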