I'm sure this horse has been killed twice already, but hopefully somebody can point me in the right direction.
I recently purchased a KVM switch that uses HDMI/USB connections, as I wanted it to be "future-proof" (rather than getting a D-Sub switch). My monitor, a VGA/DVI-D model with HDCP support, was connected to my video card through a DVI-D (single-link) cable and worked perfectly. I then purchased a DVI-D (dual-link) to HDMI (m-f) adapter and hooked up the monitor both through the KVM switch and directly to the HDMI output on my desktop video card (Radeon 7700 series). When going direct to the video card, scattered green and magenta pixels flicker around the screen dynamically, relative to the image displayed. When I connect through the KVM switch, the flickering pixels appear to be green only, clustered around (correctly displayed) greys and blacks, and they are more numerous.
I understand there is a long docket of posts here about going from an HDMI source to a DVI display, but I can't help asking whether there is anything I can do to correct this (other than buy a new monitor). Would pulling the dual-link pins from the adapter fix the issue? It doesn't seem unreasonable to think the flickering results from conflicting data streams between the first and second data links. Or would pulling those pins break the whole thing and make the HDMI signal completely unreadable through a "single-link" connector?
I know the HDMI cable going into the monitor (via the adapter) works properly. Since there is a problem going direct from the computer's HDMI output through the adapter, and the flickering changes when I connect via the KVM switch, I can't tell whether one or both of those new components is faulty.
The expertise of those more qualified than I am would be greatly appreciated.