
DVI->HDMI cable prob (related to both ATI & Nvidia)

Last response: in Graphics & Displays
December 2, 2009 10:40:48 PM

I'm posting this in a separate thread as requested:
Quote:
Please create your own thread, your issue is not related to the one experienced by the ATi cards in the thread ...

That was in the thread where I replied to the post by "jwsiegel" that starts with "One step forward, two steps back".

I am experiencing similar symptoms with an Nvidia GTX 260 card as jwsiegel was with his ATI 4870 card. Both cards have DVI outputs. Both of us are connecting to displays that take an HDMI input.
In my case, I'm using a Viewsonic 1920x1200 monitor that was advertised as being DVI compatible. They included a DVI->HDMI cable in the box with the monitor. Note -- my graphics card, made by BMI, also came with a DVI->HDMI cable in the box, indicating that they also support connecting the graphics card to an HDMI display.

Jwsiegel mentions in his post having black borders, something he called underscanning. Note, this *isn't* black 'bars' above and below an image like one might see on an extra-wide movie being inset into a display with a lower aspect ratio, but a black border all the way around the outer edge of the display. In my case, it's about 0.5-0.75" (0.5" on top and bottom, 0.75" on the right and left sides). It sounds like he was able to use a custom resolution of 1824x1032 to get around his problem and get a complete image (though smaller than it should be).
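For a rough sense of how much desktop that border hides: assuming a typical 24" 1920x1200 panel at roughly 94 DPI (the exact Viewsonic model and pixel pitch are my assumption, not stated in the thread), the border measurements above translate to pixels like this:

```python
# Rough estimate of how much of the desktop the black border hides.
# ASSUMPTION: a typical 24" 16:10 panel at ~94 DPI; the thread doesn't
# give the exact monitor model or pixel pitch.
dpi = 94
border_tb_in = 0.5    # inches of border on top and bottom
border_lr_in = 0.75   # inches of border on left and right
px_tb = round(border_tb_in * dpi)   # pixels hidden at top and at bottom
px_lr = round(border_lr_in * dpi)   # pixels hidden at left and at right
visible_w = 1920 - 2 * px_lr        # visible desktop width
visible_h = 1200 - 2 * px_tb        # visible desktop height
print(px_lr, px_tb, visible_w, visible_h)  # 70 47 1780 1106
```

That lines up loosely with jwsiegel's 1824x1032 workaround resolution: the display is cropping roughly 5-8% of the picture on each axis, which is classic TV-style overscan behavior rather than a resolution mismatch.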

In my case, there is no resolution that works -- I can use a smaller resolution, but then I get thicker black bars. I tried all the way down to ~800x600 with similar results. Note -- I'm NOT trying to watch movies on this; I am trying to use it as a computer monitor. Its native resolution is 1920x1200 -- a PC resolution -- so its primary, native use is as a PC display, though it *only* has video inputs (HDMI, S-Video, Component and Composite).

When I have the black borders, the display is cut off in those areas. Windows "thinks" it has the full 1920x1200, so I just don't get to see the outer edge of my desktop (including the taskbar and notification icons) when it's in this mode. I.e., it's rather unusable. What I noticed -- and this may be specific to the use of a DVI->HDMI cable, or to some monitors that are not properly decoding audio, I'm not sure -- but what is interesting to me, is that it happens on both ATI HD cards and Nvidia GTX cards using a *DVI->HDMI* cable to this kind of monitor.

Additionally, I noticed that it is specifically related to the audio. Nvidia's control panel has an option to DISABLE HDMI audio. When I disable it, I get a full display -- no black border. (I don't care about the audio, as I have a separate audio card for digital sound.) As further proof that it's the audio: my audio card (a Creative SB derivative by Auzentech) has the ability to add HDMI sound to an HDMI video input. It has 2 HDMI ports (in/out), and can take an HDMI video signal as input, add 7.1 HD Dolby or DTS sound, and send it out the other HDMI port. You can then run that to something that can handle both.

My monitor IS NOT one of those things -- in fact, if I turn off the audio in the Nvidia card (which, as you'll remember, gives me a clean, full 1920x1200 display with no black borders), but then run the signal through the Auzentech card (which only adds sound to the HDMI stream), I AGAIN end up with the black border around the edge of my monitor. So in my case, it's the audio being turned on, over a DVI->HDMI cable to my HDMI monitor, that causes this problem.


Maybe the signal is corrupted by trying to send it out over a DVI port. But then the drivers (ATI & Nvidia) should detect that the installed graphics card doesn't have an HDMI output (both cards showing this problem only have DVI outputs; neither has an HDMI output!). I don't know if it's an MS bug, since the hardware drivers of both manufacturers SHOULD know the outputs of the card they are driving -- but the EDID of my monitor indicates it should be able to handle stereo sound. My guess is that trying to put sound out over a DVI cable results in a corruption of the video, and no valid sound (I do get static out of the built-in speakers, which I usually disable because of this problem).
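For what it's worth, the stereo-sound claim comes from the monitor's EDID: HDMI-capable displays carry a CEA-861 extension block, and byte 3 of that block has a "basic audio" flag (bit 6) right next to an "underscan supported" flag (bit 7). A minimal sketch of checking the audio flag from a raw EDID dump -- the fake EDID bytes at the bottom are illustrative data, not from any real monitor:

```python
# Sketch: check whether a monitor's EDID advertises "basic audio".
# Per CEA-861, the extension block with tag 0x02 sets bit 6 of byte 3
# when the sink supports basic (2-channel LPCM) audio; bit 7 of the
# same byte reports underscan support.
def supports_basic_audio(edid: bytes) -> bool:
    ext_count = edid[126]  # base block byte 126 = number of extensions
    for i in range(1, ext_count + 1):
        block = edid[128 * i : 128 * (i + 1)]
        if len(block) == 128 and block[0] == 0x02:  # CEA-861 extension
            return bool(block[3] & 0x40)            # bit 6: basic audio
    return False

# Minimal fake EDID: a base block reporting one extension, plus a CEA
# block with the basic-audio bit set (illustrative data only).
base = bytearray(128); base[126] = 1
cea = bytearray(128); cea[0] = 0x02; cea[3] = 0x40
print(supports_basic_audio(bytes(base + cea)))  # True
```

This skips the checksum and descriptor parsing a real EDID tool would do; it's only meant to show where the audio bit the drivers are reading actually lives.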

I don't know if my note is off-topic or unrelated, but the symptoms sure sound similar. Right now, I'm *hoping* for a driver fix from Nvidia to keep the audio off -- since any video mode change re-enables the audio in the HDMI stream (even though their control panel still says 'off'). To re-disable it, I go to their control panel and try to *re-enable* audio over HDMI. There is NO screen flicker or blanking as I apply the "new"[sic] settings. But it then comes up with a notice that my display settings have changed, and do I want to keep them? (and it begins a 10-15 second countdown). To this I answer **NO**. __Then__ my screen blanks, flickers, etc., as it switches out of the "audio-enabled-HDMI" mode back to the "audio-disabled-HDMI" mode.

When I say that audio-enabled-HDMI is reset to on after any video mode change, I mean in situations like: coming out of standby, any power cycling, any disconnect/reconnect of the cable, or any resolution change done by an application (like most games!!). I have to go to the Nvidia control panel and go through the reset process described above after every reboot, or if I let my monitor "go to sleep", or anytime I turn it on. I also can't play most games -- as soon as I enter one (say, Dragon Age, for example), it switches the screen to a lower resolution. Even if I later configure the game to use the native 1920x1200, it still resets the video upon entry "just to make sure" the graphics HW really is in the desired mode. So once it resets my screen, I'd need to Alt-Tab out to fix it in the Nvidia control panel, but when I Alt-Tab back into the game, it resets the mode again, undoing my change!

If anyone else has any ideas on a work-around on this, I'd enjoy hearing them, but at this point, I'm sorta screwed.

Note -- this wasn't a problem with MS's Nvidia driver under XP, as it (I presume) disabled the audio. But the latest drivers on Nvidia's website don't work under XP (32-bit) either. Under Vista-64 OR Win7-64, the drivers installed by MS show the same problem.

Note -- as mentioned earlier -- the same symptom shows up on ATI graphics cards that have DVI outputs where the end-user is using a DVI->HDMI cable to attach some type of HDMI-input based display.

I didn't know that DVI supported audio -- but an Nvidia engineer told me that it does. News to me. Maybe it doesn't support it "well"?

Linda





March 3, 2010 4:55:02 AM

I have this exact same problem. I just downloaded the latest drivers (March 2, 2010) from Nvidia, and now the 'trick' of resetting the HDMI audio doesn't do anything to reset the video, so it doesn't fix the problem at all.

Just wondering if you or anyone else found a fix to this problem. I will be rolling back to the previous Nvidia drivers and continuing to use the workaround trick of resetting HDMI audio to disabled (despite it sometimes saying disabled already) to get my card to interact with my monitor properly.

*EDIT*
Just thought I should add: I currently use an Nvidia GTS 250 in Win 7; it was a replacement for a burnt-out Nvidia 9800GTX+, which also exhibited this problem in Win Vista. I previously used an Nvidia 7950 GX2 without any problem on this same monitor, in both Windows 7 (during the interim waiting for the GTS 250) and Windows Vista. However, I believe that older card has no option whatsoever for HDMI audio -- thus it is disabled by default, which would create the same (proper) situation as the trick I'm using with my current card to get a correct 1920x1200 display.

I have a Viewsonic VX2435wm monitor that has a speaker that I never use (have sound going to 5.0 system) and has several other connections that I don't currently use for video display as well.
March 4, 2010 5:00:47 PM

Trean said:
I have this exact same problem. I just downloaded the latest drivers (March 2, 2010) from Nvidia, and now the 'trick' of resetting the HDMI audio doesn't do anything to reset the video, so it doesn't fix the problem at all.

Just wondering if you or anyone else found a fix to this problem. I will be rolling back to the previous Nvidia drivers and continuing to use the workaround trick of resetting HDMI audio to disabled (despite it sometimes saying disabled already) to get my card to interact with my monitor properly.


---
What I found was that I had to toggle the audio to "on", then, when it asks if I want to keep the settings, answer "no" -- that forces it back off again. Yes -- the switch is out of sync with the hardware. Nvidia can reproduce the problem in-house, but it affects a small enough number of people that it isn't a priority for them to fix.
