
DVI to HDMI BIOS beep

Tags:
  • Graphics Cards
  • Multimedia
  • DVI
  • HDMI
  • Graphics
Last response: in Graphics & Displays
June 22, 2011 10:15:54 PM

Hi,
I have bought an old multimedia pc with an ATI Radeon x550 in it.
It has a DVI out and I want to connect my HDTV to it using a DVI-HDMI cable.
Trouble is that with the two connected I can't even get into the BIOS; I just get an infinite rapid beep.
I used the cable before on another PC and it works fine there. The multimedia PC also works fine connected to a monitor via DVI.
As the only other output on the video card is an SVGA port, I can't try the monitor and the TV at the same time.
Any ideas as to how to solve this?


June 22, 2011 10:42:12 PM

HDMI carries a digital signal only. Unfortunately, not all DVI output ports are actually digital: DVI-A is analog-only, DVI-D is digital-only, and DVI-I carries both. It sounds as though either the DVI out of your card or the cable itself is analog-only, in which case a DVI-to-HDMI converter won't work.

Check out the different types of DVI port connectors at Wikipedia: http://en.wikipedia.org/wiki/Digital_Visual_Interface#C...

The only sure-fire fix I can think of would be to replace the card or cable (whichever is actually analog) with a single- or dual-link DVI-D one. (A card with an HDMI output is another option.) What I can't answer is why a faulty video connection would cause BIOS beeps during system start-up. That is, unless the pin layout on the card is somehow so different from that of the cable and/or converter that it's creating a short.
June 23, 2011 3:17:46 PM

Oh, I remember that back in the day when the x550 was new, it had HDCP handshake issues with some displays. You might be experiencing the same issue, but I'm not sure it would just beep like that; usually the monitor or TV would claim there's a disconnected cable or no signal. The card is listed as *DVI 1.0 and HDMI compliant and HDCP ready*, and it came out in 2005, I think. It shouldn't be a resolution or refresh-rate problem, though, because that wouldn't cause trouble until the OS loaded.

@RazberyBandit
The card outputs digital since it's HDMI compliant and HDCP ready, and if an adapter or cable is DVI-HDMI, I would imagine it has to be digital too, as it would be a useless creation otherwise. If my logic's off, let me know, no hard feelings. The only time you'd have analog DVI problems, I thought, was with sources whose DVI ports only put out a DVI-A signal, as used to be the case with many boards back in the day that shipped with DVI-A-only ports.
June 23, 2011 6:56:06 PM

A useless creation indeed. And while it seems illogical to question the converter itself, you never really know with so many products out in the wild. When HDMI converters were first introduced, some (far too many) products simply didn't work. They were just scam products promising to allow old tech to work with new tech, and people fell for it.

However unlikely, if the DVI end of the OP's converter has a DVI-A connector, it is one of those scam products. Actually, the connector type itself doesn't really matter; what matters is how the converter is wired internally.
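One way to sanity-check the digital side of the link, if you can get a desktop to boot with the TV attached (not from the thread, just a sketch): read the display's EDID and check whether it declares a digital input. In the 128-byte EDID base block, byte 20 is the Video Input Definition, and bit 7 set means digital. On Linux the raw EDID is often readable from `/sys/class/drm/<connector>/edid`; the path and this helper are illustrative assumptions, not anything the posters used.

```python
# Sketch: check whether an EDID base block declares a digital video input.
# Assumes you already have the raw 128-byte EDID (e.g. dumped from
# /sys/class/drm/<connector>/edid on Linux).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def is_digital_input(edid: bytes) -> bool:
    """Return True if the EDID base block declares a digital video input."""
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID base block")
    # All 128 bytes of the base block must sum to 0 mod 256.
    if sum(edid[:128]) % 256 != 0:
        raise ValueError("EDID checksum mismatch")
    # Byte 20 (Video Input Definition): bit 7 = 1 means digital input.
    return bool(edid[20] & 0x80)
```

A digital declaration here wouldn't prove the card's DVI wiring is digital, but an analog-only report from the TV over this cable would point straight at the adapter or cable.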

Yes, I'm being a bit speculative, but with good reason.
June 23, 2011 8:10:08 PM

I agree it only matters how it's wired. And fair enough, it wouldn't be the first time somebody made a piece of sh*t to sell that just didn't work, ever. I think the best bet would be to throw down $50 or less on a low-end 4xxx, 5xxx, or 6xxx card that has an HDMI port. Some even have passive cooling. Then again, you said you bought an old multimedia PC, so I'm not sure what board is in the machine.