HDMI vs. DVI

May 25, 2009 3:23:12 AM

Which one has better image quality?


May 25, 2009 3:41:13 AM

I have a Samsung 24" high-def TV/monitor (1920 x 1200) and tried both: a DVI-to-HDMI cable and a DVI cable. I didn't notice any difference in picture quality. The only difference was that with the DVI -> HDMI cable, sound was enabled on the monitor speakers without using an audio cable.
May 25, 2009 3:47:12 AM

Technically, both should offer the same quality. However, the use of lower-quality parts can affect the output of either DVI or HDMI.

May 25, 2009 4:41:44 AM

DVI and HDMI are practically the same, HDMI just offers video+audio in one cable.
May 25, 2009 6:02:15 AM

It's the same video signal on both. HDMI adds sound, and the new HDMI spec will add Ethernet when it comes out. Unlike old analog cables, both of these are digital signals, which means the cheap cables are just as good as the expensive ones.
May 25, 2009 6:25:20 AM

^^ QFT

DVI and HDMI carry the same video signal.
HDMI can also carry sound (up to 7.1 lossless)
All signals are digital, so a $0.99 wire will give the same quality as a $49.99 one (I'm looking at you, Monster Cables)
May 25, 2009 6:45:26 AM

DVI can carry sound on some GPUs. If it's Nvidia, you'll need an SPDIF input on the GPU; if it's ATI, you can just use the DVI-to-HDMI adapter.
May 25, 2009 3:10:25 PM

Concur with jaguarskx on quality of parts.
I'd like to correct a misconception:
1. "Cheap cables are as good as expensive cables." Unlike my son, who sent me a $120 HDMI cable to check out an HDMI problem, I tend to use mid-priced cables. There is a difference.

2. "A 99-cent piece of wire will work because it's digital" - simply not true.

A digital signal is made up of square waves. The leading and lagging edges represent very high frequencies (well above the 6 MHz bandwidth of analog TV) and are easily distorted. The factors that affect this, and the cost, are primarily shielding, which affects inter-wire capacitance, and the RCL value of the signal cable. As you move up from 480i to 1080p, these factors become more critical.

To repeat: I'm not an advocate of overpriced, expensive HDMI/DVI cables, but I do shy away from those "el cheapo" ones.
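
To put rough numbers on that edge argument: a common rule of thumb says the analog bandwidth a cable must pass cleanly is about 0.35 divided by the rise time. The Python sketch below uses illustrative rise times, not figures from any DVI/HDMI spec:

# Rule-of-thumb sketch: the sharper the edge, the more analog bandwidth the
# cable has to carry cleanly. Compare with analog TV's ~6 MHz channel.
for rise_time_ns in (10.0, 1.0, 0.1):          # illustrative rise times only
    bandwidth_mhz = 0.35 / (rise_time_ns * 1e-9) / 1e6
    print(f"rise time {rise_time_ns:>4} ns -> ~{bandwidth_mhz:,.0f} MHz of bandwidth needed")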
May 25, 2009 3:35:15 PM

RetiredChief said:
Concur with jaguarskx on quality of parts.
I'd like to correct a misconception:
1. "Cheap cables are as good as expensive cables." Unlike my son, who sent me a $120 HDMI cable to check out an HDMI problem, I tend to use mid-priced cables. There is a difference.

2. "A 99-cent piece of wire will work because it's digital" - simply not true.

A digital signal is made up of square waves. The leading and lagging edges represent very high frequencies (well above the 6 MHz bandwidth of analog TV) and are easily distorted. The factors that affect this, and the cost, are primarily shielding, which affects inter-wire capacitance, and the RCL value of the signal cable. As you move up from 480i to 1080p, these factors become more critical.

To repeat: I'm not an advocate of overpriced, expensive HDMI/DVI cables, but I do shy away from those "el cheapo" ones.


You'd have to have a ridiculously crappy cable to get bit loss from noise and distortion, but you are correct. Really though, if you buy your standard DVI cable from the computer store for 10 dollars, you won't notice a difference from a gold-plated one for 60. If you are using several unshielded wires you found on an old phone to go pin to pin from the GPU to the monitor, you might have more than just a cabling problem in your life ;)

I think everyone knows there is a point where cheap is not ideal, even in digital. What really drives me nuts are places like Best Buy, where they tell people they shouldn't use the cables that come with the TV and should use the $150 Monster Cable HDMI or else the picture quality won't be as good.
May 25, 2009 6:29:34 PM

+1 to strangestranger. I agree. :lol: 


AKM880
May 25, 2009 9:50:06 PM

AKM880 said:
DVI can carry sound on some GPUs. If it's Nvidia, you'll need an SPDIF input on the GPU; if it's ATI, you can just use the DVI-to-HDMI adapter.


If you use SPDIF then the sound is not on the DVI cable, and if you convert DVI to an HDMI cable then once again the sound is not on the DVI cable. The reason you can convert DVI to HDMI is that it's the same video signal; HDMI is the upgraded DVI. HDMI added sound to DVI, and the next generation will include networking as well while using the current HDMI connector.

And yes, do not get the ghetto cable that cracks when you try to install it, but anything of decent build will suffice. I lost the link, but testing has been done and there was no difference between digital cables.
May 25, 2009 9:58:35 PM

gamerk316 said:
^^ QFT

DVI and HDMI carry the same video signal.
HDMI can also carry sound (up to 7.1 lossless)
All signals are digital, so a $0.99 wire will give the same quality as a $49.99 one (I'm looking at you, Monster Cables)


I heard that Monster Cable is notorious for ripping people off, and for suing any other company with the word "Monster" in their name...
May 25, 2009 10:32:17 PM

All of the cable companies rip people off. Wholesale on an $80-$120 cable is only $7-$20. I've done jobs where we made more money on the cables than on the 50-inch TV or 7-channel receiver.

FYI
We are required to sell at their prices or we lose our contract and have to pay as much as the rest of the world.
May 26, 2009 12:57:31 AM

Monster cables are mega garbage; it's almost like the Bose brand... they both perform about as well as their half-price counterparts.

Their cables are strong, though; just not worth the premium.

Anyway, HDMI is basically DVI with the ability to transfer sound, and the funny thing is there are audio DVI cables out there.
May 26, 2009 1:09:52 AM

The gold-plated ones are supposedly resistant to the oxidation that usually takes place over time with normal plain-metal cables. The HDMI cables I usually use are the $32 6' gold-plated Vizio HDMI 1.3b ones from Wal-Mart lol (that's a good price, isn't it?).
May 26, 2009 6:54:04 AM

I get these generic ones at the local computer shop :lol:. They are pretty cheap but work fine.
May 26, 2009 12:26:35 PM

RetiredChief said:
Concur with jaguarskx on quality of parts.
I'd like to correct a misconception:
1. "Cheap cables are as good as expensive cables." Unlike my son, who sent me a $120 HDMI cable to check out an HDMI problem, I tend to use mid-priced cables. There is a difference.

2. "A 99-cent piece of wire will work because it's digital" - simply not true.

A digital signal is made up of square waves. The leading and lagging edges represent very high frequencies (well above the 6 MHz bandwidth of analog TV) and are easily distorted. The factors that affect this, and the cost, are primarily shielding, which affects inter-wire capacitance, and the RCL value of the signal cable. As you move up from 480i to 1080p, these factors become more critical.

To repeat: I'm not an advocate of overpriced, expensive HDMI/DVI cables, but I do shy away from those "el cheapo" ones.


Not really. 0 or 1, high or low; those are your only choices.


[Image: a digital signal waveform - (1) low level, (2) high level, (3) rising edge, (4) falling edge. Source: Wikipedia.org]

Sure, signal corruption can occur, but suffice it to say it's not that likely, as there are only two possible states the signal can ever be in. An analog signal would have interference issues, though...


As for DVI audio, the audio is carried from the GPU using a custom DVI-HDMI converter; using a standard converter will not give the audio signal. I'm guessing NVIDIA/ATI use a slightly different DVI pin layout (an extra pin or two) to carry the audio signal when one of their converters is used. Standard DVI implementations do not carry an audio signal, however, and most devices wouldn't know what to do with an audio signal over DVI even if you managed to carry one.
May 26, 2009 3:59:46 PM

Like others said, they're basically the exact same quality. The only difference is that HDMI can carry an audio signal while DVI cannot. That's more for TVs, though, than for computer monitors.
May 26, 2009 6:40:17 PM

Hindesite said:
Like others said, they're basically the exact same quality. The only difference is that HDMI can carry an audio signal while DVI cannot. That's more for TVs, though, than for computer monitors.

Tell me about it. My monitor actually has an optical digital output (for audio over HDMI), but I found out after the fact that DD/DTS signals would not carry properly, so I needed an optical digital switch as well as an HDMI switch...
May 27, 2009 3:09:38 AM

I knew I would get some negative feedback; I'd planned on keeping my fingers away from the keyboard. He__, I'll probably catch more.

I am NOT saying cheap HDMI cables will not work, and I did not recommend the very expensive/overpriced cables. However, I stand by what I said: there is a difference, and as the old saying goes, you get what you pay for.

Gamerk316
You gave a very good example and explanation, and from a very good source. The waveform you illustrated would indeed be fine. The problem is that the distortion is not trapezoidal. The leading/lagging edges have a slope that looks like an RC charge/discharge curve, and your waveform does not show the "no man's land" (by the old definition, roughly the voltage between 1 V and 4.75 V for TTL logic). What happens is that, depending on the pulse width (PW) and the RCL value, the rising edge may not rise high enough to register a "1", nor fall low enough to register a "0". This occurs when the PW is <= about 4 time constants (TCs).
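
Here's a toy Python model of that rise-time point, reusing the 5 V TTL level and the 4.75 V "no man's land" ceiling above purely as illustrative numbers (a real TMDS link is low-voltage differential, so treat this as a sketch of the shape of the problem, not of actual HDMI/DVI levels):

import math

V_HIGH = 5.0   # assumed logic-high level (TTL-style example)
V_ONE = 4.75   # top of the "no man's land" quoted above; must be exceeded for a "1"

def peak_after(pulse_width_in_tc: float) -> float:
    """Voltage an RC-shaped rising edge reaches after the given number of time constants."""
    return V_HIGH * (1.0 - math.exp(-pulse_width_in_tc))

for tcs in (1, 2, 3, 4, 6):
    v = peak_after(tcs)
    verdict = "registers a 1" if v >= V_ONE else "stuck in no man's land"
    print(f"pulse width = {tcs} TC: peaks at {v:.2f} V -> {verdict}")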

Edeawillrole
"The gold-plated ones are supposedly resistant to oxidation..." Take out the word "supposedly" and you win the gold medal. The reason for using gold is NOT that it is a good conductor; in fact it is a third-rate conductor (Ag and Cu are better). But gold is very slow to oxidize, and metal oxides are insulators.
A side benefit of gold is that it is a "slippery" metal: it has a low coefficient of friction compared to, say, copper on copper, which allows for easier mating/demating on tight connections.
Tinned contacts are just as good for one to five years, depending on environmental conditions and whether both contacts are of the same metal or dissimilar metals; but at some point they will oxidize and create problems when working with high frequencies/square waves.

My computer knowledge may be average, but my electronics background is a mite on the high side.
May 27, 2009 7:06:04 AM

505090 said:
If you use SPDIF then the sound is not on the DVI cable, and if you convert DVI to an HDMI cable then once again the sound is not on the DVI cable. The reason you can convert DVI to HDMI is that it's the same video signal; HDMI is the upgraded DVI. HDMI added sound to DVI, and the next generation will include networking as well while using the current HDMI connector.


Hindesite said:
Like others said, they're basically the exact same quality. The only difference is that HDMI can carry an audio signal while DVI cannot. That's more for TVs, though, than for computer monitors.


Dudes, you can carry audio over DVI, and have been able to for many years. The SPDIF reference was not to an additional connector carrying the audio separately; it refers to the way ATi's and nV's chips handle the audio processing, with the ATI chips using a protected internal path, and the nVidia solutions taking an external SPDIF input (either 2-wire from the audio card header, coax, or Toslink) and then adding it to the TMDS signal output on the DVI connector, where it can then be sent either directly to an adapter or even carried on a DVI cable/connector and then to the adapter.

What matters is how people use the DVI standard, which has built-in flexibility for additional data channels, and ATi and nVidia exploit different techniques to send the audio through DVI: ATI uses the data channels, while nVidia inserts the audio signal between the video signals. Both methods are supported.

Seriously, have you guys missed the whole HD2K, 3K, and 4K generation of graphics cards, and nV's high-end G200 cards? This isn't new.

Speaking of not new.....

gamerk316 said:
As for DVI audio, the audio is carried from the GPU using a custom DVI-HDMI converter; using a standard converter will not give the audio signal.


I told you this last time: nVidia does NOT need a special converter, only ATi does, because of the way they send the signal, and you can even do it over a standard cable and a generic adapter. Have you even tried these solutions before commenting on them? You should research it before posting "I'm guessing" again, like you do so many times, as in the previous thread on the subject.

Anywhoo... the most important difference between DVI and HDMI is the higher bit-depth support in the HDMI 1.3 spec, with 'deep color' and its support for 12-bit and 16-bit per-channel colour, whereas DVI is limited by spec to 8- and 10-bit (10-bit being the low end of deep color), both of which are also supported by HDMI.

Of course, just as ATi and nV tweaked the standard DVI interface, you can send 12-bit per-channel colour over DVI, which is already done for dedicated hardware that supports it, like some of SONY's broadcast gear.
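
For scale, here's the counting behind those per-channel bit depths (simple arithmetic, nothing specific to either connector):

# Colours representable at each per-channel bit depth (3 RGB channels).
for bits in (8, 10, 12, 16):
    shades = 2 ** bits
    total = shades ** 3
    print(f"{bits:2d} bits/channel: {shades:>6,} shades per channel, {total:,} colours total")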
May 27, 2009 7:30:02 AM

RetiredChief said:
I knew I would get some negative feedback; I'd planned on keeping my fingers away from the keyboard. He__, I'll probably catch more...

Gamerk316


Don't worry about Gk316, he gets more wrong than he gets right, and using Wiki as a justifier doesn't help.

Trying to explain the analogue properties of signal transmission would be lost on him because "it's digital, it's all digital, end to end", as if resistance weren't a physical property with a distinctly analogue effect on a signal. The same goes for signal propagation over the air: regardless of signal type or source, it travels through an analogue medium of some kind, so distance plays a major role and can be affected by the quality of the medium over that distance.

Like you, I don't recommend high-quality cables for the majority of people, but if they're sending HDMI over distances greater than 25 ft it starts to make a difference, and over 50 ft you don't just get small artifacts like sparkles; you'll often see major issues or even no picture due to poor signal quality.
May 27, 2009 12:28:13 PM

Not all DVI has the same picture quality as HDMI. I think DVI-D and HDMI are comparable.
May 27, 2009 12:33:04 PM

TheGreatGrapeApe said:
Don't worry about Gk316, he gets more wrong than he gets right, and using Wiki as a justifier doesn't help.

Trying to explain the analogue properties of signal transmission would be lost on him because "it's digital, it's all digital, end to end", as if resistance weren't a physical property with a distinctly analogue effect on a signal. The same goes for signal propagation over the air: regardless of signal type or source, it travels through an analogue medium of some kind, so distance plays a major role and can be affected by the quality of the medium over that distance.

Like you, I don't recommend high-quality cables for the majority of people, but if they're sending HDMI over distances greater than 25 ft it starts to make a difference, and over 50 ft you don't just get small artifacts like sparkles; you'll often see major issues or even no picture due to poor signal quality.


What you are arguing is that the cable itself could have a noticeable effect on a data stream consisting of nothing but 0s and 1s, which is not the case. Sure, beyond a certain distance, thanks in part to signal strength, data will end up getting lost (resulting in blank or unchanged pixels); that's partly down to resistance, and I'm not arguing that point. I am arguing that, distance restrictions aside, a cheap HDMI cable will almost never corrupt a signal to the point of noticeable effect. Assuming every HDMI cable is made to the HDMI spec, there is minimal chance data corruption will occur on a standard 25 ft cable. (Hence why most cables are limited to just 25 ft.)

If that were the case, with the setup I have (going through two separate HDMI switches, with signal boosters, obviously), I would be seeing/hearing plenty of artifacts, but that simply hasn't occurred.
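
To put the "it's just bits" point in perspective, here's a rough Python sketch; the bit-error rates are assumed purely for illustration, not measurements of any cable:

# 1080p60 over TMDS: 148.5 MHz pixel clock x 10 bits/symbol x 3 data channels.
BITS_PER_SECOND = 148.5e6 * 10 * 3

for assumed_ber in (1e-9, 1e-12, 1e-15):   # hypothetical bit-error rates
    errors_per_hour = BITS_PER_SECOND * assumed_ber * 3600
    print(f"assumed BER {assumed_ber:.0e}: ~{errors_per_hour:,.2f} flipped bits per hour")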
May 27, 2009 4:41:40 PM

TheGreatGrapeApe said:
Dudes, you can carry audio over DVI, and have been able to for many years. The SPDIF reference was not to an additional connector carrying the audio separately...
Seriously, have you guys missed the whole HD2K, 3K, and 4K generation of graphics cards, and nV's high-end G200 cards? This isn't new.

Speaking of not new.....


I could be confused, seeing as I haven't used a monitor with integrated speakers since my 486, but are you referring to those cards that come with a DVI-to-HDMI adapter?
May 27, 2009 7:06:51 PM

505090 said:
I could be confused, seeing as I haven't used a monitor with integrated speakers since my 486, but are you referring to those cards that come with a DVI-to-HDMI adapter?

Nowadays, monitors with built-in speakers are available. Many BenQ monitors come with speakers - 2 x 1 W each.
The BenQ E2200HDA 21.5-inch Full HD LCD monitor is one of them.
May 27, 2009 7:14:16 PM

505090 said:
I could be confused, seeing as I haven't used a monitor with integrated speakers since my 486, but are you referring to those cards that come with a DVI-to-HDMI adapter?


Actually it's specific cards.

Many cards come with a simple DVI-HDMI adapter for the video; however, some video cards support audio over DVI to an HDMI adapter. All the HD2xxx, 3K, and HD4K cards support audio over DVI-HDMI via the internal protected path, but require a specialized adapter. nVidia has some cards that offer an SPDIF input (of the three types I mentioned above) to blend the signals to be sent over HDMI; older models (GF9 generation) require a specific adapter, while newer G2xx-based solutions do it differently, inserting it within the signals.

This is more for TVs and AV systems than for connecting to computer monitors, although, as mentioned above, there are monitors now (and more coming) that will support 2.1 audio through HDMI.

If you look at the GTX285 picture here, you can clearly see the SPDIF header for the 2-pin/wire audio card header as an input right beside the power connectors:

http://images.bit-tech.net/content_images/2009/01/bfg-t...

Zotac and a few others offered early GF9xxx series cards with a similar solution that used coax-SPDIF & Toslink SPDIF inputs.
May 27, 2009 7:24:32 PM

gamerk316 said:
What you are arguing is that the cable itself could have a noticeable effect on a data stream consisting of nothing but 0s and 1s, which is not the case. Sure, beyond a certain distance, thanks in part to signal strength, data will end up getting lost (resulting in blank or unchanged pixels); that's partly down to resistance, and I'm not arguing that point. I am arguing that, distance restrictions aside, a cheap HDMI cable will almost never corrupt a signal to the point of noticeable effect. Assuming every HDMI cable is made to the HDMI spec, there is minimal chance data corruption will occur on a standard 25 ft cable. (Hence why most cables are limited to just 25 ft.)


Understand, however, that what you are arguing drops the reason RC, SS (and I) mention quality: specifically, higher-stress situations like longer distances or increased frequency (1080p at 60 fps), where shielding, wire material, and diameter matter for carrying the signal cleanly over those distances. We're not talking about sub-15 ft cables carrying low-frequency, easily corrected signals.

Like both SS and RC say, in general the medium-quality stuff will be more than fine, but don't assume the el-cheapos will do for every situation.
But the 3-6 ft Monster Cables really have little to no place for 99.9% of users; I'll agree on that.
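
Here's a rough sketch of why length (and loss per foot, which is where cable quality comes in) matters; every number below is assumed for illustration and not taken from any cable datasheet or the HDMI spec:

# Attenuation in dB grows roughly linearly with cable length, so the received
# signal swing shrinks exponentially with distance. All values are hypothetical.
ASSUMED_LOSS_DB_PER_FT = 0.35    # assumed loss at the signal's dominant frequency
LAUNCH_SWING_MV = 1000           # assumed transmitted differential swing
MIN_USABLE_SWING_MV = 150        # assumed receiver sensitivity floor

for length_ft in (6, 15, 25, 50):
    loss_db = ASSUMED_LOSS_DB_PER_FT * length_ft
    received_mv = LAUNCH_SWING_MV * 10 ** (-loss_db / 20)
    verdict = "fine" if received_mv >= MIN_USABLE_SWING_MV else "marginal or failing"
    print(f"{length_ft:2d} ft: {loss_db:4.1f} dB loss, ~{received_mv:4.0f} mV left -> {verdict}")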
May 27, 2009 7:42:13 PM

Goodie, an agreement! :D 

And for the record, audio over DVI (even when passed on to HDMI) is not part of the official DVI spec. And the reason you're seeing speakers on monitors is thanks in part to HDMI.
May 27, 2009 8:27:42 PM

TheGreatGrapeApe said:
Actually it's specific cards.

Many cards come with a simple DVI-HDMI adapter for the video; however, some video cards support audio over DVI to an HDMI adapter. All the HD2xxx, 3K, and HD4K cards support audio over DVI-HDMI via the internal protected path, but require a specialized adapter. nVidia has some cards that offer an SPDIF input (of the three types I mentioned above) to blend the signals to be sent over HDMI; older models (GF9 generation) require a specific adapter, while newer G2xx-based solutions do it differently, inserting it within the signals.

This is more for TVs and AV systems than for connecting to computer monitors, although, as mentioned above, there are monitors now (and more coming) that will support 2.1 audio through HDMI.

If you look at the GTX285 picture here, you can clearly see the SPDIF header for the 2-pin/wire audio card header as an input right beside the power connectors:

http://images.bit-tech.net/content_images/2009/01/bfg-t...

Zotac and a few others offered early GF9xxx series cards with a similar solution that used coax-SPDIF & Toslink SPDIF inputs.


Those cards are using a proprietary variation of DVI which is then converted to HDMI, HDMI being the actual cable you are using. Sound is not part of the DVI standard and does not go over a DVI cable (or did I miss something?). Yes, they are using the connector on the card, but that is not the DVI standard. Just because I can run video to my TV using a piece of speaker wire does not make the piece of speaker wire an RCA cable.
May 27, 2009 9:56:53 PM

The thing that seems to be the barrier here is that no one is saying audio over DVI is part of the official spec (just as many things weren't part of HDMI until they became v1.1, 1.2, 1.3, etc.); however, you guys said:

505090:
"If you use SPDIF then the sound is not on the DVI cable, and if you convert DVI to an HDMI cable then once again the sound is not on the DVI cable."

Which is plain WRONG. No one said it's wrong because of the spec; we said it's wrong, and that they USE the spec in their own way. Both data pins and interval data are supported by the spec, so how you exploit that is another story. Of course audio is not specifically part of the spec, but that's the same as saying 7.1 DTS-HD Master is not part of the PCIe spec, yet you could still send it over the interface.

And G, your error was similar, in the statement regarding nV's updated solution we've discussed before requiring the special adapters. I didn't disagree that ATi's is adapter-specific, and even nV's old implementation (really the AIBs' implementations) required the specific hardware; however, that's not the universal case, nor does it change the fact that AUDIO can be carried over the DVI port/cable/etc.

That's the point: no one is saying it's part of the spec, which actually had nothing to do with the original discussion of audio or DVI in this or other threads. They are just saying it is possible, often in response to non sequiturs about what can be done versus what was accounted for in the original spec.

That would be like saying general-purpose computing cannot be done on a graphics card because it is not accounted for in either the original OpenGL or DirectX specs. The idea of retasking hardware for non-traditional roles not provided for in the original spec is what gives us great solutions like audio over DVI for specialized applications.

Had either of you stuck to just the spec, that wouldn't have caused any issue; but saying it can't be done ignores that it IS already being done, and done rather well too, on something never intended to carry audio, let alone 7.1 HD audio.
May 27, 2009 11:41:28 PM

Yes, the SPDIF comment was incorrect; I assumed a digital link, either Toslink or coaxial, was being used, but that is apparently not what they meant. My only point is that if I convert the DVI port on my computer to HDMI, then use an HDMI cable to go to my monitor where it plugs into an HDMI port, I do not consider that to be a DVI connection. Just me, though.
May 29, 2009 6:42:02 PM

I posted this because I got a new LCD which was smaller than my last one, and I noticed that the graphics quality in games seemed more blocky, with the textures scrunched together in noticeable blocks. So I popped in the HDMI cable used with the old LCD and the blocky texture appearance is just gone. It's like night and day: HDMI > DVI.
May 29, 2009 7:55:45 PM

http://en.wikipedia.org/wiki/HDMI

HDMI is running a 340 MHz digital signal, and it's a consumer connector (it needs to withstand minor consumer "oops" moments, e.g. you're positioning the TV and bump the cable into the wall with a really tight radius, stressing the connector and the cable).

That's not a trivial design, although given the manufacturing volume you can make a high-quality connector in the $1-$5 range.
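
Back-of-the-envelope numbers behind that 340 MHz figure (TMDS, which both DVI and HDMI use, puts 10 bits on the wire for every 8-bit colour value):

PIXEL_CLOCK_1080P60_MHZ = 148.5   # standard timing for 1920x1080 at 60 Hz
HDMI_1_3_MAX_TMDS_MHZ = 340.0     # single-link ceiling mentioned above

serial_gbps_per_channel = PIXEL_CLOCK_1080P60_MHZ * 10 / 1000
headroom = HDMI_1_3_MAX_TMDS_MHZ / PIXEL_CLOCK_1080P60_MHZ

print(f"1080p60 pixel clock: {PIXEL_CLOCK_1080P60_MHZ} MHz")
print(f"serial rate per TMDS data channel: {serial_gbps_per_channel:.3f} Gbit/s")
print(f"HDMI 1.3 headroom over 1080p60: about {headroom:.1f}x (room for deep color / higher refresh)")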

Yes, Monster has mark-ups on their mark-ups. That they sue people for using the word 'monster' in a company name adds insult to injury.

http://gizmodo.com/gadgets/hdmi-cable-battlemodo/the-tr...

There's an article about them; they did it right, with signal analyzers etc.
May 31, 2009 1:17:22 PM

Hi, I posted a similar question a few weeks ago. In theory, DVI = HDMI video-wise, with HDMI carrying audio as well.

I tested this with a graphics card I bought recently (ATI Radeon Sapphire 4350):

1. DVI port connected to an HDMI port on my TV via a DVI->HDMI cable.

2. HDMI port connected to an HDMI port on my TV via an HDMI cable.

3. Both cables are from the same supplier and are gold-plated.

4. I switch channels to try to notice a difference.

I get a better picture with the DVI output and the difference is noticeable.

I tried playing around with the resolutions again yesterday, but the HDMI picture does not improve. Any ideas?