Is HDMI worth it?

manosteel

Distinguished
Apr 22, 2008
6
0
18,510
Hi all,

I'm building a new gaming system and am planning on purchasing an 8800GTS G92 512MB GPU. Now, I've always been intrigued by HDMI connectivity and am interested in going with HDMI over DVI. However, this card has only dual DVI and TV-out.

My question is: is it worth the money? I want an impressive-looking image and will be using either a 22" or 24" widescreen LCD. That said, what type of card and monitor would you suggest for a mid-range budget?

Thanks!
 

boonality

Distinguished
Mar 8, 2008
1,183
0
19,310
HDMI is an HD connection to an HD TV. Don't even worry about it if you're using an LCD computer monitor. Now, if you eventually want to get a HUGE LCD, then look at "dual-link DVI".
 
The HD 3870X2 comes with a DVI to HDMI adapter. I wonder if you can just buy an adapter like that and use it with any card, like the 8800GTS or 9800GTX.

Personally, I'd rather have the DVI output of the card linked to the DVI input of the monitor, with no adapters in between. The fewer links in the chain, the less chance of introducing noise.

You should probably compare screenshots made with an 8800GTS and an HD 3870X2 and then pick the one that looks best to you. There are people who swear ATI cards make better looking images, but also people who say nVidia is better at it.
 

B-Unit

Distinguished
Oct 13, 2006
1,837
1
19,810
HDMI is just an extension of the DVI protocol, so you won't see any difference in picture. What you do gain with HDMI is integrated sound and, I think, some control protocols (can't remember beyond sound)
 
G

Guest

Guest




No you don't. You can use a DVI to HDMI adapter.

Bonus points for giving bad advice in a short and 'certain' statement.



If you're just connecting to a monitor, you will not see a difference between DVI and HDMI. Manufacturers are putting HDMI on their cards so people can connect their computers to their HDTVs. But, as stated already, you can simply use a DVI to HDMI adapter to do this as well; however, sound will not transfer, so you will need to run that from your computer to your stereo/TV separately.
 
HDMI is just DVI w/ 5.1 audio in a smaller connector. Plus, I've heard of people having issues using HDMI on some monitors and such. My buddy has issues with Vista 64-bit and his HDMI monitor. I have to unplug and re-plug my video into my TV with my DVI to HDMI cable if I want to play at 42".

Not really worth it; there are quirky issues. If you are making an HTPC, then worry about it.
 

vertigo_2000

Distinguished
Feb 26, 2007
370
0
18,780
If you're just connecting to a monitor, you will not see a difference between DVI and HDMI. Manufacturers are putting HDMI on their cards so people can connect their computers to their HDTVs. But, as stated already, you can simply use a DVI to HDMI adapter to do this as well; however, sound will not transfer, so you will need to run that from your computer to your stereo/TV separately.

Exactly... furthermore, it's just logical. I don't know of any video cards that have integrated sound as well; therefore, logically, any HDMI/DVI configuration that you have hooked up to your video card is only going to output video. Audio is a whole other story.

On this note, let me pose one question to those with more knowledge than myself (which is probably most of you). Is there a difference, picture-wise, between using HDMI vs component (red, green, blue)?

I have heard mixed reports. Sorry to hijack the thread.

 

rubix_1011

Contributing Writer
Moderator
HDMI vs. component is theoretically the same quality. HDMI is digital the entire way; component converts the signal to analog and back to digital. I would challenge anyone to connect 2 identical TVs and DVD players and show us the difference.

Didn't Tom's do a review on this a while back?

HDMI is just one nice, neat cable vs. an octopus of wires.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
I actually use the DVI to HDMI converter that came with my HD 3870 on my 8800GTX.
 

manosteel

Distinguished
Apr 22, 2008
6
0
18,510
I'm interested in the option of plugging my 46" HDTV into my computer. Does anyone know of a card that could handle this connection as well as DVI?
 

cmmcnamara

Distinguished
Nov 28, 2007
163
0
18,680
Yea, the whole S-Video thing is what signaled my wtf :D. S-Video can handle a maximum resolution of 720x480, right?

If you want to hook your HDTV up to your computer (I'm assuming you mean a signal from your computer to your TV, not the other way around), you'd just need a card that has DVI output, which is pretty much anything now. Some HDTVs can handle VGA or DVI inputs, but if your TV doesn't have a DVI input (I don't recommend using a VGA input due to the reduced quality), you can get a DVI to HDMI adapter that will accept the DVI output from your computer and feed it to an HDMI input on your TV. Of course, if the video card you end up getting has an HDMI port on it like the 9800 mentioned above, you'll just need a straight HDMI to HDMI cable.
 

sciggy

Distinguished
Apr 15, 2008
318
0
18,780
Most people have said it already, but I'll reiterate. DVI is the same thing as HDMI; HDMI is just a DVI connection that also carries audio. Same resolution and everything is carried over. HDMI is required to run HD-DVD and Blu-ray players with their HD audio tracks. Other than that, HDMI does give some control (such as turning a TV on when the DVD player turns on).

Someone also asked about component (red/green/blue) vs. HDMI. Not much of a difference; check out this article: http://forum.ecoustics.com/bbs/messages/34579/122868.html The only real difference is whether the signal stays digital or gets converted to analog. Some say digital is better, but it remains to be seen.
 

Ragnorok64

Distinguished
Mar 4, 2008
75
0
18,630


I've got my old AGP Radeon 9600 Pro piping out to both my 40" HD LCD (through a DVI to HDMI cable) and my old LCD monitor through its analog VGA out. It's running a desktop of like 3600x1080. If my old system can handle that, I'm pretty sure most of you guys would be fine connecting to an HDTV.

I plan on building a new comp later this year.
 

sciggy

Distinguished
Apr 15, 2008
318
0
18,780
One more thing: I have an EVGA 7900 GT KO and I'm running a 42" 1080p LCD with it. I have a DVI to HDMI cable (one continuous piece, not an adapter) running from the computer to the TV. It works flawlessly at the native resolution of the TV (1920x1080).
 

Ragnorok64

Distinguished
Mar 4, 2008
75
0
18,630

One difference that has nothing to do with quality, though, is HDCP. Basically, if your source and display aren't HDCP compliant, you won't be able to get full functionality out of things like Blu-ray discs.

I ran headfirst into this last year. I got a PS3, and until I could take the plunge into an HDTV I had to use my old 20" LCD to get my games in at least 720p. I couldn't just use an HDMI to DVI cable because of the lack of HDCP compliance on my monitor. I had to use a converter box: the PS3 plugged into the box through component, then the box into my monitor. Thing is, with that setup I couldn't watch Blu-rays or DVDs on my PS3, since without HDCP it would only output 480 for movies (games still worked at 720), and my monitor couldn't handle a 480 signal (I guess it's too small a resolution or something). I was OK with it for the time, since I'm a gamer that doesn't watch too many movies. I was content to bide my time and eventually save up for an HDTV.

That said, when you have the option available to you, go with HDMI as opposed to component whenever you can. And check if your monitor is HDCP compliant if you think there's even a chance of installing a Blu-ray drive in your machine at some point.

 

MadHacker

Distinguished
May 20, 2006
542
0
18,980
I have used a DVI to HDMI cable, same as sciggy.
I have used it on my 9600GT and my 7600GT.
Both worked flawlessly. I haven't tried Vista 64 so I don't know of any problems there.

It is the same signal as DVI... just a neater cable... and it plugs into any available DVI port on your video card.

 

TeraMedia

Distinguished
Jan 26, 2006
904
1
18,990
mano,

If you want to watch Blu-ray or other digitally protected content (and decide not to install AnyDVD), you may find at a future point in time that some movies won't play over a DVI connection, but will play over an HDMI (or HDCP-enabled DVI out to HDMI in) connection. AFAIK, the DVI spec doesn't include HDCP. The most recent video cards all support HDCP. Only some monitors do, however.

Otherwise, the video portion of the two digital connector signals is identical. They represent the video image in exactly the same way, so there is no signal conversion required from one to the other; it is strictly a physical shape difference. The HDMI spec sends the audio and control signals in the margins of the video frame; the DVI spec does not.
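If you're wondering how audio fits in those margins, here's some rough arithmetic as a plausibility check. This is a sketch only: the timing totals are the standard 1080p60 figures, and the real HDMI packet encoding is more involved than this.

    # Rough plausibility check: how much of a 1080p60 frame is blanking,
    # versus the raw bit rate of the heaviest PCM audio HDMI carries.
    # Timing totals (2200x1125) are the standard 1080p60 figures.
    h_active, v_active = 1920, 1080
    h_total, v_total = 2200, 1125
    blank_frac = 1 - (h_active * v_active) / (h_total * v_total)
    print(f"blanking is ~{blank_frac:.0%} of each frame")   # ~16%

    # Worst-case PCM audio: 8 channels x 192 kHz x 24-bit samples.
    audio_mbps = 8 * 192000 * 24 / 1e6
    print(f"max PCM audio is ~{audio_mbps:.1f} Mb/s")       # ~36.9 Mb/s, tiny next to video

So even the heaviest audio load is a small fraction of what the blanking intervals can carry.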

You don't need dual-link DVI for 1080p; single-link is completely sufficient unless you're looking to go to "quad" HD (3840x2160) in the future, or want to connect to a 30" high-res LCD monitor. Otherwise, you don't need to worry about this.
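The arithmetic behind that, if you want to check it (assuming the standard 1080p60 timing totals and the 165 MHz single-link TMDS limit):

    # Does 1080p60 fit within single-link DVI's 165 MHz pixel clock?
    h_total, v_total, refresh_hz = 2200, 1125, 60   # standard 1080p60 totals
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    print(f"1080p60 needs ~{pixel_clock_mhz:.1f} MHz")  # ~148.5 MHz, under the 165 MHz limit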

You don't lose any signal quality from using one of those DVI-to-HDMI converters, because the signal is digital, not analog. As long as the bits get there intact, that's good enough.

Some GPU cards do output audio over the DVI out (with a DVI-to-HDMI adapter). For example, I use an HD2600 XT, and it does this. No audio is sent over a straight DVI connection, but it is with the adapter. This is only truly useful, however, if you route through a receiver that can separate out the audio and forward the video to a monitor or TV.

Component is analog. It will always be lower quality than a pure-digital, end-to-end, native-resolution connection. A typical monitor's VGA connection is analog too, and is basically a superset of the signals sent over component. Component is typically Y/Pb/Pr or Y/Cb/Cr, whereas VGA is R/G/B/H/V. The "H" (horizontal) and "V" (vertical) help the monitor align the incoming analog data more precisely with the top and left of the screen. Because of this, VGA is a better signal than component.

DVI / HDMI are better than VGA (and hence than component), because the digital signal doesn't have to be converted to an analog voltage level, sent over a noisy wire next to other interfering and cross-talking wires, measured (along with the noise!) at the receiving end, and transformed back into a digital number before being rendered on a screen.

I have tried both VGA and DVI-HDMI with my 46" 1080p Sony, and I can absolutely see a difference. With the VGA, the individual pixels seem to scintillate a bit, which makes sense given the noise being injected into the system. An 8-bit LCD can represent 256 different intensity levels per primary color. With a 1V p-p analog signal range, that means that as little as 3.9 mV of noise will cause a pixel to change intensity by one unit. A high-quality cable combined with good signal processing at both ends can reduce this to the point of a non-issue most of the time, but frankly I'd stick with digital to avoid it all of the time.
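For anyone checking my 3.9 mV figure, it's just the analog swing divided by the number of intensity steps (the 1 V p-p swing is a typical figure, so treat it as an assumption):

    # One intensity step on an 8-bit panel, given a 1 V p-p analog swing.
    levels = 2 ** 8                  # 256 intensity levels per color
    swing_v = 1.0                    # assumed 1 V peak-to-peak signal range
    step_mv = swing_v / levels * 1000
    print(f"one step = {step_mv:.1f} mV")  # ~3.9 mV of noise moves a pixel one level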

S-Video is not capable of anything better than 480i in the US, because it is based on the NTSC standard which is 480i. You can set your resolution as high as 1024x768, but most of those pixels will be blended together before they leave your GPU's S-Video port. I used S-Video for a while with my HTPC, before I bought an LCD HDTV and got rid of my old cathode-ray SDTV.
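To put a number on that blending (480 visible lines is the NTSC constraint; the desktop size is just an example):

    # How hard an NTSC output squeezes a desktop image vertically.
    desktop_lines = 768              # e.g. a 1024x768 desktop
    ntsc_lines = 480                 # visible NTSC lines
    print(f"each output line blends ~{desktop_lines / ntsc_lines:.1f} desktop lines")  # ~1.6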

In summary:
- HDCP-compliant end-to-end if you want to watch Blu-Ray.
- Dual-link DVI only if you have > 1080p / 1920x1200 resolution.
- Audio over HDMI is available from some GPU cards. Check the ATi line.
- DVI=HDMI* > VGA > Component > S-Video > Composite

* HDMI 1.3 supports 48-bit color depth, though your GPU and monitor likely do not. DVI can be dual-link, which supports resolutions greater than 1080p; HDMI does not.
 

leon2006

Distinguished
For a big-screen TV, the cleaner solution is DVI to HDMI, which is what I have in my setup.

I'm using a 70-inch HDTV as my display and it works well. Everything runs at 1080p for PC and movie use. I watch Blu-ray and HD-DVD in 1080p.

My video card (8800GT OC 512) has an analog output with a dongle that breaks out into component video. In this configuration I get the same 1080p for PC use and HD movies. The difference is that I have to use more cables for the analog approach. I tried this, and the option is confirmed to work.

The TV's audio is really not that good. My audio is connected through an optical link going to my 7.1 receiver. I never use my TV audio.

A DVI to HDMI converter is available at any computer or audio/video store. It's a small connector, just like the old DVI to VGA converter. It's no big deal to acquire, and it comes standard with some video cards. Two weeks ago I tried a 3870 video card and it came with a DVI to HDMI converter.

On the 3870 card with an HDMI cable, video and audio go to the TV (one cable). I tried this option and it also works.

With DVI to HDMI, I use a single cable going to the TV.

If you use a computer monitor (LCD), it's not an issue, since DVI to DVI will be your video link.

If you use a bigger screen, say a 37-inch or larger HDTV, DVI to HDMI is the better solution. Big-screen HDTVs have more than one HDMI input, 1 or 2 DVI inputs, plus multiple traditional analog audio/video links.

Some LCD HDTVs don't provide high resolution over the DVI input; the new ones do. This was mainly due to the paranoia of the big RIAA members over piracy control of HD content; on those sets, HDCP works only on the HDMI link.

I'm running Vista Ultimate 64. My configuration is below... I did return the 3870 card after 3 days of testing. The 8800GT is a better card.
 

gimbal720

Distinguished
Feb 8, 2009
2
0
18,510
Not to put some SPAM on the stack, but I'm just looking for somewhere to mention my own experiences with some of this; maybe it'll be useful sometime.

I have a Toshiba Satellite X205 laptop with an Nvidia graphics card of some kind. I presume it would be an NVIDIA GeForce 8700M GT, judging by current specs on the model. The laptop sports an HDMI output, which the Nvidia controls and the Realtek audio controls can make semi-full use of.

Recently, I picked up a Harman Kardon AVR-254 receiver, hoping I'd be able to use it as an audio/video switch supporting connections, e.g., from (A) my laptop and (B) my desktop-slash-home-network-gateway, ultimately to (P) some good speakers.

Previously, I'd had a Yamaha receiver, which I'd picked up about a week earlier. I returned it because, when I plugged the receiver in between the laptop and the monitor -- a 24" HP w2408h, namely -- the receiver limited the available resolution on the monitor. With a direct connection from the laptop to the monitor, I was able to get a convenient 1920x1200. With the Yamaha receiver, I could get only a native 1280x1024 or so, with up-scaling to other resolutions.

I did some research, and found out that the AVR-254 from Harman Kardon supports HDMI 1.3a, whereas the Yamaha receiver supported HDMI 1.2. I hoped that would be enough to get the full resolution out of my graphics card and monitor, with the receiver intervening.

So, I took another 5-hour road trip to get this stuff taken care of with such brick-and-mortar resources as are available in Missouri.

Now that I have the AVR-254 plugged in, I find that it too cannot support the full 1920x1200 resolution of the direct laptop-to-monitor HDMI connection.

I'd read that the HDMI 1.3 specs were supposed to support a wider range of resolutions, specifically for supporting computer graphics. I guess it's not wide enough.

So, I guess I'd caution anyone considering putting a stereo receiver between an HDMI-capable graphics card and an HDMI-capable monitor, when those video components would support a native 1920x1200 resolution with a direct connection.

As far as monitors go, though, when I have my laptop connected directly to my monitor, I can get the full 1920x1200 resolution, and it looks nice. The HP w2408h wasn't too pricey, and it has a nice base on it. It has an HDMI input and a VGA input, and lets you switch between the two. The speakers on the monitor are serviceable, but I'd recommend better speakers with it -- and good luck, at that.

For audio, with that monitor, I'd been using a pair of one-off Logitech speakers plugged into the laptop. The thinness of the sound with those prompted me to start piecing together a full stereo system that I thought I could use instead, with all the appropriate audio/video cable routing. As indicated above, that project is not working out all that well.


It's off topic, sure, but I thought I'd throw this out; maybe someone will know, and I'll be darned if I know where else I could ask. If anyone knows of a stereo receiver that lets you get full 1920x1200 resolution with HDMI in from a graphics card and HDMI out to a monitor -- when the receiver is intervening between graphics card and monitor, basically -- I'd sure as sin like to know about it. I'm about spent for it, with all this driving and product-returning business. Maybe it just isn't possible with current audio/video component models....