Quality Connector or Standard
I'm getting a new monitor and I was wondering if it's worth upgrading the standard connection piece, a tin DVI-to-HDMI connector, to a gold-plated one that costs 14 dollars. Or will the standard one do the same thing as the gold-plated one?
Not worth it at all. First, the signal is digital, not analog, so it will either work or not (no "bad signal quality" crap possible). Second, the only reason people started plating connectors in gold is that gold doesn't oxidize (rust). If your cable ever gets oxidized enough to stop working, your monitor or video card will probably be in worse shape. Anyway, this kind of cable can be replaced for less than $5 on monoprice (even the gold-plated ones).
Zenthar's answer has a major error in it.
First of all, the fact that it's a digital signal doesn't mean it will either work or not work with no possibility of a bad signal.
A digital signal is still carried along a medium, and the medium can still degrade part of the signal. The difference is that you don't get ghosting, you get sparkling instead, because individual pixels drop out rather than the overall signal/picture degrading, and that results in a crap image.
Also, you guys obviously don't understand that digital signals are carried along mediums (copper, gold, glass, air), and it is their analogue interaction with these mediums that causes issues.
As for the OP's question: at that short a length the difference is tiny, so the impact of connector quality will not be measurable.
And as mentioned oxidation would be the biggest concern.
However that's not the same as the whole misunderstanding above about digital vs analogue.
Just trying to keep it simple (maybe a bit too much). In the end I know it's still an electromagnetic signal going through wires and that "digital" transmission doesn't really exist at the physical level; it's just the way the information is encoded into the signal. The same thing applies to radio waves, whether it's WiFi, AM radio, FM radio, XM radio, ... That said, like in any communication, noise can happen. I think this is even the main difference between various versions of some cabling like CAT5-6-7: lower noise to achieve higher signal frequency and therefore higher speeds.
I remember seeing an article with signal analysis for HDMI cables comparing cheap ones to pricey ones (*cough* Monster *cough*), and the difference only started being noticeable over 10 m (about 33 ft).
In response to the Ape. Digital signals do have the advantage of having a level of noise immunity. Analog signals have next to zero tolerance for noise or overall signal level changes. However digital signals have to be seriously compromised before the result is evident. As long as the levels for 0 and 1 aren't pushed outside of their respective tolerances, the receiving end (in this case your monitor) will translate the levels correctly. Obviously, a seriously compromised signal path will result in a very distorted image or a total lack of image.
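To make the threshold idea concrete, here's a toy sketch in Python (my own illustration, not how HDMI/TMDS actually encodes data; the 0 V / 3.3 V levels and the mid-point threshold are simplifying assumptions): bits are sent as idealized voltage levels, Gaussian noise is added, and the receiver decodes by comparing against the mid-point. Moderate noise produces zero bit errors because the levels stay inside their tolerances; only large noise pushes samples across the threshold.

```python
import random

def transmit(bits, noise_sigma, v_high=3.3, v_low=0.0, seed=42):
    """Send bits as voltage levels with Gaussian noise, decode by mid-point threshold."""
    rng = random.Random(seed)
    threshold = (v_high + v_low) / 2
    received = []
    for b in bits:
        v = (v_high if b else v_low) + rng.gauss(0, noise_sigma)
        received.append(1 if v > threshold else 0)
    return received

bits = [random.Random(0).randint(0, 1) for _ in range(10_000)]

for sigma in (0.2, 0.8, 1.6):
    rx = transmit(bits, sigma)
    errors = sum(a != b for a, b in zip(bits, rx))
    print(f"noise sigma = {sigma:.1f} V -> bit errors: {errors}/{len(bits)}")
```

With small noise the error count is exactly zero, which is the "noise immunity" part; crank the noise up and errors appear as isolated flipped bits, which on screen is the sparkling described above rather than analog-style ghosting.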
As for gold-plated connectors, the main concept is to eliminate corrosion. The other property, which isn't as critical for digital signals, is that gold is a better conductor, essentially lowering contact resistance. Contact resistance will cause level changes in signals (digital and analog). In the case of connections used for power (not valid in this case), contact resistance can cause the contact to heat up, and power is also lost as a voltage drop across the higher resistance in the contact.
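A quick back-of-the-envelope sketch of that last point, using plain Ohm's law. The numbers here are hypothetical examples (a 0.05 Ω contact resistance, milliamp-range signal current vs. a multi-amp power pin), just to show why contact resistance matters for power contacts but barely at all for signal pins:

```python
def contact_drop(current_a, contact_resistance_ohm):
    """Voltage drop and power dissipated across a single contact (Ohm's law)."""
    v_drop = current_a * contact_resistance_ohm       # V = I * R
    p_loss = current_a ** 2 * contact_resistance_ohm  # P = I^2 * R
    return v_drop, p_loss

# Hypothetical values for illustration only.
for label, i, r in [("signal pin (10 mA)", 0.01, 0.05),
                    ("power pin (5 A)",    5.0,  0.05)]:
    v, p = contact_drop(i, r)
    print(f"{label}: drop = {v * 1000:.3f} mV, heat = {p * 1000:.3f} mW")
```

The same contact resistance that costs a signal pin a fraction of a millivolt dissipates over a watt on the power pin, which is exactly why it's the power contacts that heat up.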
techgeek said: As for gold plated connectors, the main concept is to eliminate corrosion. The other property which isn't as critical for digital signals is the fact that gold is a better conductor.
You are correct: silver and copper (in that order) are better conductors than gold, but they are also the most reactive, and thus corrode easily. As for thin layers, something known as the skin effect (when dealing with high-frequency signals) makes thin layers preferable. That is why it's generally accepted that stranded wire is better than solid conductors for propagating high-frequency signals. At high frequency, the majority of the electron travel occurs on the "skin" of the conductor. To increase surface area, or "skin", we use many small wires spun together (stranded), thus lowering resistance to high-frequency signals. The effect becomes more pronounced as frequencies increase.
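For the curious, the skin depth (the depth at which current density falls to 1/e of its surface value) can be estimated with the standard formula δ = sqrt(ρ / (π f μ)). A small sketch for copper (resistivity from standard tables; the frequencies are just illustrative), showing how the conducting "skin" shrinks as frequency rises:

```python
import math

def skin_depth_m(freq_hz, resistivity_ohm_m=1.68e-8, mu_r=1.0):
    """Skin depth delta = sqrt(rho / (pi * f * mu)); defaults are for copper."""
    mu = mu_r * 4 * math.pi * 1e-7  # permeability of the conductor (H/m)
    return math.sqrt(resistivity_ohm_m / (math.pi * freq_hz * mu))

for f in (1e6, 100e6, 1e9):
    print(f"{f / 1e6:>6.0f} MHz -> skin depth ~ {skin_depth_m(f) * 1e6:.2f} um")
```

At 1 MHz the skin depth in copper is around 65 µm, and at gigahertz frequencies it drops to a couple of microns, which is why a thin plating or many small strands can carry most of a high-frequency signal.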
For noise, it depends on how the system is built to handle it; some systems are better at handling noise than others (digital can also send error correction or buffer signals), and the resulting output can be valued differently: is an echo worse than a momentary cut-out? Is HDMI sparkling/twinkling more noticeable than slight graininess or shifting? Both can be good and bad depending on quality, of course. Remember, wireless is still digital, but most wireless video is terrible compared to good-quality component, let alone BNC connections. But it's all relative; just making sure people don't confuse digital with being good, analogue with being bad, and that being the end of the story.
Anywhooo... I agree with the Monster analogy. At lengths below 15 ft for good-quality 1080, or 8-9 ft for 1080p 3D, you won't notice a difference, but for long distances and high bitrate demands you need quality; not necessarily Monster quality, but also not dollar store or cheap WalMart.
However, for the OP's scenario the distance is so small that conduction or pin quality won't make much difference; the corrosion concern mentioned earlier matters most.
TheGreatGrapeApe said: But it's all relative, just making sure people don't confuse digital with being good, analogue with being bad, and that being the end of the story.