DVI vs HDMI vs DisplayPort

What is the difference between all of these? I only knew about the VGA port; today I saw these on my LCD. I'm going to buy a GTX 770, so which one should I use? And what is the difference between analog and digital resolution? I have a 1080p screen at 60 Hz. Thanks.
  1. Hey!

    Remember, read it like an article. It's a bit long, but very informative for you.

    Here we go ...

    DVI (Digital Visual Interface) -

    DVI is one of the most common digital video connectors you'll see on desktops and LCD monitors today. It's the most similar to VGA, with up to 24 pins and support for analog as well as digital video. Single-link DVI can carry up to 1920×1200 HD video, and dual-link DVI connectors support up to 2560×1600. Some DVI cables or ports include fewer pins if they are designed for lower-resolution devices, so you'll need to watch for this. If your port contains all the pins, however, it can support the maximum resolution with no problem. The biggest drawback of DVI is that it doesn't support HDCP encryption by default, so if your hardware only has DVI ports, you may not be able to play back full HD Blu-rays and other protected HD content.

    HDMI (High Definition Multimedia Interface) -

    HDMI is the default cable on newer HDTVs, Blu-ray players, Apple TV, many new computers and video cards, and a multitude of other video devices. HDMI cables and ports are very easy to use, and are almost as easy to connect as USB devices. No more bent pins; just push and play. HDMI streams digital video and audio simultaneously over the same cable, supporting up to 1920×1200 HD video and 8-channel audio. It also supports HDCP encryption for the newest HD content. For almost all purposes, a single HDMI cable is all you'll need to connect your computer or video device to your monitor or TV, and it has become the de facto standard digital cable.

    DisplayPort -

    DisplayPort is another new video connector that’s being included on newer equipment, especially laptops. It was designed as the successor to DVI and VGA on computers, but hasn’t seen as much adoption as either DVI or HDMI. However, it is being included on all newer Macs and many Dell, HP, and Lenovo computers. It is actually very similar to HDMI, so it streams both HD video and audio on the same cable, and can output up to 1920×1080 resolution and 8 channels of audio on a single cable.

    On the good side, DisplayPort does support HDCP, so you can use it to play back protected HD content from Blu-rays and more. You can also connect it to an HDMI or DVI port with a converter, since the digital signal is compatible. The problem is, few monitors and TVs include DisplayPort, so you'll almost always need a converter if you want to connect your laptop to a larger screen.

    Now from here on, it's TL;DR stuff. Read at your own risk, so you don't get bored. :p

    Difference between analog and digital signals -

    An analog or analogue signal is any time continuous signal where some time varying feature of the signal is a representation of some other time varying quantity. It differs from a digital signal in that small fluctuations in the signal are meaningful. Analog is usually thought of in an electrical context, however mechanical, pneumatic, hydraulic, and other systems may also convey analog signals.
    An analog signal uses some property of the medium to convey the signal's information. For example, an aneroid barometer uses rotary position as the signal to convey pressure information. Electrically, the property most commonly used is voltage followed closely by frequency, current, and charge.
    Any information may be conveyed by an analog signal, often such a signal is a measured response to changes in physical phenomena, such as sound, light, temperature, position, or pressure, and is achieved using a transducer.
    For example, in sound recording, fluctuations in air pressure (that is to say, sound) strike the diaphragm of a microphone which causes corresponding fluctuations in a voltage or the current in an electric circuit. The voltage or the current is said to be an "analog" of the sound.
    Since an analog signal has theoretically infinite resolution, it will always have a higher resolution than any digital system, where the resolution is in discrete steps. In practice, as analog systems become more complex, effects such as non-linearity and noise degrade the analog resolution until digital systems surpass it. In analog systems it is difficult to detect when such degradation occurs, but in digital systems degradation can not only be detected, it can be corrected as well.
    The primary disadvantage of analog signaling is that any system has noise, i.e., random variation. As the signal is copied and re-copied, or transmitted over long distances, these random variations accumulate and become dominant. Electrically, these losses can be diminished by shielding, good connections, and cable types such as coaxial or twisted pair.
    The effects of noise make signal loss and distortion impossible to recover, since amplifying the signal to restore attenuated parts amplifies the noise as well. Even if the resolution of an analog signal is higher than that of a comparable digital signal, in many cases the difference is overshadowed by the noise in the signal.

    The term digital signal is used to refer to more than one concept. It can refer to discrete-time signals that are digitized, or to the waveform signals in a digital system. Digital signals are digital representations of discrete-time signals, which are often derived from analog signals.
    An analog signal is a datum that changes over time (say, the temperature at a given location, the depth of a certain point in a pond, or the amplitude of the voltage at some node in a circuit) that can be represented as a mathematical function, with time as the free variable (abscissa) and the signal itself as the dependent variable (ordinate). A discrete-time signal is a sampled version of an analog signal: the value of the datum is noted at fixed intervals (for example, every microsecond) rather than continuously.
    If the individual time values of the discrete-time signal, instead of being measured precisely (which would require an infinite number of digits), are approximated to a certain precision (which therefore only requires a specific number of digits), then the resultant data stream is termed a digital signal. The process of approximating the precise value within a fixed number of digits, or bits, is called quantization.
    In conceptual summary, a digital signal is a quantized discrete-time signal; a discrete-time signal is a sampled analog signal.
    In the Digital Revolution, the usage of digital signals has increased significantly. Many modern media devices, especially the ones that connect with computers use digital signals to represent signals that were traditionally represented as continuous-time signals; cell phones, music and video players, personal video recorders, and digital cameras are examples.
    In most applications, digital signals are represented as binary numbers, so their precision of quantization is measured in bits. Suppose, for example, that we wish to measure a signal to two significant decimal digits. Since seven bits, or binary digits, can record 128 discrete values (viz., from 0 to 127), those seven bits are more than sufficient to express a range of one hundred values.
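The sample-then-quantize process described above can be sketched in a few lines of Python. The sine wave, the 8 kHz sampling rate, and the `quantize` helper are all illustrative assumptions, not part of any real audio pipeline; the point is just to show sampling at fixed intervals and rounding each sample to one of 2^bits levels, and to confirm the seven-bits-for-128-values arithmetic.

```python
import math

def quantize(value, bits, v_min=-1.0, v_max=1.0):
    """Map a continuous value onto one of 2**bits evenly spaced levels."""
    levels = 2 ** bits
    step = (v_max - v_min) / (levels - 1)
    index = round((value - v_min) / step)  # pick the nearest level
    return v_min + index * step

# Sample a 1 kHz sine wave every 125 microseconds (an 8 kHz sampling rate):
# this is the "discrete-time signal" from the text.
samples = [math.sin(2 * math.pi * 1000 * n / 8000) for n in range(8)]

# Quantize each sample to 7 bits: now it is a "digital signal".
digital = [quantize(s, bits=7) for s in samples]

# Seven bits give 128 discrete values, more than enough to cover a
# two-significant-digit (100-value) measurement, as stated above.
print(2 ** 7)  # → 128
```

The quantized values are no longer exact; each one has been snapped to the nearest of the 128 levels, which is precisely the approximation the text calls quantization.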
    Summary: Digital communication systems offer much more efficiency, better performance, and much greater flexibility.
    An analog watch represents the time with the position of its hands; a digital watch shows you the numbers directly.
    A digital signal is what a computer system is based around: mainly zeros and ones (noughts and ones), as illustrated.
    A logic 0 equates to approximately zero volts.
    A logic 1 is 5 volts, plus or minus a tolerance value.
    Only a limited range of signal is valid around these two points: a measured value of 2.5 volts would not be read as either a logic 1 or a logic 0.
    When a circuit (usually a transistor device) switches on or off, the voltage at its terminal changes from zero volts to 5 volts, i.e. to logic 1.
    The digital circuit only recognises values at or around these two points and interprets them as a logic 1 or 0.
    An analog signal, in contrast, could change between a negative value and a positive one, or from zero to a positive value, anywhere within the supply constraints, and still be recognised.
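The threshold behaviour just described can be sketched in Python. The low and high cutoffs used here (1.5 V and 3.5 V) are illustrative assumptions, not values from any particular logic family; the point is that voltages near 0 V read as logic 0, voltages near 5 V read as logic 1, and a value like 2.5 V in the middle band is neither.

```python
# A sketch of 5 V threshold logic; the cutoffs are assumed for illustration.
def read_logic_level(voltage, v_low_max=1.5, v_high_min=3.5):
    """Interpret an analog voltage as a digital logic level."""
    if voltage <= v_low_max:
        return 0       # at or around 0 V: logic 0
    if voltage >= v_high_min:
        return 1       # at or around 5 V: logic 1
    return None        # undefined band: neither a clean 0 nor a clean 1

print(read_logic_level(0.2), read_logic_level(5.0), read_logic_level(2.5))
# → 0 1 None
```

Real logic families (TTL, CMOS, etc.) define their own input thresholds, but the shape of the rule is the same: only values near the two rails are valid.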

    Hope that explains everything. :)

  2. lol, what is HDCP? And what should I choose for gaming, DVI-I or DVI-D?
  3. High-bandwidth Digital Content Protection.

    In plain terms, it's a kind of HD video encryption that DVI can't handle by default. In other words, you can't watch Blu-ray movies and other HD / full HD copyrighted content over a plain DVI connection. ;)

    For gaming, just get a 1920×1080 LED monitor and connect it with an HDMI cable. You're good to go; nothing else is needed. :P
  4. Oh, I am using VGA (analog). Isn't that good enough?
  5. It's okay, but as I said, different types of cables have their own pros and cons, and in this case VGA has some limitations (mentioned above). In any case, if you have an HD / full HD screen, you should opt for an HDMI cable instead of DVI / VGA.
  6. But DVI and HDMI are the same, aren't they? HDMI just adds audio, and I have speakers anyway.
  7. No, they are not the same.

    You cannot view HDCP-encrypted HD content (movies) over DVI or VGA.
  8. I still don't understand HDCP. How would I know if I'm buying an HDCP movie?
  9. Buy a movie on Blu-ray and try to play it; you'll know. ;)

    As far as full HD games are concerned, you won't have any problems.
  10. I don't watch Blu-ray.
  11. Best answer
    Then you are fine with a DVI. :)

    You don't need to switch. If you still want, you can buy a VGA to DVI adapter and use it.