
DisplayPort vs. HDMI: Which Is Better For Gaming?

(Image credit: Amazon)

The best gaming monitors are packed with features, but one aspect that often gets overlooked is the inclusion of DisplayPort vs. HDMI. What are the differences between the two ports and is using one for connecting to your system definitively better?

You might think it's a simple matter of hooking up whatever cable comes with your monitor to your PC and calling it a day, but there are differences that can often mean a loss of refresh rate, color quality, or both if you're not careful. Here's what you need to know about DisplayPort vs. HDMI connections.

If you're looking to buy a new PC monitor or buy a new graphics card (you can find recommendations on our Best Graphics Cards page), you'll want to consider the capabilities of both sides of the connection — the video output of your graphics card and the video input on your display — before making any purchases. Our GPU hierarchy will tell you how the various graphics cards rank in terms of performance, but it doesn't dig into the connectivity options, which is something we'll cover here. 

The Major Display Connection Types 

From left to right: Composite, VGA, DVI, HDMI, and DisplayPort.  (Image credit: Shutterstock)

The latest display connectivity standards are DisplayPort and HDMI (High-Definition Multimedia Interface). DisplayPort first appeared in 2006, while HDMI came out in 2002. Both are digital standards, meaning all the data about the pixels on your screen is represented as 0s and 1s as it zips across your cable, and it's up to the display to convert that digital information into an image on your screen.

Earlier monitors used DVI (Digital Visual Interface) connectors, and going back even further we had VGA (Video Graphics Array) — along with component RGB, S-Video, composite video, EGA and CGA. You don't want to use VGA or any of those others in 2020, though. They're old, meaning any new GPU likely won't even support the connector, and even if it did, you'd be using an analog signal that's prone to interference. Yuck.

DVI is the bare minimum you want to use today, and even that has limitations. It has a lot in common with early HDMI, just without audio support. It works fine for gaming at 1080p, or 1440p resolution if you have a dual-link connection. Dual-link DVI-D is basically double the bandwidth of single-link DVI-D via extra pins and wires, and most modern GPUs with a DVI port support dual-link.

If you're wondering about Thunderbolt 2/3, it actually just routes DisplayPort over the Thunderbolt connection. Thunderbolt 2 supports DisplayPort 1.2, and Thunderbolt 3 supports DisplayPort 1.4 video. It's also possible to route HDMI 2.0 over Thunderbolt 3 with the right hardware.

For newer displays it's best to go with DisplayPort or HDMI. But is there a clear winner between the two?

Modern GPU with 2x DP and 2x HDMI ports. (Image credit: Future)

DisplayPort vs. HDMI: Specs and Resolutions 

Not all DisplayPort and HDMI ports are created equal. The DisplayPort and HDMI standards are backward compatible, meaning you can plug in an HDTV from the mid-00s and it should still work with a brand new RTX 20-series or RX 5000-series graphics card. However, the connection between your display and graphics card will end up using the best option supported by both the sending and receiving ends of the connection. That might mean the best 4K gaming monitor with 144 Hz and HDR will end up running at 4K and 24 Hz on an older graphics card!

Here's a quick overview of the major DisplayPort and HDMI revisions, their maximum signal rates and the GPU families that first added support for the standard.

DisplayPort vs. HDMI Specs

| Version | Max Transmission Rate | Max Data Rate | Resolution/Refresh Rate Support (24 bpp) | GPU Introduction |
| --- | --- | --- | --- | --- |
| **DisplayPort** | | | | |
| 1.0-1.1a | 10.8 Gbps | 8.64 Gbps | 1080p @ 144 Hz, 4K @ 30 Hz | AMD HD 3000 (R600), Nvidia GeForce 9 (Tesla) |
| 1.2-1.2a | 21.6 Gbps | 17.28 Gbps | 1080p @ 240 Hz, 4K @ 75 Hz, 5K @ 30 Hz | AMD HD 6000 (Northern Islands), Nvidia GK100 (Kepler) |
| 1.3 | 32.4 Gbps | 25.92 Gbps | 1080p @ 360 Hz, 4K @ 120 Hz, 5K @ 60 Hz, 8K @ 30 Hz | AMD RX 400 (Polaris), Nvidia GM100 (Maxwell 1) |
| 1.4-1.4a | 32.4 Gbps | 25.92 Gbps | 8K @ 120 Hz w/ DSC | AMD RX 400 (Polaris), Nvidia GM200 (Maxwell 2) |
| 2.0 | 80.0 Gbps | 77.37 Gbps | 4K @ 240 Hz, 8K @ 85 Hz | Future GPUs |
| **HDMI** | | | | |
| 1.0-1.2a | 4.95 Gbps | 3.96 Gbps | 1080p @ 60 Hz | AMD HD 2000 (R600), Nvidia GeForce 9 (Tesla) |
| 1.3-1.4b | 10.2 Gbps | 8.16 Gbps | 1080p @ 144 Hz, 1440p @ 75 Hz, 4K @ 30 Hz, 4K 4:2:0 @ 60 Hz | AMD HD 5000, Nvidia GK100 (Kepler) |
| 2.0-2.0b | 18.0 Gbps | 14.4 Gbps | 1080p @ 240 Hz, 4K @ 60 Hz, 8K 4:2:0 @ 30 Hz | AMD RX 400 (Polaris), Nvidia GM200 (Maxwell 2) |
| 2.1 | 48.0 Gbps | 42.6 Gbps | 4K @ 144 Hz (240 Hz w/DSC), 8K @ 30 Hz (120 Hz w/DSC) | Partial 2.1 VRR on Nvidia Turing |

Note that there are two bandwidth columns: transmission rate and data rate. DisplayPort and HDMI digital signals use line encoding of some form — 8b/10b for most of the older standards, 16b/18b for HDMI 2.1, and 128b/132b for DisplayPort 2.0. 8b/10b encoding, for example, means that for every 8 bits of data, 10 bits are actually transmitted, with the extra bits used to help maintain signal integrity (e.g., by ensuring zero DC bias).

That means only 80% of the theoretical bandwidth is actually available for data use with 8b/10b. 16b/18b encoding improves that to 88.9% efficiency, while 128b/132b encoding yields 97% efficiency. There are still other considerations, like the auxiliary channel on HDMI, but that's not a major factor. 
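As a sanity check on the table above, the usable data rate is just the transmission rate multiplied by the encoding efficiency. Here's a minimal Python sketch (the rates are the published figures from the table; the helper name is our own):

```python
# Usable data rate = raw transmission rate x line-encoding efficiency.
ENCODING_EFFICIENCY = {
    "8b/10b": 8 / 10,        # DisplayPort 1.x, HDMI 2.0 and earlier
    "16b/18b": 16 / 18,      # HDMI 2.1
    "128b/132b": 128 / 132,  # DisplayPort 2.0
}

def data_rate(transmission_gbps, encoding):
    """Return the usable data rate in Gbps for a given link encoding."""
    return transmission_gbps * ENCODING_EFFICIENCY[encoding]

# DisplayPort 1.4: 32.4 Gbps raw -> 25.92 Gbps of data
print(round(data_rate(32.4, "8b/10b"), 2))   # 25.92
# HDMI 2.1: 48.0 Gbps raw -> roughly 42.7 Gbps of data
print(round(data_rate(48.0, "16b/18b"), 2))  # 42.67
```

The small difference from the table's 42.6 Gbps figure for HDMI 2.1 comes down to rounding and minor link overhead.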

Let's Talk More About Bandwidth

(Image credit: Shutterstock)

To understand the above chart in context, we need to go deeper. All digital connections — DisplayPort, HDMI and even DVI-D — ultimately come down to the required bandwidth. Every pixel on your display has three components: red, green and blue (RGB) — alternatively, luma, blue chroma difference and red chroma difference (YCbCr/YPbPr) can be used. Whatever your GPU renders internally (typically 16-bit floating point RGBA, where A is the alpha/transparency information), that data gets converted into a signal for your display.

The standard in the past has been 24-bit color, or 8 bits each for the red, green and blue color components. HDR and high color depth displays have bumped that to 10-bit color, with 12-bit and 16-bit options as well, though the latter two are mostly in the professional space right now. Generally speaking, display signals use either 24 bits per pixel (bpp) or 30 bpp, with the best HDR monitors opting for 30 bpp. Multiply the color depth by the number of pixels and the screen refresh rate and you get the minimum required bandwidth. We say 'minimum' because there are a bunch of other factors as well.

Display timings are relatively complex calculations. The VESA governing body defines the standards, and there's even a handy spreadsheet that spits out the actual timings for a given resolution. A 1920 x 1080 monitor at a 60 Hz refresh rate, for example, uses 2,000 pixels per horizontal line and 1,111 lines once all the timing stuff is added. That's because display blanking intervals need to be factored in. (These blanking intervals are partly a holdover from the analog CRT days, but the standards still include them even with digital displays.)

Using the VESA spreadsheet and running the calculations gives the following bandwidth requirements. Look at the following table and compare it with the first table; if the required data bandwidth is less than the max data rate that a standard supports, then the resolution can be used.

Common Resolution Bandwidth Requirements
| Resolution | Color Depth | Refresh Rate (Hz) | Required Data Bandwidth |
| --- | --- | --- | --- |
| 1920 x 1080 | 8-bit | 60 | 3.20 Gbps |
| 1920 x 1080 | 10-bit | 60 | 4.00 Gbps |
| 1920 x 1080 | 8-bit | 144 | 8.00 Gbps |
| 1920 x 1080 | 10-bit | 144 | 10.00 Gbps |
| 2560 x 1440 | 8-bit | 60 | 5.63 Gbps |
| 2560 x 1440 | 10-bit | 60 | 7.04 Gbps |
| 2560 x 1440 | 8-bit | 144 | 14.08 Gbps |
| 2560 x 1440 | 10-bit | 144 | 17.60 Gbps |
| 3840 x 2160 | 8-bit | 60 | 12.54 Gbps |
| 3840 x 2160 | 10-bit | 60 | 15.68 Gbps |
| 3840 x 2160 | 8-bit | 144 | 31.35 Gbps |
| 3840 x 2160 | 10-bit | 144 | 39.19 Gbps |
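To tie the calculation together: using the 1080p 60 Hz timing totals from the example above (2,000 total pixels per line, 1,111 total lines), the first row of the table works out like this. This is a sketch using our own helper name; the totals for other modes come from the VESA spreadsheet:

```python
def required_bandwidth_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Bandwidth for an uncompressed signal, blanking intervals included."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

# 1920 x 1080 @ 60 Hz, 24 bpp: timing totals of 2000 x 1111 including blanking
print(round(required_bandwidth_gbps(2000, 1111, 60, 24), 2))  # 3.2
```

Multiplying the visible 1920 x 1080 pixels instead would give only 2.99 Gbps, which is exactly why the naive math undershoots the published figures.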

The above figures are all uncompressed signals, however. DisplayPort 1.4 added the option of Display Stream Compression 1.2a (DSC), which is also part of HDMI 2.1. In short, DSC helps overcome bandwidth limitations, which are becoming increasingly problematic as resolutions and refresh rates increase. For example, basic 24 bpp at 8K and 60 Hz needs 49.65 Gbps of data bandwidth, or 62.06 Gbps for 10 bpp HDR color. 8K 120 Hz 10 bpp HDR, a resolution that we're likely to see more of in the future, needs 127.75 Gbps. Yikes!

DSC can provide up to a 3:1 compression ratio by converting to 4:2:2 or 4:2:0 YCgCo and using delta PCM encoding. It provides a "visually lossless" (or nearly so, depending on what you're viewing) result, particularly for video (i.e., movie) signals. Using DSC, 8K 120 Hz HDR is suddenly viable, with a bandwidth requirement of 'only' 42.58 Gbps.
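The DSC figure above is simple arithmetic — the maximum 3:1 ratio applied to the uncompressed requirement:

```python
uncompressed_gbps = 127.75  # 8K @ 120 Hz, 10 bpp HDR (from above)
dsc_max_ratio = 3           # DSC's maximum compression ratio
print(round(uncompressed_gbps / dsc_max_ratio, 2))  # 42.58
```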

Both HDMI and DisplayPort can also carry audio data, which requires bandwidth as well, though it's a minuscule amount compared to the video data. DisplayPort and HDMI currently use a maximum of 36.86 Mbps for audio, or 0.037 Gbps if we keep things in the same units as video. Earlier versions of each standard can use even less data for audio.
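To put that audio figure in perspective against DisplayPort 1.4's 25.92 Gbps data rate:

```python
audio_gbps = 36.86 / 1000   # 36.86 Mbps expressed in Gbps
dp14_data_rate_gbps = 25.92
share = audio_gbps / dp14_data_rate_gbps
print(f"{share:.3%}")  # about 0.142% of the link -- a rounding error next to video
```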

That's a lengthy introduction to a complex subject, but if you've ever wondered why the simple math (resolution * refresh rate * color depth) doesn't match published specs, it's because of all the timing standards, encoding, audio and more. Bandwidth isn't the only factor, but in general, the standard with a higher maximum bandwidth is 'better.'
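Putting the two tables together, checking whether an uncompressed mode fits a given standard is just a comparison against the max data rates from the first table. A sketch (the dictionary keys and function name are our own shorthand):

```python
MAX_DATA_RATE_GBPS = {  # from the specs table above
    "DP 1.2": 17.28, "DP 1.4": 25.92, "DP 2.0": 77.37,
    "HDMI 2.0": 14.4, "HDMI 2.1": 42.6,
}

def mode_fits(standard, required_gbps):
    """True if an uncompressed mode fits within the standard's data rate."""
    return required_gbps <= MAX_DATA_RATE_GBPS[standard]

# 4K @ 144 Hz, 8-bit needs 31.35 Gbps (from the bandwidth table)
print(mode_fits("DP 1.4", 31.35))   # False -- needs DSC or chroma subsampling
print(mode_fits("HDMI 2.1", 31.35)) # True
```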

DisplayPort: The PC Choice 

(Image credit: Monoprice)

Currently DisplayPort 1.4 is the most capable and readily available version of the DisplayPort standard. The DisplayPort 2.0 spec came out in June 2019, but there still aren't any graphics cards or displays using the new version. We thought that would change with the launch of AMD's 'Big Navi' (aka Navi 2x, aka RDNA 2) and Nvidia's Ampere GPUs, but Ampere at least is sticking with DisplayPort 1.4a (we'll know more about AMD's Big Navi soon enough). DisplayPort 1.4 doesn't have as much bandwidth available as HDMI 2.1, but on the other hand, HDMI 2.1 hardware isn't really available for PCs yet.

One advantage of DisplayPort is that variable refresh rates (VRR) have been part of the standard since DisplayPort 1.2a. We also like the robust DisplayPort (but not mini-DisplayPort) connector, which has hooks that latch into place to keep cables secure. It's a small thing, but we've definitely pulled loose more than a few HDMI cables by accident. DisplayPort can also connect multiple screens to a single port via Multi-Stream Transport (MST), and the DisplayPort signal can be piped over a USB Type-C connector that also supports MST.

Another benefit of DisplayPort over HDMI doesn't normally impact consumers: HDMI is a licensed brand and specification, whereas DisplayPort is a royalty-free standard. That's probably why many display innovations like DSC, G-Sync and FreeSync came first to DisplayPort and only later to HDMI — the HDMI Forum is responsible for defining the standard. DisplayPort has typically offered higher maximum bandwidth than HDMI as well, and while HDMI 2.1 does beat DisplayPort 1.4, DisplayPort 2.0 regains the lead.

DisplayPort isn't fully royalty or licensing fee free, however. The original DP 1.0 used DisplayPort Content Protection (DPCP) from Philips, but HDCP (High-bandwidth Digital Content Protection) won out. DP 1.1 added HDCP 1.3 support, and DP 1.3 and later have HDCP 2.2 support, both of which are licensed from Digital Content Protection LLC.

Because the standard has evolved over the years, not all DisplayPort cables will work properly at the latest speeds. The original DisplayPort 1.0-1.1a spec allowed for RBR (reduced bit rate) and HBR (high bit rate) cables, capable of 5.18 Gbps and 8.64 Gbps of data bandwidth, respectively. DisplayPort 1.2 introduced HBR2, which doubled the maximum data bit rate to 17.28 Gbps and remains compatible with standard HBR DisplayPort cables. HBR3 with DisplayPort 1.3-1.4a increased things again to 25.92 Gbps and added the requirement of DP8K-certified cables.

Finally, with DisplayPort 2.0 there are three new transmission modes: UHBR 10 (ultra high bit rate), UHBR 13.5 and UHBR 20. The number refers to the bandwidth of each lane, and DisplayPort uses four lanes, so UHBR 10 offers up to 40 Gbps of transmission rate, UHBR 13.5 can do 54 Gbps and UHBR 20 peaks at 80 Gbps. All three UHBR standards are compatible with the same DP8K-certified cables, thankfully, and use 128b/132b encoding, meaning data bit rates of 38.69 Gbs, 52.22 Gbps, and 77.37 Gbps. 
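The UHBR transmission rates are simply the per-lane rate times DisplayPort's four lanes; the gap between these raw totals and the data rates quoted above comes from 128b/132b encoding plus additional link overhead. A quick sketch:

```python
LANES = 4  # DisplayPort main links always use four lanes

for name, per_lane_gbps in [("UHBR 10", 10.0), ("UHBR 13.5", 13.5), ("UHBR 20", 20.0)]:
    # Raw transmission rate across the full link
    print(f"{name}: {per_lane_gbps * LANES} Gbps")
# UHBR 10: 40.0 Gbps / UHBR 13.5: 54.0 Gbps / UHBR 20: 80.0 Gbps
```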

Officially, the maximum length of a DisplayPort cable is 3m (9.8 feet), which is one of the potential drawbacks, particularly for consumer electronics use. 

With a maximum data rate of 25.92 Gbps, DisplayPort 1.4 can handle 4K resolution 24-bit color at 98 Hz, and dropping to 4:2:2 YCbCr gets it to 144 Hz with HDR. Keep in mind that 4K HDR monitors running at 144 Hz still cost a premium, so gamers will more likely be looking at something like a 144 Hz display at 1440p. That only requires 14.08 Gbps for 24-bit color or 17.60 Gbps for 30-bit HDR, which DP 1.4 can easily handle.

If you're wondering about 8K content in the future, the reality is that even though it's doable right now via DSC and DisplayPort 1.4a, the displays and PC hardware needed to drive such displays aren't generally within reach of consumer budgets. (GeForce RTX 3090 may change that, but it seems as though HDMI 2.1 will be the way to go there.) By the time 8K becomes a viable resolution, we'll have gone through a couple more generations of GPUs.

HDMI: Ubiquitous Consumer Electronics 

(Image credit: HDMI.org)

Updates to HDMI have kept the standard relevant for over 16 years. The earliest versions of HDMI have become outdated, but later versions have increased bandwidth and features. 

HDMI 2.0b and earlier are 'worse' in some ways compared to DisplayPort 1.4, but if you're not trying to run at extremely high resolutions or refresh rates, you probably won't notice the difference. Full 24-bit RGB color at 4K 60 Hz has been available since HDMI 2.0 released in 2013, and higher resolutions and/or refresh rates are possible with 4:2:0 YCbCr output — though you generally don't want to use that with PC text, as it can make the edges look fuzzy.

For AMD FreeSync users, HDMI has also supported VRR via an AMD extension since 2.0b, but HDMI 2.1 is where VRR became part of the official standard. So far, only Nvidia has support for HDMI 2.1 VRR on its Turing and upcoming Ampere GPUs, which is used on LG's 2019 OLED TVs. That will likely change once AMD's 'Big Navi' GPUs are released, and we expect full HDMI 2.1 support from Nvidia's Ampere GPUs as well. If you own a Turing or earlier generation Nvidia GPU, outside of specific scenarios like the LG TVs, you're generally better off using DisplayPort for the time being.

One major advantage of HDMI is that it's ubiquitous. Millions of HDMI devices shipped in 2004 when the standard was young, and it's now found everywhere. These days, consumer electronics devices like TVs often include three or more HDMI ports. What's more, TVs and consumer electronics hardware have already started shipping with HDMI 2.1, ahead of PC graphics cards supporting the full 2.1 spec. (The GeForce RTX 3070 and above have at least one HDMI 2.1 port.)

HDMI cable requirements have changed over time, just like DisplayPort. One of the big advantages is that high quality HDMI cables can be up to 15m (49.2 feet) in length — five times longer than DisplayPort. That may not be important for a display sitting on your desk, but it can definitely matter for home theater use. Originally, HDMI had two categories of cables: category 1 or standard HDMI cables are intended for lower resolutions and/or shorter runs, and category 2 or “High Speed” HDMI cables are capable of 1080p at 60 Hz and 4K at 30 Hz with lengths of up to 15m.

More recently, HDMI 2.0 introduced “Premium High Speed” cables certified to meet the 18 Gbps bit rate, and HDMI 2.1 has created a fourth class of cable, “Ultra High Speed” HDMI that can handle up to 48 Gbps. HDMI also provides for routing Ethernet signals over the HDMI cable, though this is rarely used in the PC space.

We mentioned licensing fees earlier, and while the HDMI licensing body doesn't explicitly state the cost, this website details the various HDMI licensing fees as of 2014. The short summary: for a high-volume business making a lot of cables or devices, it's $10,000 annually, plus $0.04 per HDMI port provided HDCP (High-bandwidth Digital Content Protection) is used and the HDMI logo is displayed in marketing material. In other words, the cost to end users is easily absorbed in most cases — unless some bean counter comes down with a case of extreme penny pinching.

Like DisplayPort, HDMI also supports HDCP to protect the content from being copied. That's a separate licensing fee, naturally (though it reduces the HDMI fee). HDMI has supported HDCP since the beginning, starting at HDCP 1.1 and reaching HDCP 2.2 with HDMI 2.0. HDCP can cause issues with longer cables, and ultimately it appears to annoy consumers more than the pirates. At present, known hacks / workarounds to strip HDCP 2.2 from video signals can be found.

DisplayPort vs. HDMI: The Bottom Line for Gamers 

(Image credit: Shutterstock)

We've covered the technical details of DisplayPort and HDMI, but which one is actually better for gaming? Some of that will depend on the hardware you already own or intend to purchase. Both standards are capable of delivering a good gaming experience, but if you want a great gaming experience, right now DisplayPort 1.4 is generally better than HDMI 2.0, HDMI 2.1 beats DP 1.4, and DisplayPort 2.0 should trump HDMI 2.1.

For Nvidia gamers, your best option right now is a DisplayPort 1.4 connection to a G-Sync display. If you buy a new GeForce RTX 3080 card, however, HDMI 2.1 might be better (and it will probably be required if you want to connect your PC to a TV). The only G-Sync Compatible displays out now with HDMI 2.1 are TVs, so unless you're planning on gaming on the big screen in the living room, you're better off with DisplayPort right now. Ampere supports HDMI 2.1 but sticks with DP 1.4a, and G-Sync PC monitors are likely to continue prioritizing DisplayPort.

AMD gamers may have a few more options, as there are inexpensive FreeSync monitors with HDMI available. However, DisplayPort is still the preferred standard for PC monitors. It's easier to find a display that can do 144 Hz over DisplayPort with FreeSync, whereas a lot of HDMI FreeSync displays only work at lower resolutions or refresh rates. HDMI 2.1 isn't currently a supported feature, even on Navi 1x GPUs (the RX 5000-series), so hold out for Navi 2x if you can, which should also include support for HDMI 2.1. DisplayPort 2.0 support apparently won't be coming for at least one more generation of GPUs.

What if you already have a monitor that isn't running at higher refresh rates or doesn't have G-Sync or FreeSync capability, and it has both HDMI and DisplayPort inputs? Assuming your graphics card also supports both connections (and it probably does if it's a card made in the past five years), in many instances the choice of connection won't really matter.

2560 x 1440 at a fixed 144 Hz refresh rate and 24-bit color works just fine on DisplayPort 1.2 or higher, as well as HDMI 2.0 or higher. Anything lower than that will also work without trouble on either connection type. About the only caveat is that sometimes HDMI connections on a monitor will default to a limited RGB range, but you can correct that in the AMD or Nvidia display options. (This is because old TV standards used a limited color range, and some modern displays still think that's a good idea. News flash: it's not.)

Other use cases might push you toward DisplayPort as well, like if you want to use MST to have multiple displays daisy chained from a single port. That's not a very common scenario, but DisplayPort does make it possible. Home theater use on the other hand continues to prefer HDMI, and the auxiliary channel can improve universal remote compatibility. If you're hooking up your PC to a TV, HDMI is usually required, as there aren't many TVs that have a DisplayPort input (BFGDs like the HP Omen X being one of the few — very expensive! — exceptions).

Ultimately, while there are specs advantages to DisplayPort, and some features on HDMI that can make it a better choice for consumer electronics use, the two standards end up overlapping in many areas. The VESA standards group in charge of DisplayPort has its eyes on PC adoption growth, whereas HDMI is defined by a consumer electronics consortium and thinks about TVs first. But DisplayPort and HDMI end up with similar capabilities. You can do 4K at 60 Hz on both standards without DSC, so it's only 8K or 4K at refresh rates above 60 Hz where you actually run into limitations on recent GPUs.

  • Toadster88
    what about Thunderbolt 3 in comparison?
  • JarredWaltonGPU
    Toadster88 said:
    what about Thunderbolt 3 in comparison?
    Thunderbolt just uses DisplayPort routed over the connection. Thunderbolt 2 supports DP 1.2 resolutions, and Thunderbolt 3 supports DP 1.4. I'll add a note in the article, though.
  • jonathan1683
    I tried to setup a g-sync monitor with HDMI lol yea didn't work and took me forever to figure it out. Easy choice now.
  • CerianK
    Another big issue for me is input switching latency when I flip between sources, which is something missing from product specifications and reviews (unless I've somehow missed seeing it).

    I know that HDMI can be very slow (depending on monitor)... sometimes as much as 5 seconds to see the new source. I assumed that was content protection built into the standard and/or slow decoder ASIC.
    I have not compared switch latency to DisplayPort, so would be curious if anyone here has impressions.
    Honestly, I would probably pay quite a bit extra for a monitor and/or TV that has much faster input source switching.. I have lost patience for technology regression as I have aged... I recall using monitors and TVs that had nearly instantaneous source switching back in the analog days.
  • JarredWaltonGPU
    CerianK said:
    Another big issue for me is input switching latency when I flip between sources, which is something missing from product specifications and reviews (unless I've somehow missed seeing it).

    I know that HDMI can be very slow (depending on monitor)... sometimes as much as 5 seconds to see the new source. I assumed that was content protection built into the standard and/or slow decoder ASIC.
    I have not compared switch latency to DisplayPort, so would be curious if anyone here has impressions.
    Honestly, I would probably pay quite a bit extra for a monitor and/or TV that has much faster input source switching.. I have lost patience for technology regression as I have aged... I recall using monitors and TVs that had nearly instantaneous source switching back in the analog days.
    My experience is that it's more the monitor than the input. I've had monitors that take as long as 10 seconds to switch to a signal (or even turn on in the first place), and I've had others that switch in a second or less. I'm not sure if it's just poor firmware, or a cheap scaler, or something else.

    I will say that I have an Acer XB280HK 4K60 G-Sync display that only has a single DisplayPort input, and it powers up or wakes from sleep almost instantly. I have an Acer G-Sync Ultimate 4K 144Hz HDR display meanwhile that takes about 7 seconds to wake from sleep. Rather annoying.
  • Molor#1880
    "36.86 Mbps for audio, or 0.37 Gbps" Actually it would be 0.037 Gbps, which takes it from relatively small to a near rounding error for several of those tables.
  • excalibur1814
    The best option is the one you have! My gaming laptop only has hdmi.
  • bit_user
    JarredWaltonGPU said:
    My experience is that it's more the monitor than the input. I've had monitors that take as long as 10 seconds to switch to a signal (or even turn on in the first place), and I've had others that switch in a second or less. I'm not sure if it's just poor firmware, or a cheap scaler, or something else.
    I always figured it's to do with HDMI's handshaking and auto-negotiation.

    HDMI was designed for home theater, where you could have the signal pass through a receiver or splitter. Not only do you need to negotiate resolution, refresh rate, colorspace, bit-depth, link speed, ancillary & back-channel data, but also higher-level parameters like audio delay. So, probably just a poor implementation of that process, running on some dog-slow embedded processor.

    As such, you might find that locking down the configuration range of the source can speed things up, a bit.
  • JarredWaltonGPU
    Molor#1880 said:
    "36.86 Mbps for audio, or 0.37 Gbps" Actually it would be 0.037 Gbps, which takes it from relatively small to a near rounding error for several of those tables.
    Oops, you're correct. Speaking of rounding errors, I seemed to have misplaced my decimal point. ;-)
  • waltc3
    I'm surprised the article didn't mention TVs, as currently that's the main reason people go HDMI instead of DP, imo. I appreciated the fact that the article mentioned the loss of color fidelity the 144Hz compromise forces, although most people seem to ignore that difference. I include a link below that is informative on that topic--it's not mine but I saved the link to remind me...;) I haven't looked in a while, but last time I checked few if any TVs feature Display Ports--my TV at home is strictly HDMI. Personally, I use an AMD 50th Ann Ed 5700XT with a 1.4 DP cable plugged into my DP 1.4 monitor, the Ben Q 3270U.

    https://www.reddit.com/r/hardware/comments/8rlf2z/psa_4k_144_hz_monitors_use_chroma_subsampling_for/