The best gaming monitors are packed with features, but one aspect that often gets overlooked is connectivity: DisplayPort vs. HDMI. What are the differences between the two ports, and is one definitively better for connecting your system to a display?
You might think it's a simple matter of hooking up whatever cable comes with your monitor to your PC and calling it a day, but there are differences that can often mean a loss of refresh rate, color quality, or both if you're not careful. Here's what you need to know about DisplayPort vs. HDMI connections.
If you're looking to buy a new PC monitor or buy a new graphics card (you can find recommendations on our Best Graphics Cards page), you'll want to consider the capabilities of both sides of the connection — the video output of your graphics card and the video input on your display — before making any purchases. Our GPU Benchmarks hierarchy will tell you how the various graphics cards rank in terms of performance, but it doesn't dig into the connectivity options, which is something we'll cover here.
The Major Display Connection Types
The latest display connectivity standards are DisplayPort and HDMI (High-Definition Multimedia Interface). DisplayPort first appeared in 2006, while HDMI came out in 2002. Both are digital standards, meaning all the data about the pixels on your screen is represented as 0s and 1s as it zips across your cable, and it's up to the display to convert that digital information into an image on your screen.
Earlier monitors used DVI (Digital Visual Interface) connectors, and going back even further we had VGA (Video Graphics Array) — along with component RGB, S-Video, composite video, EGA and CGA. You don't want to use VGA or any of those others in 2020, though. They're old, meaning any new GPU likely won't even support the connector, and even if it did, you'd be using an analog signal that's prone to interference. Yuck.
DVI is the bare minimum you want to use today, and even that has limitations. It has a lot in common with early HDMI, just without audio support. It works fine for gaming at 1080p, or 1440p resolution if you have a dual-link connection. Dual-link DVI-D offers basically double the bandwidth of single-link DVI-D via extra pins and wires, and most modern GPUs with a DVI port support dual-link.
If you're wondering about Thunderbolt 2/3, it actually just routes DisplayPort over the Thunderbolt connection. Thunderbolt 2 supports DisplayPort 1.2, and Thunderbolt 3 supports DisplayPort 1.4 video. It's also possible to route HDMI 2.0 over Thunderbolt 3 with the right hardware.
For newer displays it's best to go with DisplayPort or HDMI. But is there a clear winner between the two?
DisplayPort vs. HDMI: Specs and Resolutions
Not all DisplayPort and HDMI ports are created equal. The DisplayPort and HDMI standards are backward compatible, meaning you can plug in an HDTV from the mid-00s and it should still work with a brand new RTX 20-series or RX 5000-series graphics card. However, the connection between your display and graphics card will end up using the best option supported by both the sending and receiving ends of the connection. That might mean the best 4K gaming monitor with 144 Hz and HDR will end up running at 4K and 24 Hz on an older graphics card!
Here's a quick overview of the major DisplayPort and HDMI revisions, their maximum signal rates and the GPU families that first added support for the standard.
DisplayPort

|Version|Max Transmission Rate|Max Data Rate|Resolution/Refresh Rate Support (24 bpp)|GPU Introduction|
|---|---|---|---|---|
|1.0-1.1a|10.8 Gbps|8.64 Gbps|1080p @ 144 Hz, 4K @ 30 Hz|AMD HD 3000 (R600), Nvidia GeForce 9 (Tesla)|
|1.2-1.2a|21.6 Gbps|17.28 Gbps|1080p @ 240 Hz, 4K @ 75 Hz, 5K @ 30 Hz|AMD HD 6000 (Northern Islands), Nvidia GK100 (Kepler)|
|1.3|32.4 Gbps|25.92 Gbps|1080p @ 360 Hz, 4K @ 120 Hz, 5K @ 60 Hz, 8K @ 30 Hz|AMD RX 400 (Polaris), Nvidia GM100 (Maxwell 1)|
|1.4-1.4a|32.4 Gbps|25.92 Gbps|8K @ 120 Hz w/ DSC|AMD RX 400 (Polaris), Nvidia GM200 (Maxwell 2)|
|2.0|80.0 Gbps|77.37 Gbps|4K @ 240 Hz, 8K @ 85 Hz|Future GPUs|

HDMI

|Version|Max Transmission Rate|Max Data Rate|Resolution/Refresh Rate Support (24 bpp)|GPU Introduction|
|---|---|---|---|---|
|1.0-1.2a|4.95 Gbps|3.96 Gbps|1080p @ 60 Hz|AMD HD 2000 (R600), Nvidia GeForce 9 (Tesla)|
|1.3-1.4b|10.2 Gbps|8.16 Gbps|1080p @ 144 Hz, 1440p @ 75 Hz, 4K @ 30 Hz, 4K 4:2:0 @ 60 Hz|AMD HD 5000, Nvidia GK100 (Kepler)|
|2.0-2.0b|18.0 Gbps|14.4 Gbps|1080p @ 240 Hz, 4K @ 60 Hz, 8K 4:2:0 @ 30 Hz|AMD RX 400 (Polaris), Nvidia GM200 (Maxwell 2)|
|2.1|48.0 Gbps|42.6 Gbps|4K @ 144 Hz (240 Hz w/ DSC), 8K @ 30 Hz (120 Hz w/ DSC)|Partial 2.1 VRR on Nvidia Turing|
Note that there are two bandwidth columns: transmission rate and data rate. The DisplayPort and HDMI digital signals use bit-rate encoding of some form — 8b/10b for most of the older standards, 16b/18b for HDMI 2.1, and 128b/132b for DisplayPort 2.0. 8b/10b encoding, for example, means that for every 8 bits of data, 10 bits are actually transmitted, with the extra bits used to help maintain signal integrity (e.g., by ensuring zero DC bias).
That means only 80% of the theoretical bandwidth is actually available for data use with 8b/10b. 16b/18b encoding improves that to 88.9% efficiency, while 128b/132b encoding yields 97% efficiency. There are still other considerations, like the auxiliary channel on HDMI, but that's not a major factor.
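The relationship between the two columns is simple enough to verify yourself. The following sketch (function and dictionary names are our own, not from any spec) scales the transmission rate by the encoding efficiency:

```python
from fractions import Fraction

# Line-encoding efficiency: payload bits per transmitted bits.
ENCODINGS = {
    "8b/10b": Fraction(8, 10),       # DisplayPort 1.x, HDMI up to 2.0b
    "16b/18b": Fraction(16, 18),     # HDMI 2.1
    "128b/132b": Fraction(128, 132), # DisplayPort 2.0
}

def data_rate(transmission_gbps, encoding):
    """Usable data rate after line-encoding overhead, in Gbps."""
    return transmission_gbps * ENCODINGS[encoding]

print(round(data_rate(32.4, "8b/10b"), 2))   # DisplayPort 1.4 -> 25.92
print(round(data_rate(48.0, "16b/18b"), 2))  # HDMI 2.1 -> 42.67
```

Note that the published figures (42.6 Gbps for HDMI 2.1, 77.37 Gbps for DisplayPort 2.0) come out slightly lower than this naive math, because the standards reserve a little additional bandwidth for things like forward error correction and link overhead.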
Let's Talk More About Bandwidth
To understand the above chart in context, we need to go deeper. What all digital connections — DisplayPort, HDMI and even DVI-D — end up coming down to is the required bandwidth. Every pixel on your display has three components: red, green and blue (RGB) — alternatively: luma, blue chroma difference and red chroma difference (YCbCr/YPbPr) can be used. Whatever your GPU renders internally (typically 16-bit floating point RGBA, where A is the alpha/transparency information), that data gets converted into a signal for your display.
The standard in the past has been 24-bit color, or 8 bits each for the red, green and blue color components. HDR and high color depth displays have bumped that to 10-bit color, with 12-bit and 16-bit options as well, though the latter two are mostly in the professional space right now. Generally speaking, display signals use either 24 bits per pixel (bpp) or 30 bpp, with the best HDR monitors opting for 30 bpp. Multiply the color depth by the number of pixels and the screen refresh rate and you get the minimum required bandwidth. We say 'minimum' because there are a bunch of other factors as well.
Display timings involve relatively complex calculations. The VESA governing body defines the standards, and there's even a handy spreadsheet that spits out the actual timings for a given resolution. A 1920x1080 monitor at a 60 Hz refresh rate, for example, uses 2,000 pixels per horizontal line and 1,111 lines once all the timing stuff is added. That's because display blanking intervals need to be factored in. (These blanking intervals are partly a holdover from the analog CRT screen days, but the standards still include them even with digital displays.)
Using the VESA spreadsheet and running the calculations gives the following bandwidth requirements. Look at the following table and compare it with the first table; if the required data bandwidth is less than the max data rate that a standard supports, then the resolution can be used.
|Resolution|Color Depth|Refresh Rate (Hz)|Required Data Bandwidth|
|---|---|---|---|
|1920 x 1080|8-bit|60|3.20 Gbps|
|1920 x 1080|10-bit|60|4.00 Gbps|
|1920 x 1080|8-bit|144|8.00 Gbps|
|1920 x 1080|10-bit|144|10.00 Gbps|
|2560 x 1440|8-bit|60|5.63 Gbps|
|2560 x 1440|10-bit|60|7.04 Gbps|
|2560 x 1440|8-bit|144|14.08 Gbps|
|2560 x 1440|10-bit|144|17.60 Gbps|
|3840 x 2160|8-bit|60|12.54 Gbps|
|3840 x 2160|10-bit|60|15.68 Gbps|
|3840 x 2160|8-bit|144|31.35 Gbps|
|3840 x 2160|10-bit|144|39.19 Gbps|
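If you'd rather not wrangle the VESA spreadsheet, the table can be approximated in a few lines of Python. This sketch uses the CVT-R2 reduced-blanking rule of thumb (roughly 80 pixels of horizontal blanking and a minimum 460 microsecond vertical blanking interval); treat it as an approximation rather than the official timing calculation:

```python
import math

def required_bandwidth_gbps(h_active, v_active, refresh_hz, bits_per_pixel):
    """Approximate uncompressed video bandwidth using CVT-R2 reduced
    blanking: 80 extra horizontal pixels, and enough extra lines to
    give at least 460 microseconds of vertical blanking per frame."""
    h_total = h_active + 80
    v_total = math.ceil(v_active / (1 - 460e-6 * refresh_hz))
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

print(round(required_bandwidth_gbps(1920, 1080, 60, 24), 2))   # 3.2
print(round(required_bandwidth_gbps(3840, 2160, 144, 24), 2))  # 31.35
```

Plug in the resolutions from the table above (8-bit color is 24 bits per pixel, 10-bit is 30) and the results land within rounding distance of the spreadsheet's numbers.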
The above figures are all for uncompressed signals, however. DisplayPort 1.4 added the option of Display Stream Compression 1.2a (DSC), which is also part of HDMI 2.1. In short, DSC helps overcome bandwidth limitations, which are becoming increasingly problematic as resolutions and refresh rates increase. For example, basic 24 bpp at 8K and 60 Hz needs 49.65 Gbps of data bandwidth, or 62.06 Gbps for 10 bpp HDR color. 8K 120 Hz 10 bpp HDR, a resolution that we're likely to see more of in the future, needs 127.75 Gbps. Yikes!
DSC can provide up to a 3:1 compression ratio by converting to 4:2:2 or 4:2:0 YCgCo and using delta PCM encoding. It provides a "visually lossless" (or nearly so, depending on what you're viewing) result, particularly for video (ie, movie) signals. Using DSC, 8K 120 Hz HDR is suddenly viable, with a bandwidth requirement of 'only' 42.58 Gbps.
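To see where that 'only' 42.58 Gbps figure comes from, here's a sketch that applies a flat 3:1 DSC ratio on top of the same reduced-blanking bandwidth estimate used earlier (real DSC compression varies with content and configuration, so this is the best case the spec allows, not a guarantee):

```python
import math

def dsc_gbps(h_active, v_active, refresh_hz, bits_per_pixel, ratio=3):
    """Return (uncompressed, DSC-compressed) bandwidth in Gbps,
    assuming CVT-R2 reduced-blanking timings and a fixed ratio."""
    h_total = h_active + 80                # CVT-R2 horizontal blanking
    v_total = math.ceil(v_active / (1 - 460e-6 * refresh_hz))
    raw = h_total * v_total * refresh_hz * bits_per_pixel / 1e9
    return raw, raw / ratio

raw, compressed = dsc_gbps(7680, 4320, 120, 30)  # 8K, 120 Hz, 10-bit HDR
print(round(raw, 2), round(compressed, 2))       # 127.75 42.58
```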
Both HDMI and DisplayPort can also carry audio data, which requires bandwidth as well, though it's a minuscule amount compared to the video data. DisplayPort and HDMI currently use a maximum of 36.86 Mbps for audio, or 0.037 Gbps if we keep things in the same units as video. Earlier versions of each standard can use even less data for audio.
That's a lengthy introduction to a complex subject, but if you've ever wondered why the simple math (resolution * refresh rate * color depth) doesn't match published specs, it's because of all the timing standards, encoding, audio and more. Bandwidth isn't the only factor, but in general, the standard with a higher maximum bandwidth is 'better.'
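Putting the two tables together, checking whether a standard can carry a given mode uncompressed reduces to a single comparison. A minimal sketch (the dictionary just hardcodes max data rates from the spec table; names are our own):

```python
# Max data rates (Gbps) from the DisplayPort/HDMI spec table.
MAX_DATA_RATE = {
    "DisplayPort 1.2": 17.28,
    "DisplayPort 1.4": 25.92,
    "HDMI 2.0": 14.40,
    "HDMI 2.1": 42.60,
}

def supports(standard, required_gbps):
    """True if an uncompressed mode fits within the standard's data rate."""
    return MAX_DATA_RATE[standard] >= required_gbps

# 1440p @ 144 Hz with 10-bit HDR color needs 17.60 Gbps:
print(supports("DisplayPort 1.4", 17.60))  # True
print(supports("HDMI 2.0", 17.60))         # False
```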
DisplayPort: The PC Choice
Currently DisplayPort 1.4 is the most capable and readily available version of the DisplayPort standard. The DisplayPort 2.0 spec came out in June 2019, but there still aren't any graphics cards or displays using the new version. We thought that would change with the launch of AMD's 'Big Navi' (aka Navi 2x, aka RDNA 2) and Nvidia's Ampere GPUs, but both stick with DisplayPort 1.4a. DisplayPort 1.4 doesn't have as much bandwidth available as HDMI 2.1, but it's sufficient for up to 8K 60Hz with DSC, and HDMI 2.1 hardware isn't really available for PCs yet.
One advantage of DisplayPort is that variable refresh rates (VRR) have been part of the standard since DisplayPort 1.2a. We also like the robust DisplayPort (but not mini-DisplayPort) connector, which has hooks that latch into place to keep cables secure. It's a small thing, but we've definitely pulled loose more than a few HDMI cables by accident. DisplayPort can also connect multiple screens to a single port via Multi-Stream Transport (MST), and the DisplayPort signal can be piped over a USB Type-C connector that also supports MST.
One area where there has been some confusion is licensing and royalties. DisplayPort was supposed to be a less expensive standard (at least, that's how I recall it being proposed back in the day). But today, both HDMI and DisplayPort have various associated brands, trademarks, and patents that have to be licensed. With various associated technologies like HDCP (High-bandwidth Digital Content Protection), DSC, and more, companies have to pay a royalty for DisplayPort just like HDMI. The current rate appears to be $0.20 per product with a DisplayPort interface, with a cap of $7 million per year. HDMI charges $0.15 per product, or $0.05 if the HDMI logo is used in promotional materials.
Because the standard has evolved over the years, not all DisplayPort cables will work properly at the latest speeds. The original DisplayPort 1.0-1.1a spec allowed for RBR (reduced bit rate) and HBR (high bit rate) cables, capable of 5.18 Gbps and 8.64 Gbps of data bandwidth, respectively. DisplayPort 1.2 introduced HBR2, which doubled the maximum data bit rate to 17.28 Gbps and is compatible with standard HBR DisplayPort cables. HBR3 with DisplayPort 1.3-1.4a increased things again to 25.92 Gbps and added the requirement of DP8K-certified cables.
Finally, with DisplayPort 2.0 there are three new transmission modes: UHBR 10 (ultra high bit rate), UHBR 13.5 and UHBR 20. The number refers to the bandwidth of each lane, and DisplayPort uses four lanes, so UHBR 10 offers up to 40 Gbps of transmission rate, UHBR 13.5 can do 54 Gbps and UHBR 20 peaks at 80 Gbps. All three UHBR standards are compatible with the same DP8K-certified cables, thankfully, and use 128b/132b encoding, meaning data bit rates of 38.69 Gbps, 52.22 Gbps, and 77.37 Gbps.
Officially, the maximum length of a DisplayPort cable is 3m (9.8 feet), which is one of its potential drawbacks, particularly for consumer electronics use.
With a maximum data rate of 25.92 Gbps, DisplayPort 1.4 can handle 4K resolution with 24-bit color at 98 Hz, and dropping to 4:2:2 YCbCr gets it to 144 Hz with HDR. Keep in mind that 4K HDR monitors running at 144 Hz still cost a premium, so gamers will more likely be looking at something like a 1440p display at 144 Hz. That only requires 14.08 Gbps for 24-bit color or 17.60 Gbps for 30-bit HDR, which DP 1.4 can easily handle.
If you're wondering about 8K content in the future, the reality is that even though it's doable right now via DSC and DisplayPort 1.4a, the displays and PC hardware needed to drive such displays aren't generally within reach of consumer budgets. (GeForce RTX 3090 may change that, but it seems as though HDMI 2.1 will be the way to go there.) By the time 8K becomes a viable resolution, we'll have gone through a couple more generations of GPUs.
HDMI: Ubiquitous Consumer Electronics
Updates to HDMI have kept the standard relevant for over 16 years. The earliest versions of HDMI have become outdated, but later versions have increased bandwidth and features.
HDMI 2.0b and earlier are 'worse' in some ways compared to DisplayPort 1.4, but if you're not trying to run at extremely high resolutions or refresh rates, you probably won't notice the difference. Full 24-bit RGB color at 4K 60 Hz has been available since HDMI 2.0 released in 2013, and higher resolutions and/or refresh rates are possible with 4:2:0 YCbCr output — though you generally don't want to use that with PC text, as it can make the edges look fuzzy.
For AMD FreeSync users, HDMI has also supported VRR via an AMD extension since 2.0b, but HDMI 2.1 is where VRR became part of the official standard. So far, only Nvidia supports HDMI 2.1 VRR, on its Turing and upcoming Ampere GPUs, and it works with LG's 2019 OLED TVs. That will likely change once AMD's 'Big Navi' GPUs are released. If you own a Turing or earlier generation Nvidia GPU, outside of specific scenarios like the LG TVs, you're generally better off using DisplayPort for the time being.
One major advantage of HDMI is that it's ubiquitous. Millions of HDMI devices shipped in 2004, when the standard was young, and it's now found everywhere. These days, consumer electronics devices like TVs often include three or more HDMI ports. What's more, TVs and consumer electronics hardware have already started shipping with HDMI 2.1, even though no PC graphics cards support the full 2.1 spec yet. (The GeForce RTX 3070 and above have at least one HDMI 2.1 port.)
HDMI cable requirements have changed over time, just like DisplayPort. One of the big advantages is that high quality HDMI cables can be up to 15m (49.2 feet) in length — five times longer than DisplayPort. That may not be important for a display sitting on your desk, but it can definitely matter for home theater use. Originally, HDMI had two categories of cables: category 1, or standard HDMI cables, intended for lower resolutions and/or shorter runs, and category 2, or “High Speed” HDMI cables, capable of 1080p at 60 Hz and 4K at 30 Hz with lengths of up to 15m.
More recently, HDMI 2.0 introduced “Premium High Speed” cables certified to meet the 18 Gbps bit rate, and HDMI 2.1 has created a fourth class of cable, “Ultra High Speed” HDMI that can handle up to 48 Gbps. HDMI also provides for routing Ethernet signals over the HDMI cable, though this is rarely used in the PC space.
We mentioned licensing fees earlier, and while HDMI Technology doesn't explicitly state the cost, this website details the various HDMI licensing fees as of 2014. The short summary: for a high volume business making a lot of cables or devices, it's $10,000 annually, and $0.05 per HDMI port provided HDCP (High Definition Content Protection) is used and the HDMI logo is displayed in marketing material. In other words, the cost to end users is easily absorbed in most cases — unless some bean counter comes down with a case of extreme penny pinching.
Like DisplayPort, HDMI also supports HDCP to protect the content from being copied. That's a separate licensing fee, naturally (though it reduces the HDMI fee). HDMI has supported HDCP since the beginning, starting at HDCP 1.1 and reaching HDCP 2.2 with HDMI 2.0. HDCP can cause issues with longer cables, and ultimately it appears to annoy consumers more than the pirates: known hacks and workarounds to strip HDCP 2.2 from video signals already exist.
DisplayPort vs. HDMI: The Bottom Line for Gamers
We've covered the technical details of DisplayPort and HDMI, but which one is actually better for gaming? Some of that will depend on the hardware you already own or intend to purchase. Both standards are capable of delivering a good gaming experience, but if you want a great gaming experience, right now DisplayPort 1.4 is generally better than HDMI 2.0, HDMI 2.1 technically beats DP 1.4, and DisplayPort 2.0 should trump HDMI 2.1. The problem is, you'll need to buy a TV rather than a monitor to get HDMI 2.1 right now, and we're not sure when DP 2.0 hardware will start shipping (RTX 40-series maybe).
For Nvidia gamers, your best option right now is a DisplayPort 1.4 connection to a G-Sync display. If you buy a new GeForce RTX 30-series card, however, HDMI 2.1 might be better (and it will probably be required if you want to connect your PC to a TV). Again, the only G-Sync Compatible displays out now with HDMI 2.1 are TVs. Unless you're planning on gaming on the big screen in the living room, you're better off with DisplayPort right now. Ampere supports HDMI 2.1 but sticks with DP 1.4, and G-Sync PC monitors are likely to continue prioritizing DisplayPort.
AMD gamers may have a few more options, as there are inexpensive FreeSync monitors with HDMI available. However, DisplayPort is still the preferred standard for PC monitors. It's easier to find a display that can do 144 Hz over DisplayPort with FreeSync, where a lot of HDMI FreeSync displays only work at lower resolutions or refresh rates. HDMI 2.1 meanwhile is only supported on the latest RX 6000-series GPUs, but DisplayPort 2.0 support apparently won't be coming for at least one more generation of GPUs.
What if you already have a monitor that isn't running at higher refresh rates or doesn't have G-Sync or FreeSync capability, and it has both HDMI and DisplayPort inputs? Assuming your graphics card also supports both connections (and it probably does if it's a card made in the past five years), in many instances the choice of connection won't really matter.
2560x1440 at a fixed 144 Hz refresh rate and 24-bit color works just fine on DisplayPort 1.2 or higher, as well as HDMI 2.0 or higher. Anything lower than that will also work without trouble on either connection type. About the only caveat is that sometimes HDMI connections on a monitor will default to a limited RGB range, but you can correct that in the AMD or Nvidia display options. (This is because old TV standards used a limited color range, and some modern displays still think that's a good idea. News flash: it's not.)
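The limited-range issue boils down to simple math: limited ('video') range maps black to level 16 and white to level 235, so a display expecting full range shows grayish blacks and dim whites unless the levels are expanded. A sketch of the expansion (our own helper for illustration, not a driver API):

```python
def limited_to_full(value):
    """Expand a limited-range (16-235) 8-bit video level to full
    range (0-255), clamping out-of-range inputs first."""
    value = min(max(value, 16), 235)
    return round((value - 16) * 255 / 219)

print(limited_to_full(16))   # 0   (video black -> full black)
print(limited_to_full(235))  # 255 (video white -> full white)
print(limited_to_full(128))  # 130 (mid gray stretches slightly)
```

The GPU drivers do this remapping for you when you switch the output to full RGB range; the point is simply that a range mismatch on either end crushes or washes out the picture.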
Other use cases might push you toward DisplayPort as well, like if you want to use MST to have multiple displays daisy chained from a single port. That's not a very common scenario, but DisplayPort does make it possible. Home theater use on the other hand continues to prefer HDMI, and the auxiliary channel can improve universal remote compatibility. If you're hooking up your PC to a TV, HDMI is usually required, as there aren't many TVs that have a DisplayPort input (BFGDs like the HP Omen X being one of the few — very expensive! — exceptions).
Ultimately, while there are specs advantages to DisplayPort, and some features on HDMI that can make it a better choice for consumer electronics use, the two standards end up overlapping in many areas. The VESA standards group in charge of DisplayPort has its eyes on PC adoption growth, whereas HDMI is defined by a consumer electronics consortium and thinks about TVs first. But DisplayPort and HDMI end up with similar capabilities. You can do 4K at 60 Hz on both standards without DSC, so it's only 8K or 4K at refresh rates above 60 Hz where you actually run into limitations on recent GPUs.