HDR Monitor Buyer’s Guide: Should You Dive In?

We’ve already explained in depth what HDR or “High Dynamic Range” is, but here’s what you need to know when you’re shopping for an HDR-equipped display.

HDR Hardware And Standards

The first thing you’ll need, no matter what, is a display that can support HDR content. And because we simply can’t review every HDR display coming out, knowing the basics of what to look for can at least point you in the right direction.

Standards support is the first thing to note. HDR standards are centered primarily on how brightness metadata accompanies the video signal. This data tells the screen what sort of brightness the signal is supposed to represent and how to display the content as its creators intended. Unfortunately, the standards are ever-changing and not necessarily cross-compatible, and that probably won’t change in the near future.

The two primary standards are HDR10 and Dolby Vision, with the former being the more common. These two standards tell your screen what to display, so without them you’re not getting the correct image (if you’re getting anything at all).

The differences between HDR10 and Dolby Vision are a tangle of marketing and technical specifications. Then there’s HDR10+, an improved version of HDR10. It replaces HDR10’s single, static brightness range for an entire file with metadata that can change over time: one scene could have a range of 1-1,000 nits while the next scene in a movie could have a range of 0.1-500 nits instead (which is similar to how Dolby Vision works). Hopefully devices will all be updated to this standard, but that’s not guaranteed.
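If the static-versus-dynamic distinction is easier to grasp in code, here’s a minimal sketch in Python. The structures and field names are purely illustrative--they are not the actual HDR10 or HDR10+ bitstream syntax--but they show the difference in what each standard can describe.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BrightnessRange:
    min_nits: float
    max_nits: float

# HDR10: one static range describes the entire file.
@dataclass
class StaticMetadata:
    mastering_range: BrightnessRange

# HDR10+ / Dolby Vision: the range can change scene by scene (or frame by frame).
@dataclass
class DynamicMetadata:
    per_scene_ranges: List[BrightnessRange]

movie_hdr10 = StaticMetadata(BrightnessRange(0.0005, 1000))
movie_hdr10_plus = DynamicMetadata([
    BrightnessRange(1.0, 1000),   # bright outdoor scene
    BrightnessRange(0.1, 500),    # dim interior scene
])
```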

So which one should you choose? Fortunately, the most relevant thing to know is that HDR10 seems to be the dominant encoding for HDR content because it’s free and open. Therefore, any display or device (such as a Blu-ray player) that supports HDR10 should give you what you want, at least at the most basic level.

It doesn’t seem like HDR standards will actually be settled anytime soon, though. Not everyone in the industry is on board with HDR10+, and it isn’t necessarily future proof. Then there’s Hybrid Log-Gamma (HLG), yet another standard, this one used for broadcast TV. Who knows what other standards may crop up in the future. In short, being able to trust that HDR content will simply show up correctly on your HDR display, without fretting over a pile of technical details, isn’t in the cards anytime soon.

HDR Displays

Aside from HDR10 support, there are plenty of other things to look for in a display. A detailed primer on judging how well each display actually delivers on all of these needs is beyond the scope of this article, but we’ll review the basics to at least steer you in the right direction. We’ll cover some of the more important aspects, such as brightness and contrast ratio, in later sections; the rest of the items are below.

Color gamut is a big concern. This is the half of HDR that expands the range of colors you can see compared to today’s “SDR.” For this purpose, the standard today is the DCI-P3 gamut. You can usually find a “% of gamut covered” figure in a display’s technical specs. The closer to 100% of the gamut the better--and really, you’ll want 100% if at all possible. In the future, the larger and more colorful Rec. 2020 gamut is supposed to be supported, but right now only expensive laser projectors can cover it.
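For the curious, coverage figures like these are essentially area comparisons between the triangle formed by a display’s red, green, and blue primaries and the triangle of the target gamut on a CIE chromaticity diagram. Here’s a rough sketch in Python using the shapely geometry library; the “my_display” primaries are made up for illustration, and the math is done in the CIE 1931 xy diagram for simplicity (real specs are sometimes quoted in the 1976 u'v' diagram, so published numbers can differ a bit).

```python
from shapely.geometry import Polygon

# CIE 1931 xy chromaticities (x, y) of the red, green, and blue primaries.
P3     = Polygon([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)])  # target gamut
REC709 = Polygon([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)])  # "SDR" gamut

# Hypothetical measured primaries for a wide-gamut (but not quite P3) panel.
my_display = Polygon([(0.665, 0.330), (0.280, 0.650), (0.152, 0.062)])

def coverage(display: Polygon, target: Polygon) -> float:
    """Fraction of the target gamut's area the display can actually reproduce."""
    return display.intersection(target).area / target.area

print(f"Rec. 709 panel covers {coverage(REC709, P3):.1%} of P3")  # ~74% in xy
print(f"my_display covers     {coverage(my_display, P3):.1%} of P3")
```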

Bit depth is another basic spec you’ll want to pay attention to. It determines how many bits of information the display uses to represent each color channel of your image. For actual HDR content, you’ll want a 10-bit (or better) display, because anything less while trying to display a higher range of brightness and more colors can result in visible banding (12-bit displays are theoretically better still but are rarely available). You’ll also see displays touting “8-bit + FRC.” FRC stands for frame rate control, a technique that dithers between adjacent 8-bit values over time. In practice, this can look about as good as a true 10-bit panel, so make sure your display supports one or the other.
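Here’s a quick sketch of both ideas in Python, with made-up values just for illustration: quantizing a smooth gradient to 8 bits leaves far fewer steps (hence banding) than 10 bits, and FRC approximates a missing in-between value by alternating the two nearest 8-bit levels across frames.

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 4096)  # a smooth 0-1 brightness gradient

def quantize(signal, bits):
    """Round a 0-1 signal to the nearest level representable at the given bit depth."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

print("distinct 8-bit steps: ", len(np.unique(quantize(ramp, 8))))   # 256  -> visible bands
print("distinct 10-bit steps:", len(np.unique(quantize(ramp, 10))))  # 1024 -> much smoother

def frc_frames(value, frames=4, bits=8):
    """'8-bit + FRC': alternate the two nearest 8-bit levels so the time-average
    approximates a value the panel can't show directly."""
    levels = 2 ** bits - 1
    low, high = np.floor(value * levels) / levels, np.ceil(value * levels) / levels
    frac = (value - low) / (high - low) if high > low else 0.0
    n_high = int(round(frac * frames))
    return [high] * n_high + [low] * (frames - n_high)

target = 129.5 / 255  # a value halfway between two 8-bit levels
frames = frc_frames(target)
print("FRC frame values:", frames, "-> time-average:", sum(frames) / len(frames))
```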

HDR OS And Device Support

Another question to ask is whether your OS supports HDR, whether that OS is on your PC, your phone, or your Roku. The good news here is that Android, iOS, and Windows all have HDR support. Increasingly, so do purpose-built devices like the newest Apple TV, certain Roku models, and certain Fire TV models. The bad news is that the OS support isn’t exactly fully fleshed out at the moment.

Windows 10 offers HDR support, but the results are incredibly finicky and odd. Getting HDR to work on Windows 10 at all can be a headache, as you’ll have to make sure your display is connected properly and that every relevant option is set correctly.

Both iOS and Android are already set up to use your phone’s built-in display, because a phone is a closed system, unlike a PC, where the display is a peripheral you attach. Further, iOS and Android simply let each individual app decide whether it’s HDR compatible and figure out which settings it should be using at any given moment. For example, when you open the YouTube app on your smartphone, you should just see an “HDR” option appear under the quality settings on compatible videos.

Unfortunately, Android doesn’t perform quite as well as iOS in that regard, at least at present. Not every HDR-compatible phone gets every HDR-compatible app. The LG G6 and Razer Phone, currently the only two handsets to get Netflix in HDR, can’t use the YouTube app in HDR. Nor does Android come with a standard video player that can handle HDR videos, meaning you’ll have to hope your phone’s manufacturer installed one of its own on your device.

It’s a bizarre set of circumstances for owners of HDR phones, because compatibility should be easy. In fact, it’s so easy that you can grab a hacked YouTube app for HDR-compatible phones, and it works pretty much perfectly. Hopefully, this situation will be sorted out soon enough.

HDR Content

So now that you’ve got your HDR screen and some device that can put HDR content on that screen, you’ll actually need content to view on your fancy new platform. And although the headaches end here, the actual selection of HDR content may leave you wanting.

Streaming and downloadable HDR video are perhaps the easiest to get your hands on. Netflix, iOS, Google Play Movies, YouTube, and more all offer HDR content now, and they all make it relatively easy to find. Netflix puts a little Dolby symbol on the actual series/movie page. iOS similarly puts a small HDR symbol next to purchasable media to tell you it’s supported. As for YouTube, the drop-down video quality menu shows a set of HDR resolution options alongside the normal quality options.

Although it’s easy to identify whether content on those platforms is HDR, there just isn’t all that much of it available yet. On YouTube, for example, you’ll mostly find specially made videos from a handful of creators at the moment. On iOS, you can purchase the same HDR-capable movies and series you might find on Blu-ray, but those are still relatively few. All of Netflix’s own series are available in HDR, but getting it to work can be a hassle.

Supposedly, this author’s LG G6 works with Netflix HDR, but actually getting it to work is another thing entirely. Not only that, but other movies and series that are available in HDR elsewhere simply aren’t on Netflix. Planet Earth II, which is spectacular in its use of the technology, is available on Netflix--but not in HDR.

There are also Ultra HD Blu-rays, which can support HDR as well as “Ultra HD” 4K resolution. The moniker doesn’t guarantee HDR content on the disc, nor even 4K--it just means the format can support both. That means you’ll have to double-check for yourself whether what you’re buying supports both, or either. Further, you’ll need a Blu-ray player actually capable of playing the HDR format, as older Blu-ray players can’t.

Then there are video games, which are a whole different story. The good news is that games have for a while now mapped their lighting to a higher range than SDR can display in order to simulate eye adaptation and other effects. This, in turn, means some games can remap their brightness outputs to HDR relatively easily. This has even been done retroactively with titles like Hitman (2016) and The Witcher 3.
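To make that concrete, here’s a toy sketch in Python of the general idea--not how Hitman, The Witcher 3, or any particular engine actually does it. The game computes scene lighting in a wide linear range; for an SDR output it squashes that range down with a tone curve, while for an HDR output it can pass much more of the range through before clipping at the panel’s peak.

```python
def tonemap_sdr(scene_nits, paper_white=100.0):
    """Reinhard-style curve: squash the engine's wide linear lighting into SDR's ~100-nit range."""
    x = scene_nits / paper_white
    return paper_white * (x / (1.0 + x))  # asymptotically approaches 100 nits

def tonemap_hdr(scene_nits, peak_nits=1000.0):
    """For an HDR display, keep the range and only clip at the panel's peak (crude on purpose)."""
    return min(scene_nits, peak_nits)

for scene in (50, 200, 800, 4000):  # linear scene luminance the engine computed
    print(f"{scene:5} nits in the scene -> "
          f"{tonemap_sdr(scene):5.1f} nits on SDR, {tonemap_hdr(scene):6.1f} nits on HDR")
```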

The bad news for video games comes in the form of supporting wider color gamuts. To save memory, games already compress their textures and squeeze the SDR (Rec. 709) gamut pretty hard. Supporting more colors, such as the P3 gamut, means more bandwidth and memory will be taken up--and that’s after developers tackle the challenge of making P3-gamut content in the first place.

Unlike movies, games have always been authored for the older, CRT-based color range. So to put more colors into Link’s sword, for example, developers need new equipment like monitors that cover the P3 gamut and new tools that can output those colors, and they face a plethora of other headaches besides. In other words, don’t expect full support for HDR content in PC games for a while.

On Brightness And Contrast

High brightness is critically important to HDR displays. Contrast ratio matters too, but it’s far from the only thing that should concern you when you’re buying a display.

As we discussed in a previous article, “What is HDR?”, humans perceive brightness on a log2 scale. In other words, every time “real” brightness doubles, we perceive it as going up one “stop,” or one point on a linear scale. For example, object A could be emitting 32 times as much light as object B, but we’d perceive it as only five stops--five steps on that linear scale--brighter.
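That relationship is just a base-2 logarithm, so a couple of lines of Python make the conversion concrete:

```python
import math

def stops(brightness_ratio):
    """Perceived difference in 'stops': each doubling of real brightness adds one stop."""
    return math.log2(brightness_ratio)

print(stops(32))     # 32x the light     -> 5.0 stops brighter
print(stops(20000))  # 20,000:1 contrast -> ~14.3 stops (HDR's target range)
print(stops(1000))   # 1,000:1 contrast  -> ~10 stops (a typical LCD)
```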

More practically, this has huge implications for how much contrast we can actually perceive on a display. In theory, a good OLED display can have infinite contrast, and can thus hit HDR’s desired contrast ratio of 20,000:1 (a bit over 14 stops) with ease. That’s roughly double SDR’s range in stops (about 7, though the exact figure is complicated). In practice, though, the amount of light reflecting off your screen severely limits how much of that range you can actually see.

For example, if you’re sitting indoors in the daytime, a large window near you might produce around 1,000 nits of ambient light. A screen capable of a high-end 800 nits, with an extremely low reflectance of just 1% of the surrounding light, will still not have a black level of 0. Instead, your black level is that 1% of light reflected back at you: with 1,000 nits coming in, you’re getting 10 nits bouncing back, not 0. That leaves about 6.3 stops of actual contrast--in other words, only SDR range. And considering that even the iPhone X’s very highly rated screen is rated at around 5% reflectance, even this scenario is beyond most current technology.

Even in a much darker room with 1% reflectance--say, 200 nits of ambient light with the blinds closed--you’d still be getting only about 8.6 stops of contrast, well short of the 14-plus-stop target. If your display could hit the full target of 10,000 nits, you’d get about 12 stops of contrast in this scenario, which is fairly close to the full range HDR is meant to hit. And for reference, the engineering recommendation for an HDR viewing environment is just 5 nits--essentially a dark editing room with the lights off. Overall, then, the brighter your screen the better, unless you plan to view it exclusively in a blacked-out room.
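The arithmetic behind those figures is simple enough to check yourself: the effective black level is the ambient light multiplied by the screen’s reflectance, and the usable contrast in stops is log2 of peak brightness over that black level. A short Python sketch reproducing the numbers above:

```python
import math

def usable_stops(peak_nits, ambient_nits, reflectance):
    """Stops of contrast you can actually see once reflected room light lifts the black level."""
    black_level = ambient_nits * reflectance  # reflected light sets the real black floor
    return math.log2(peak_nits / black_level)

print(usable_stops(800,   1000, 0.01))  # bright daytime room        -> ~6.3 stops (SDR-ish)
print(usable_stops(800,    200, 0.01))  # blinds closed              -> ~8.6 stops
print(usable_stops(10000,  200, 0.01))  # the 10,000-nit HDR target  -> ~12.3 stops
print(usable_stops(800,      5, 0.01))  # the recommended 5-nit room -> ~14 stops
```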

This also has implications for just how much contrast you might need. A 1,000:1 contrast ratio might not sound impressive next to a 20,000:1 contrast ratio, but practically speaking, this is 10 stops of contrast versus just over 14, which is not nearly as much of a gap as you might intuit. And as we stated above, 10 stops is probably beyond what you can achieve in most scenarios today.

This leads us to question the hype around per-pixel and local dimming--the idea that you can achieve more contrast by turning pixels or sections of the screen on and off. OLEDs can turn each pixel off completely, so their contrast ratio is theoretically infinite. Relatedly, some LCD screens are equipped with “zone” backlights that can switch on and off to increase the contrast between different areas of the screen. But without reaching far beyond today’s peak brightness and making screens much less reflective than they are at present, neither of these technologies is all that useful unless, again, you’re sitting in a blacked-out room.

So although the contrast advantage of local dimming might not do a whole lot in most scenarios, it is a great help for overall brightness. Screen brightness is currently limited by power output, and by lowering the power delivered to some parts of the screen, other parts can be driven brighter. This is why only displays that can shift power around this way--OLEDs such as those in Samsung’s Galaxy phones, or LCDs with many backlight zones such as Dell’s UP2718Q--can hit 1,000 nits of brightness, even if only on a small portion of the screen.

A Big Oversight

There’s one last thing to cover before we sum up: in HDR10 and Dolby Vision, brightness values are fixed points decided at creation time. If the creator sets a pixel’s brightness to 500 nits, it gets output at 500 nits regardless of viewing conditions. In other words, there’s currently no automatic screen-brightness adjustment for HDR. That can lead to poor visibility if the creator doesn’t choose their values well--and they’re largely guessing at those values as it is.
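This absoluteness comes from the PQ transfer function (SMPTE ST 2084) that HDR10 and Dolby Vision are built on: every code value maps to a specific luminance in nits. A minimal Python sketch of the PQ encode/decode math, using the constants from the standard:

```python
# PQ (SMPTE ST 2084) constants, as defined in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance in nits -> PQ signal value in [0, 1]."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_decode(signal):
    """PQ signal value in [0, 1] -> absolute luminance in nits."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

code = pq_encode(500)                             # the creator authors a pixel at 500 nits...
print(round(code, 3), round(pq_decode(code), 1))  # ...and it decodes back to 500 nits,
                                                  # no matter how bright the room is
```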

Automatically adjusting the brightness of the screen has been one of the most useful display features developed over the past decade. It means your screen is visible during the day, doesn’t blind you at night, and saves battery whenever possible. But right now, regardless of device, the screen simply switches to maximum brightness for HDR content--which is understandable--and then displays everything at the pre-designated brightness levels, whether you can actually see them or not.

This seems a ridiculous oversight by all device and OS makers. Surely, in an HDR video or image, the person who made the content intended for the entire thing to be visible. But today, darker parts of an image can be washed out by bright surroundings, and unlike with most SDR devices, there’s no turning up the screen brightness to compensate, because the output is purposely controlled by the creator, not the user. And the creator can’t sit there and make sure the darker parts of their image are viewed ideally; the output they control is just as fixed for them as it is for the end user.

One solution, which the Hybrid Log-Gamma (HLG) standard already accounts for, would be to allow HDR content to be tone mapped--that is, adjusted to the specific device and current lighting conditions so it can be shown to the user as faithfully as possible. Hopefully, OS providers are taking notes for the future, which leads us to...
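As a rough illustration of why HLG can adapt where PQ can’t: HLG is scene-referred, and the display applies a “system gamma” that depends on its own peak brightness. The gamma formula below follows BT.2100; the rest of this Python sketch is a simplification of the full HLG math, not the standard itself.

```python
import math

def hlg_system_gamma(peak_nits):
    """BT.2100's HLG system gamma: brighter displays apply a slightly stronger gamma."""
    return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)

def display_luminance(scene_level, peak_nits):
    """Toy OOTF: relative scene light (0-1) -> output nits, adapted to the panel's peak.
    (Simplified: real HLG applies the gamma via overall luminance, per-channel math omitted.)"""
    return peak_nits * scene_level ** hlg_system_gamma(peak_nits)

for peak in (400, 1000, 4000):
    print(f"{peak} nit panel shows a mid-tone scene value at "
          f"{display_luminance(0.26, peak):.0f} nits")
```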

Buying An HDR Display Today

The question after all of the above is whether you should even buy into HDR today. The answer to that depends quite a bit on what you want to get out of it. Regardless, there are several rules that apply to anyone looking to buy an HDR display.

The first rule is to make sure your screen actually covers the DCI-P3 gamut and has an 8-bit + FRC or true 10-bit panel to reproduce those colors accurately. As the most visible benefit of most HDR screens today, this is the feature to consider foremost.

For example, although Dell’s new XPS 13 advertises an upgraded “HDR” screen, you’re getting only the SDR color gamut, and combined with the screen’s low brightness, you’re not getting much in the way of “HDR” at all. Lenovo’s X1 Carbon does somewhat better, but even it doesn’t come close to covering 100% of the P3 gamut. Some “HDR” displays may not work at all, such as Samsung’s 12-inch Galaxy Book. Meanwhile, Dell’s “HDR” U2518D uses only an 8-bit (non-FRC) output. So buyer beware when it comes to laptops--or any screen, for that matter.

When it comes to brightness--well, the brighter the better. But right now, it’s rare to find a screen that can put out 1,000 nits, let alone the 10,000 nits that was supposed to be the target for HDR. Sony showed an 8K, 10,000-nit demo TV at CES, but who knows when anything like it will go into production. In other words, buying an HDR display that’s bright enough to matter in most viewing scenarios simply isn’t going to happen today.

For contrast ratio, it depends entirely on where you plan to view your content. If that’s outside, in a bright room, or even in a moderately dim room, then ultra-high contrast ratios aren’t going to do a whole lot for you. If you can regularly view your screen in, say, a very dark room at night, then you’ll get something out of an ultra-high-contrast screen. But even then, remember that a 2,000:1 contrast ratio isn’t all that different from a 20,000:1 contrast ratio in stops.

Gamers will have a harder time of it. Not only are HDR monitors rare, but HDR panels add more input lag than normal, which is definitely not good for fast-paced gaming. Further, although AMD and Nvidia have their respective FreeSync 2 and G-Sync HDR technologies that are supposed to sidestep the input-lag issue, actually buying a monitor that supports either is another matter.

AMD users have the decently reviewed Samsung CHG series, but nothing else has been announced; hopefully, with Microsoft bringing FreeSync 2 to the Xbox, more manufacturers will support it. Nvidia users have been promised more, but so far nothing has been delivered. The Asus ROG HDR G-Sync monitor, for example, has been in the works for a while and has just been given an incredibly high price, and the still-vague G-Sync TVs don’t appear to have great specs upon closer scrutiny. That Nvidia refuses to support AMD’s free and open FreeSync standard in favor of its proprietary and costly G-Sync only hurts consumers on both sides at this point, especially considering today’s almost nonexistent selection of HDR-capable adaptive-sync displays.

So, should you get an HDR display today? Well, if you really want the latest tech, and can appreciate a much wider range of colors while watching Blue Planet II, then the answer is probably yes. But if you’re looking for something different, such as a future-proof monitor you won’t want to replace for a while, then the answer is almost certainly no.