First HDR10+ PC Game Arrives With 'Automatic HDR' Mode

HDR10+ gaming on Samsung
(Image credit: Samsung)

Samsung and game developer Nexon have announced a partnership to bring the first HDR10+ gaming experience to Windows PC users. Nexon's The First Descendant will be released as a free-to-play (F2P) title, with the open beta available for download from September 19. You will be able to see and hear more about the Samsung and Nexon HDR10+ partnership at Gamescom, which kicks off later this week.

HDR10+ was co-established by Samsung back in 2018, and the HDR10+ Gaming variant of the standard was first announced in 2021. Its key advance over plain HDR10 is a layer of dynamic metadata added to the signal for real-time communication between the PC, screen and software - optimizing the display's tone mapping scene by scene, and frame by frame.
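
To picture what scene-by-scene and frame-by-frame optimization means in practice, here is a minimal, purely illustrative C++ sketch. The struct and function names (FrameMetadata, analyze_frame, choose_tone_map_ceiling) are hypothetical and are not the real HDR10+ interface - the actual format carries its dynamic metadata as SMPTE ST 2094-40 data - but the idea is the same: each frame is summarized so the display can adapt its tone mapping instead of relying on one static setting for the whole session.

```cpp
// Illustrative only: a toy model of per-frame dynamic metadata.
// FrameMetadata, analyze_frame and choose_tone_map_ceiling are hypothetical
// names, not part of the real HDR10+ (SMPTE ST 2094-40) interface.
#include <algorithm>
#include <cstdio>
#include <vector>

struct FrameMetadata {
    float max_nits;  // brightest pixel in this frame (cd/m^2)
    float avg_nits;  // average picture level of this frame
};

// Scan a frame's luminance values and summarize them as metadata.
FrameMetadata analyze_frame(const std::vector<float>& luminance_nits) {
    FrameMetadata md{0.0f, 0.0f};
    for (float v : luminance_nits) {
        md.max_nits = std::max(md.max_nits, v);
        md.avg_nits += v;
    }
    if (!luminance_nits.empty()) md.avg_nits /= luminance_nits.size();
    return md;
}

// The display-side idea: pick a tone-mapping ceiling per frame instead of
// using one static value for the whole game session (as plain HDR10 would).
float choose_tone_map_ceiling(const FrameMetadata& md, float panel_peak_nits) {
    return std::min(md.max_nits, panel_peak_nits);
}

int main() {
    // Two very different frames: a dark cave and a bright sky.
    std::vector<float> cave{0.5f, 2.0f, 10.0f, 40.0f};
    std::vector<float> sky{200.0f, 800.0f, 1500.0f, 3000.0f};
    const float panel_peak = 1000.0f;  // a typical HDR monitor's peak

    for (const auto& frame : {cave, sky}) {
        FrameMetadata md = analyze_frame(frame);
        std::printf("max %.1f nits, avg %.1f nits -> tone-map ceiling %.1f nits\n",
                    md.max_nits, md.avg_nits,
                    choose_tone_map_ceiling(md, panel_peak));
    }
    return 0;
}
```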

Today's announcement heralds that the first HDR10+ game is finally on the way. Hopefully this is the beginning of a wave of titles supporting the standard on PC, as it sounds convenient and has the potential to iron out the undeniably clunky HDR support on Windows PCs.

According to Samsung, HDR10+ "ushers in a new era of gaming," as it provides the following features:

  • Deeper color, contrast and brightness
  • More accurate depiction of details in dark shadows and bright highlights
  • Automatic setup, eliminating the hassle of adjusting numerous manual settings
  • Gaming performance features folded in, such as low latency and variable refresh rate support
  • A claimed consistent and reliable HDR gaming experience across all HDR10+ Gaming displays

The key benefit of HDR10+ Gaming, then, is gaining all the niceties of HDR10 plus real-time dynamic metadata and some gaming performance optimizations, all applied in a frictionless, automatic manner.

To enjoy HDR10+, all you need is compliant hardware and games that support it. On PC, that will likely mean an HDR10+ Gaming monitor from Samsung, such as one of the Odyssey 7 series and above. Your graphics card will also need a driver that enables HDR10+; Nvidia GeForce users have had support for the HDR10+ Gaming standard since last November. Finally, you need software that supports HDR10+, and that starts with The First Descendant - a third-person looter shooter which also boasts 13 playable characters, as well as graphics tech like UE5 Lumen and Nanite, DLSS 2 and 3, and more.
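
The HDR10+ handshake itself happens between the game, the GPU driver and the display, and we're not aware of a public Windows API that exposes it directly, so treat the following as ordinary groundwork rather than an HDR10+ call: a minimal sketch of how a PC game can check, via DXGI, whether the connected display currently advertises an HDR10 signalling mode and what peak brightness it reports. It assumes Windows with the dxgi1_6 headers and is a capability probe only.

```cpp
// Minimal sketch: query whether connected displays advertise HDR via DXGI.
// This is the standard Windows HDR10 capability check, not an HDR10+ API.
// Build on Windows and link against dxgi.lib.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        ComPtr<IDXGIOutput> output;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            ComPtr<IDXGIOutput6> output6;
            if (FAILED(output.As(&output6))) continue;

            DXGI_OUTPUT_DESC1 desc{};
            if (FAILED(output6->GetDesc1(&desc))) continue;

            // DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P709 means the output is
            // currently in an HDR10 (ST 2084 / BT.2020) signalling mode.
            const bool hdr_active =
                desc.ColorSpace == DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P709;

            std::wprintf(L"%ls: HDR %ls, peak %.0f nits, min %.4f nits\n",
                         desc.DeviceName, hdr_active ? L"active" : L"inactive",
                         desc.MaxLuminance, desc.MinLuminance);
        }
    }
    return 0;
}
```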

We are looking forward to the first reports of HDR10+ and The First Descendant at Gamescom, and during the open beta. As with most monitor technologies, you really have to see and experience them first-hand to get a measure of the benefits.

Mark Tyson
News Editor

Mark Tyson is a news editor at Tom's Hardware. He enjoys covering the full breadth of PC tech; from business and semiconductor design to products approaching the edge of reason.

  • Blitz Hacker
    They lost me @ F2P. All those games end up pretty much garbage. I think League is the only F2P game that is decent, provided you can deal with the toxic community.
    Reply
  • txfeinbergs
    Blitz Hacker said:
    They lost me @ F2P. All those games end up pretty much garbage. I think League is the only F2P game that is decent, provided you can deal with the toxic community.
    They lost me at HDR10+. What a useless standard created just because Samsung was unwilling to play ball with Dolby and had to do their own thing. I have yet to buy a Samsung TV as a result of them not supporting Dolby Vision.
    Reply
  • kerberos_20
    txfeinbergs said:
    They lost me at HDR10+. What a useless standard created just because Samsung was unwilling to play ball with Dolby and had to do their own thing. I have yet to buy a Samsung TV as a result of them not supporting Dolby Vision.
    Samsung did it with Amazon.
    But it's an open-source format with no royalties; Dolby Vision has something like a $2,500 annual fee for content creators. xD

    The list of HDR10+ adopters has grown quite a bit too, so it's not only Samsung TVs that support it - Panasonic, Toshiba, TCL and a few others do as well.
    https://hdr10plus.org/adopters/
    Reply
  • bit_user
    For such a relatively old standard, I'm surprised it seems to be so uncommon. I've been shopping for an HDR monitor for at least the past 4 years, and I don't remember seeing the term. Even a few of the latest & greatest HDR gaming monitors I checked don't mention it. Is that because it's so common that it's not a differentiator, or because it's really that rare?

    It gives me bad memories of HDMI Deep Color support, which nothing really seemed to use. Even though my TV and PS3 both had it, there was never any indication of it being used.

    On a related note, I managed to buy one blu-ray that had xvYCC and used my PS3 to play it on my TV. I could believe the color gamut was better, but I'd have probably needed to see it side-by-side with standard BT.709 to appreciate the difference.
    Reply
  • kerberos_20
    bit_user said:
    For such a relatively old standard, I'm surprised it seems to be so uncommon. I've been shopping for an HDR monitor for at least the past 4 years, and I don't remember seeing the term. Even a few of the latest & greatest HDR gaming monitors I checked don't mention it. Is that because it's so common that it's not a differentiator, or because it's really that rare?

    It gives me bad memories of HDMI Deep Color support, which nothing really seemed to use. Even though my TV and PS3 both had it, there was never any indication of it being used.

    On a related note, I managed to buy one blu-ray that had xvYCC and used my PS3 to play it on my TV. I could believe the color gamut was better, but I'd have probably needed to see it side-by-side with standard BT.709 to appreciate the difference.
    Most HDR stuff is HDR10 or HLG - even Windows is just plain HDR10. Nvidia has driver support for both HDR10+ and Dolby Vision; AMD is a little behind.
    But still, how many Dolby Vision monitors are there? Even if Dolby Vision is on paper better than HDR10+, there's not a single panel which fully supports Dolby Vision across its full range - 10K nits? 68B colors? Hmm, on paper it looks cool.

    HDMI Deep Color is just RGB full (full being the PC monitor norm, limited being the TV norm). On TV content you shouldn't notice a difference, since that's RGB limited content, but on PC it should be noticeable.
    Reply
  • scottslayer
    The currently active Crossplay Beta will have run for about a year when the Open Beta comes out.
    All they are doing is taking away the formality of clicking the apply for invite button on Steam.
    Reply
  • emike09
    txfeinbergs said:
    They lost me at HDR10+. What a useless standard created just because Samsung was unwilling to play ball with Dolby and had to do their own thing. I have yet to buy a Samsung TV as a result of them not supporting Dolby Vision.
    HDR10+ and Dolby Vision are pretty much the same thing with tiny differences that don't matter. For example, while DV supports a 12-bit color depth and HDR10/10+ supports 10-bit, it's nearly impossible to tell the difference for moving content. Running 12-bit requires more processing power on both the GPU and the display. Professional color grading experts grading for final releases would want 12-bit during the grading process, but even in the theater, nobody is going to notice whether the final output is 10-bit or 12-bit. Even 8-bit looks good, as long as it's not a dithered 6-bit-to-8-bit conversion - that's sloppy, looks terrible, and cheap new TVs are still selling with it.

    The important thing is that HDR10+ is open source and requires no royalties to use. I'm all for open source. While Dolby Vision is better by the specs, you're just paying Dolby extra for something that doesn't matter in the real world. 8K displays are sharper than 4K displays, but literally every AV professional will agree that you don't need 8K for anything, and when watching content between the two at a normal viewing distance, you cannot tell the difference.

    Ultimately, what separates plain "HDR" from HDR10+ and DV is dynamic metadata - something no game has ever offered as a feature. Being able to dynamically and automatically adjust HDR content based on that very particular moment is a revolutionary addition to the HDR world of gaming, and something I very much welcome.

    An example of standard HDR in PC gaming is RDR2. No dynamic metadata. It's not the greatest implementation of HDR and requires proper tuning to get just a basic decent scene, but it's not the worst either. Some scenes absolutely shine with realism, brilliant bright skies, excellent color and contrast. Other scenes are muddled, grey, too dark, too bright, with terrible contrast. This is where HDR10+ or DV would fix things, but DV requires licensing and a display that supports it, and most devs would say "hard pass".

    One size does not fit all when it comes to HDR, and that's where HDR10+ and DV's dynamic metadata step in.
    Reply
  • PEnns
    "...and that starts with The First Descendant - a third-person looter shooter"

    Third person?? Yawn.

    Carry on.
    Reply
  • emike09
    PEnns said:
    "...and that starts with The First Descendant - a third-person looter shooter"

    Third person?? Yawn.

    Carry on.
    Many of us prefer 3rd person, especially on big displays or when story and character development are the heart of the content. This being F2P, I don't see the story being big, but there's no need to hate on a good 3rd-person game. Unless the game is just bad.
    Reply
  • thestryker
    bit_user said:
    For such a relatively old standard, I'm surprised it seems to be so uncommon. I've been shopping for an HDR monitor for at least the past 4 years, and I don't remember seeing the term. Even a few of the latest & greatest HDR gaming monitors I checked don't mention it. Is that because it's so common that it's not a differentiator, or because it's really that rare?
    I'm not aware of any monitors supporting HDR10+ or Dolby Vision among those I've found interesting. They all seem to be limited to the VESA HDR specs, which only mandate HDR10.
    Reply
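
A quick editorial footnote to the 10-bit versus 12-bit discussion in the thread above: the color counts quoted there fall straight out of the bit depths. Three channels at 10 bits per channel give 2^30 ≈ 1.07 billion colors, while 12 bits per channel give 2^36 ≈ 68.7 billion, which is where the "68B colors" figure comes from. A trivial check, for illustration only:

```cpp
// Quick check of the color counts implied by 8-, 10- and 12-bit per channel.
#include <cmath>
#include <cstdio>

int main() {
    for (int bits : {8, 10, 12}) {
        double per_channel = std::pow(2.0, bits);                // levels per channel
        double total = per_channel * per_channel * per_channel;  // R * G * B combinations
        std::printf("%2d-bit: %.0f levels/channel, %.2e colors\n",
                    bits, per_channel, total);
    }
    return 0;
}
```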