Nvidia’s Turing Architecture Explored: Inside the GeForce RTX 2080

Variable Rate Shading: Get Smarter About Shading, Too

In addition to optimizing the way Turing processes geometry, Nvidia adds a mechanism for choosing the rate at which 16x16-pixel blocks are shaded in different parts of a scene to improve performance. Naturally, the hardware can still shade every single pixel in a 1x1 pattern. But the architecture also facilitates 2x1 and 1x2 options, along with 2x2 and 4x4 blocks.
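
To make those rate options concrete, here is a minimal, self-contained sketch (our own illustration in C++, not anything from Nvidia's SDKs; the ShadingRate names are made up) of how each coarse rate translates into fragment-shader invocations for a single 16x16 tile:

// Our own illustration of Turing's coarse shading rates (not Nvidia's API):
// each rate is expressed as the pixel footprint one fragment-shader
// invocation covers, and we count invocations for a single 16x16 tile.
#include <cstdint>
#include <cstdio>
#include <initializer_list>

enum class ShadingRate { Rate1x1, Rate2x1, Rate1x2, Rate2x2, Rate4x4 };

// Pixels covered by a single shader invocation at a given rate.
uint32_t PixelsPerInvocation(ShadingRate r) {
    switch (r) {
        case ShadingRate::Rate1x1: return 1;   // full rate: shade every pixel
        case ShadingRate::Rate2x1:
        case ShadingRate::Rate1x2: return 2;   // half rate along one axis
        case ShadingRate::Rate2x2: return 4;   // quarter rate
        case ShadingRate::Rate4x4: return 16;  // one invocation per 16 pixels
    }
    return 1;
}

int main() {
    const uint32_t kTilePixels = 16 * 16;  // one variable-rate tile
    for (ShadingRate r : {ShadingRate::Rate1x1, ShadingRate::Rate2x1,
                          ShadingRate::Rate2x2, ShadingRate::Rate4x4}) {
        std::printf("shader invocations per 16x16 tile: %u\n",
                    static_cast<unsigned>(kTilePixels / PixelsPerInvocation(r)));
    }
    return 0;
}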

Full-rate shading

Content-adaptive shading, color-coded

Content-adaptive shading, final output

Nvidia offers several use cases where variable rate shading is practical (you don’t want to apply it gratuitously and degrade image quality). The first is content-adaptive shading, where less detailed parts of a scene don’t change as much and can be shaded at a lower rate. There’s already a build of Wolfenstein II with variable rate shading active. In it, you can turn on the shading rate visualization and watch how complex objects aren’t affected at all by the technology, while lower-frequency areas get shaded at a lower resolution, with intermediate rates covering everything in between. We have to imagine that game developers looking to exploit variable rate shading in a content-adaptive manner will prioritize quality over performance. Still, we’d like to see this exposed as a toggleable option so third parties can draw comparisons with the feature on and off.
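
As a rough idea of how a content-adaptive heuristic could pick rates, the sketch below ranks a tile by its luminance variance and shades flatter tiles more coarsely. This is our own simplification, not the shipped Wolfenstein II implementation; RateFromDetail and its thresholds are hypothetical:

// Sketch of the idea behind content-adaptive shading (our own heuristic, not
// the shipped implementation): rank each 16x16 tile by how much its luminance
// varies, and shade low-detail tiles at a coarser rate.
#include <cmath>
#include <cstdio>
#include <vector>

enum class ShadingRate { Rate1x1, Rate2x2, Rate4x4 };

// Pick a rate from a tile's luminance standard deviation. The thresholds are
// placeholders; a real implementation would derive them from a perceptual model.
ShadingRate RateFromDetail(const std::vector<float>& tileLuma) {
    float mean = 0.f, var = 0.f;
    for (float l : tileLuma) mean += l;
    mean /= tileLuma.size();
    for (float l : tileLuma) var += (l - mean) * (l - mean);
    const float stddev = std::sqrt(var / tileLuma.size());

    if (stddev < 0.01f) return ShadingRate::Rate4x4;  // nearly flat: coarsest rate
    if (stddev < 0.05f) return ShadingRate::Rate2x2;  // gentle gradients
    return ShadingRate::Rate1x1;                      // high-frequency detail: full rate
}

int main() {
    std::vector<float> flatSky(256, 0.62f);  // uniform 16x16 tile
    std::vector<float> foliage(256);         // high-contrast 16x16 tile
    for (int i = 0; i < 256; ++i)
        foliage[i] = 0.5f + 0.3f * std::sin(0.9f * i);
    std::printf("sky tile -> rate %d, foliage tile -> rate %d\n",
                static_cast<int>(RateFromDetail(flatSky)),
                static_cast<int>(RateFromDetail(foliage)));
    return 0;
}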

Motion-adaptive shading is another interesting application of Nvidia’s variable rate shading technology, where objects flying by are perceived at a lower resolution than whatever subject we’re focused on. Based on the motion vector of each pixel, game developers can determine how aggressively to reduce the shading rate and apply the same patterns seen in the content-adaptive example. Doing this correctly does require an accurate frequency response model to ensure the right rates are used when you spin around, sprint forward, or slow back down.
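
A matching sketch for the motion-adaptive case would key the rate off each tile's motion-vector length. Again, RateFromMotion and its thresholds are illustrative placeholders rather than Nvidia's tuned frequency-response model:

// Motion-adaptive shading reduced to its core idea (our simplification): the
// longer a tile's motion vector, the more of its detail is lost to motion blur
// and display persistence, so the coarser it can be shaded.
#include <cmath>
#include <cstdio>

enum class ShadingRate { Rate1x1, Rate2x1, Rate2x2, Rate4x4 };

// motionPixelsX/Y: screen-space motion of this tile since the previous frame.
ShadingRate RateFromMotion(float motionPixelsX, float motionPixelsY) {
    const float speed = std::sqrt(motionPixelsX * motionPixelsX +
                                  motionPixelsY * motionPixelsY);
    if (speed < 2.f)  return ShadingRate::Rate1x1;  // near-static: full rate
    if (speed < 8.f)  return ShadingRate::Rate2x1;  // moderate motion (1x2 would suit vertical motion)
    if (speed < 20.f) return ShadingRate::Rate2x2;
    return ShadingRate::Rate4x4;                    // fast pan or sprint
}

int main() {
    std::printf("standing still: rate %d\n",
                static_cast<int>(RateFromMotion(0.3f, 0.1f)));
    std::printf("quick turn:     rate %d\n",
                static_cast<int>(RateFromMotion(25.f, 4.f)));
    return 0;
}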

Again, Nvidia presented a demo of Wolfenstein II with content- and motion-adaptive shading enabled. The performance uplift attributed to variable rate shading in that title was on the order of 15%, limited in part because Wolfenstein II already runs at such high frame rates. But on a slower card in a more demanding game, it may become possible to get 20% or more additional performance at the 60ish FPS level. Perhaps more important, there was no perceptible image quality loss.

Although Nvidia hasn’t said much about how these capabilities are going to be utilized by developers, we do know that its Wolfenstein II demo was made possible through Vulkan extensions. The company is working with Microsoft to enable DirectX support for Variable Rate Shading. Until then, it'll expose Adaptive Shading functionality through the NVAPI software development kit, which allows direct access to GPU features beyond the scope of DirectX and OpenGL.
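
For reference, the Vulkan extension Nvidia published alongside Turing for this feature is VK_NV_shading_rate_image (the article doesn't name it, so treat the snippet below as our reading of that extension rather than the demo's actual code). Assuming the extension is enabled at device creation and the application has already filled a "rate image" with one texel per 16x16 screen tile, binding it looks roughly like this:

// Hedged sketch of VK_NV_shading_rate_image usage; BindShadingRates is our own
// hypothetical helper, not part of any Nvidia sample.
#include <vulkan/vulkan.h>

void BindShadingRates(VkDevice device, VkCommandBuffer cmd, VkImageView rateImageView) {
    // Extension entry points are fetched at runtime rather than linked directly.
    auto bindRateImage = reinterpret_cast<PFN_vkCmdBindShadingRateImageNV>(
        vkGetDeviceProcAddr(device, "vkCmdBindShadingRateImageNV"));
    auto setPalette = reinterpret_cast<PFN_vkCmdSetViewportShadingRatePaletteNV>(
        vkGetDeviceProcAddr(device, "vkCmdSetViewportShadingRatePaletteNV"));

    // Each texel of the bound image selects a palette index for one screen tile.
    bindRateImage(cmd, rateImageView, VK_IMAGE_LAYOUT_SHADING_RATE_OPTIMAL_NV);

    // Palette mapping those indices to actual rates: index 0 is coarsest here.
    const VkShadingRatePaletteEntryNV entries[] = {
        VK_SHADING_RATE_PALETTE_ENTRY_1_INVOCATION_PER_4X4_PIXELS_NV,
        VK_SHADING_RATE_PALETTE_ENTRY_1_INVOCATION_PER_2X2_PIXELS_NV,
        VK_SHADING_RATE_PALETTE_ENTRY_1_INVOCATION_PER_PIXEL_NV,
    };
    VkShadingRatePaletteNV palette = {};
    palette.shadingRatePaletteEntryCount = 3;
    palette.pShadingRatePaletteEntries = entries;
    setPalette(cmd, /*firstViewport=*/0, /*viewportCount=*/1, &palette);
}

In a real renderer, that rate image would be regenerated each frame from heuristics like the content- and motion-adaptive ones described above.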


Comments from the forums
  • siege19
    "And although veterans in the hardware field have their own opinions of what real-time ray tracing means to an immersive gaming experience, I’ve been around long enough to know that you cannot recommend hardware based only on promises of what’s to come."

    So wait, do I preorder or not? (kidding)
  • jimmysmitty
    Well done article Chris. This is why I love you. Details and logical thinking based on the facts we have.

    Next up benchmarks. Can't wait to see if the improvements nVidia made come to fruition in performance worthy of the price.
  • Lutfij
    Holding out with bated breath for performance metrics.
    Pricing seems to be off, but the follow-up review should guide users as to its worth!
  • Krazie_Ivan
    I didn't expect the 2070 to be on TU106. As noted in the article, **106 has been a mid-range ($240-ish MSRP) chip for a few generations... asking $500-600 for a mid-range GPU is insanity, especially since there's no way it'll have playable fps with RT "on" if the 2080 Ti struggles to maintain 60. DLSS is promisingly cool, but that's still not worth the MASSIVE cost increases.
  • jimmysmitty
    904774 said:
    I didn't expect the 2070 to be on TU106. As noted in the article, **106 has been a mid-range ($240-ish MSRP) chip for a few generations... asking $500-600 for a mid-range GPU is insanity, especially since there's no way it'll have playable fps with RT "on" if the 2080 Ti struggles to maintain 60. DLSS is promisingly cool, but that's still not worth the MASSIVE cost increases.


    It is possible that they are changing their lineup scheme. 106 might have become the lowest high-end card, and they might have something lower to replace it. This happens all the time.
  • Lucky_SLS
    Turing does seem to have the ability to pump up the fps if used right with all its features. I just hope that Nvidia really made a card that can power its upcoming 4K 200Hz HDR G-Sync monitors. Wow, that's a mouthful!
  • anthonyinsd
    Ooh man, the Jedi mind trick Nvidia played on hyperbolic gamers to get rid of their overstock is gonna be EPIC!!! And just based on facts: 12nm, GDDR6, awesome new voltage regulation, and game-only processes - that's a win in my book. I mean, if all you care about is your rast score, then you should be on the hunt for a Titan V; if it doesn't rast, it's trash lol. Been 10 years since Econ 101, but if you want to get rid of overstock you don't tell much about the new product till it's out; then the people who thought they were smart getting the older product now want to buy the new one too....
  • none12345
    I see a lot of features that are seemingly designed to save compute resources and output lower image quality. With the promise that those savings will then be applied to increase image quality on the whole.

    I'm quite dubious about this. My worry is that some of the areas of computer graphics that need the most love are going to get even worse. We can only hope that overall image quality goes up at the same frame rate, rather than frame rate going up and parts of the image getting worse.

    I do not long to return to the days when different graphics cards output different image quality at the same up-front graphics settings. This was very annoying in the past. You had some cards that looked faster if you just looked at their fps numbers. But then you looked at the image quality and noticed that one was noticeably worse.

    I worry that in the end we might end up in the age of blur. Where we have localized areas of shiny highly detailed objects/effects layered on top of an increasingly blurry background.
  • CaptainTom
    I have to admit that since I have a high-refresh (non-Adaptive Sync) monitor, I am eyeing the 2080 Ti. DLSS would be nice if it was free in 1080p (and worked well), and I still don't need to worry about Gstink. But then again I have a sneaking suspicion that AMD is going to respond with 7nm Cards sooner than everyone expects, so we'll see.

    P.S. Guys the 650 Ti was a 106 card lol. Now a xx70 is a 106 card. Can't believe the tech press is actually ignoring the fact that Nvidia is relabeling their low-end offering as a xx70, and selling it for $600 (Halo product pricing). I swear Nvidia could get away with murder...
  • mlee 2500
    4nm is no longer considered a "Slight Density Improvement".

    Hasn't been for over a decade. It's only lumped in with 16nm from a marketing standpoint because it's no longer the flagship lithography (7nm).
  • TMTOWTSAC
    In a perfect world, the non-RT models would be based off the TU architecture without any of the RT silicon, and priced accordingly. They're claiming RT is the must have feature and subsequently worth the price premium. Given those claims it's going to be very interesting to see what pricing scheme they go with for the non-RT models.
  • mlee 2500
    Great article, very informative, thank you for taking the time to write it.
  • dimar
    No need to waste your hard-earned money. AMD Navi is around the corner. And if Navi isn't that good, RTX prices will be lower by then. With AMD you get FreeSync, which most monitors have these days.
  • Reynod
    Fantastic read as always Chris.

    Objective, with warts ... an easy read ... informative ... with detail.

    I hope you are editing the article that gets released here with the benchies once the NDA is lifted.

    I will spend money based on that content ...
  • cangelini
    Thanks guys.

    Yes, I will be spending a long caffeine-fueled weekend with graphics cards, Excel, and Word. Let me know if there are any specific requests on comparisons you'd like to see made!
  • truerock
    I've been running my Nvidia GeForce GTX 690 for 6 years. It does 3840 x 2160 at 30fps.
    The lack of HDMI 2.1 is just enough of a negative to keep me from buying a Geforce RTX 2080 Ti.
    I guess it is ironic that I actually don't want HDMI or DisplayPort outputs on my Nvidia cards. I want Nvidia cards that only have USB-C output ports.
    Oh well - maybe next year. My Nvidia GeForce GTX 690 will be 7 years old.
  • truerock
    Chris,

    Thanks for the review. It's the best I've seen on these cards so far.

    I'm interested in 3840 x 2160 at 120fps. That would be with the more popular games. What settings for a specific game allow 3840 x 2160 at 120fps vs. 60fps or 30fps? I'm not interested in G-Sync. Does graphics quality suffer much as settings are pushed down to allow higher frame rates?
  • bit_user
    134065 said:
    Let me know if there are any specific requests on comparisons you'd like to see made!

    Crysis @ 4k? ...you know someone will ask it. And Anandtech tested it on the Titan V, so we can compare.

    https://www.anandtech.com/show/12170/nvidia-titan-v-preview-titanomachy/8
  • cangelini
    Before they did that, I did this: https://www.tomshardware.com/reviews/crysis-10-year-anniversary-benchmarks,5329.html ;)

    Time's going to be tight, but I'll see if I can throw it on the test system.
  • Reynod
    I agree ... if you still have the Original Crysis game ... then answer "But will it play Crysis?".

    The original, badly coded game, please?

    I imagine you will also have received a couple of iterations of drivers since receiving the card, so let us know how much improvement you found with these?

    Finally, when you finish can you pull the HSF off and let us know anything about the TIM you find?


    :)