Testing Nvidia's Multi-Res Shading In Shadow Warrior 2

There are a couple of ways to improve graphics performance: either you equip GPUs with more resources, adding brute force, or you get creative and come up with efficiency optimizations. Nvidia's Multi-Res Shading technology is a good example of the latter.

Originally intended for VR, where one scene is calculated from multiple viewports, Multi-Res Shading found another application: improving the gaming performance of graphics cards by reducing the rendering quality of certain portions of the image. Shadow Warrior 2 is the first non-VR title to benefit from this functionality. According to Nvidia, Multi-Res Shading raises performance by around 30 percent without much degradation to visual quality. And that's exactly what we want to test: how does graphics quality hold up, and is there a quantifiable performance gain?

Shadow Warrior 2: MRS In Action

If you own either a GeForce GTX 900-series (Maxwell) or 1000-series (Pascal) card, you can activate Multi-Res Shading in Shadow Warrior 2. Two options are available: Conservative and Aggressive. In the first case, the resolution along the border of the image is reduced by 40%, versus 60% in the second. The size of this zone also differs according to the option you choose: 20% or 22% to the left and right, and 18% or 20% to the top and bottom of the image.
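To put those zone sizes and reduction percentages into perspective, here is a rough back-of-the-envelope estimate, written as a small Python sketch, of how much shading work each mode leaves. Exactly how the 40% and 60% reductions are applied to the border regions is our assumption, so treat the function and its output as illustrative, not as Nvidia's exact math.

    # Rough estimate of the shading work left after MRS, based on the zone
    # sizes and reductions quoted above. Assumes the stated reduction applies
    # to the pixel count of the border regions -- an interpretation, not a spec.

    def shaded_fraction(side_zone, top_zone, reduction):
        """Fraction of full-resolution shading work remaining with MRS on.

        side_zone -- width of each left/right border zone (fraction of frame width)
        top_zone  -- height of each top/bottom border zone (fraction of frame height)
        reduction -- fraction of shading resolution removed in the border zones
        """
        center = (1 - 2 * side_zone) * (1 - 2 * top_zone)  # full-resolution middle
        border = 1 - center                                # everything around it
        return center + border * (1 - reduction)

    print(f"Conservative: {shaded_fraction(0.20, 0.18, 0.40):.0%} of the work")  # ~75%
    print(f"Aggressive:   {shaded_fraction(0.22, 0.20, 0.60):.0%} of the work")  # ~60%

If shading work really drops to roughly 75% (Conservative) or 60% (Aggressive) of the original, a purely shader-limited scene could in theory run up to about 1.3x or 1.65x faster, which is at least in the same ballpark as Nvidia's claimed 30 percent gain.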

Certain effects (like lens flares) are also diminished, or even completely removed. This is one of the faults you could rightfully criticize this technology for. Resolution isn't the only attribute affected by MRS; certain details also seem to be treated differently by the GPU's PolyMorph Engine, which is responsible for handling geometry.

Besides this, there are differences between Maxwell and Pascal, too. Nvidia added a “Simultaneous Multi-Projection” block to Pascal (PolyMorph Engine 4.0), while Maxwell must be content with its “Viewport Multicast” block. In theory, then, Multi-Res Shading should be less efficient on Maxwell than on Pascal.
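To make the idea more concrete, the sketch below (plain Python, not Nvidia's actual API) describes the kind of 3x3 viewport grid Multi-Res Shading works with: one full-rate region in the middle surrounded by reduced-rate border regions. On Pascal, the Simultaneous Multi-Projection block can emit the geometry for all of these viewports in a single pass, while Maxwell has to fall back on Viewport Multicast.

    # Conceptual illustration only: a plain-Python description of a Multi-Res
    # grid, NOT Nvidia's API. Region sizes are expressed as fractions of the
    # frame; scale is the shading-resolution multiplier for each region.

    from dataclasses import dataclass

    @dataclass
    class Region:
        x: float      # left edge (fraction of frame width)
        y: float      # top edge (fraction of frame height)
        w: float      # width (fraction of frame width)
        h: float      # height (fraction of frame height)
        scale: float  # shading-resolution multiplier for this region

    def build_mrs_grid(side_zone, top_zone, border_scale):
        """Return the nine regions of a 3x3 Multi-Res grid, row by row."""
        xs = [0.0, side_zone, 1.0 - side_zone]            # column origins
        ws = [side_zone, 1.0 - 2 * side_zone, side_zone]  # column widths
        ys = [0.0, top_zone, 1.0 - top_zone]              # row origins
        hs = [top_zone, 1.0 - 2 * top_zone, top_zone]     # row heights
        return [
            Region(xs[c], ys[r], ws[c], hs[r],
                   1.0 if (r == 1 and c == 1) else border_scale)
            for r in range(3) for c in range(3)
        ]

    # "Conservative" as described above: 20% / 18% borders shaded at 60% resolution.
    for region in build_mrs_grid(0.20, 0.18, 0.60):
        print(region)

In a scheme like this, the border regions would be drawn into smaller viewports and stretched back to their on-screen footprint afterwards, which is where the per-region scale factor pays off.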

MORE: All Gaming Content

Multi-Res Shading: Screen Captures

The following screen captures show several scenes with and without MRS. Each trio of screen captures appears in the same sequence: MRS off, MRS low, and MRS high.

[Screen captures: MRS Off | MRS Low | MRS High]

Lens flare effects are deactivated as soon as MRS is turned on. Numerous shading and lighting effects disappear as well, and aliasing clearly appears along the image's borders in the most aggressive MRS mode. Look at the blade of the sword, for example. The periphery of the image is also blurrier, especially with MRS turned up.

[Screen captures: MRS Off | MRS Low | MRS High]

Here again, lens flares disappear as soon as MRS is activated. The halo of light around the opening in the cave, toward the upper left, is only visible when MRS is turned off. The edges of the puddle of water are blurry when MRS is cranked up. The mud stain just in front of the bridge partially disappears, even though it is located at the center of the image and therefore shouldn't be affected at all.

[Screen captures: MRS Off | MRS Low | MRS High]

The difference in the reflections on the body of the vehicle in the foreground is glaring, with MRS imposing inferior quality. The scene's ambient lighting is also dimmer with MRS active, and the character on the right is pixelated in the aggressive MRS mode.

[Screen captures: MRS Off | MRS Low | MRS High]

By removing numerous lighting effects (reflections, refractions, diffractions, scattering…), even at the center of the frame, MRS noticeably degrades the visual quality of this scene and strongly diminishes its ambient lighting. The hologram on the right is particularly pixelated in the more aggressive MRS mode.

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

Test Setup

Multi-Res Shading clearly has a visible impact on visual quality, especially in Aggressive mode. But its primary goal is to improve performance, so let's see whether it delivers. We ran our tests on the game's first scene at Ultra quality, at 1080p, 1440p, and 4K, on three mid-range GeForce cards using the setup below:

Software
Graphics Drivers: Nvidia GeForce Game Ready 375.76
Operating System: Windows 10 x64 Enterprise 1607 (14393.351)
Storage Controller: Intel PCH Z170 SATA 6Gb/s

Benchmark Sequence

Shadow Warrior 2 - Multi-Res Shading

1920x1080

While Multi-Res Shading does have a measurable impact on frame rate, performance without Nvidia's feature is already good enough that perceived smoothness is excellent in all cases. In other words: at 1080p, Multi-Res Shading is not useful with these mid-range cards.

2560x1440

Here too, average and minimum frame rates climb as soon as MRS is enabled, whichever mode we choose. Activating MRS keeps our three cards from ever dipping below 50 FPS, which is noticeable in terms of perceived smoothness. MRS finds a real use at this resolution.

3840x2160

It is at 4K that Multi-Res Shading's impact is greatest, whether measured in frames per second, frame time stability, or perceived smoothness. Even so, our three mid-range GeForce cards cannot sustain a frame rate high enough for the game to feel smooth at this very high resolution, which is hardly surprising.

Conclusion

While Multi-Res Shading has a negative impact on visual quality that cannot be ignored, at least in this specific implementation, the performance gains attributable to the feature are quantifiable. The improvement is supposed to be greater on Pascal than on Maxwell, but our measurements don't appear to support this: the GeForce GTX 970 benefits just as much (except at 4K), with the same visual results.

Our only disappointment is that the technology, as it's implemented in Shadow Warrior 2, doesn't stop at degrading the resolution of the image's periphery. It also removes certain effects, which affects scene quality right in the middle of the screen, where it's supposed to be untouched. The end result is far from the screen captures provided by Nvidia.

This option will come in useful in specific cases, though. Take our results at 1440p, for example. A resolution like 1080p doesn't tax mid-range graphics cards enough to make the visual degradation worthwhile, and the GeForce GTX 970 and 1060 aren't fast enough to play at 4K, even with MRS. In the end, we recommend against using the more aggressive MRS mode because it impacts graphics quality too negatively for our liking.

MORE: Best Deals

MORE: Hot Bargains @PurchDeals

20 comments
  • SockPuppet
    I honestly don't know how I feel about this tech. Hey, great, you figured out a way to downsample anything the user isn't directly "looking at". But everyone is different and unique.
  • alextheblue
    Might be acceptable in VR if it made the difference between "playable" and "I'm feeling nauseous". But outside of that, no way. Too distracting. Also, it's another proprietary Nvidia technique. Why not just take it all the way and make a vendor-specific API, call it Glide Two.
  • coolitic
    "Why not just take it all the way and make a vendor-specific API, call it Glide Two."

    You mean like Mantle? I sure hope you aren't saying that Nvidia would be doing something bad and AMD wouldn't. Nvidia's proprietary tech wouldn't be so popular if game devs had some competency and made their own similar tech.
  • Jim90
    What's worrying/suspicious for this tech is that image quality at or near the screen centre gets degraded, certainly in the examples above. This is something the tech is not meant to do.

    The principle is great and should be applauded but clearly needs some fine-tuning. We'll be using ideas like this all the time once proper focal rendering becomes the norm - both in VR and non-VR.

    There will never be a need to render at the fullest fidelity something outside the operator/gamer's focal area. Apart from spectators/commentators!
  • michaelzehr
    It's likely that things on the border that affect the visual appearance in the center could be addressed. (Perhaps by not reducing the processing fidelity for things that emit light or have certain types of reflections -- after all, there must be a way of processing them when they're completely out of frame too.)

    What I'd be interested in is a blind test of this. In other words, do you notice the sword tip changing during play if you don't know you should look at it?
  • thor220
    So in other words, it benefits 2K and 4K most, but it degrades the quality to the point where you might as well just keep it at the lower resolution anyways.

    Nvidia needs to seriously improve this tech before it can be considered an option.
  • Joe Black
    So it's basically a crafty mipmap-esque scheme, right?

    Personally I found that my eye was drawn to the bits of the scene as they go from low to high, or high to low res shading. With the MRS off the stability of the entire scene is obvious. And in the stills it is clear that the degradation of quality starts very close to the viewpoint. That poor tree in the third set of stills is completely screwed up by the technique.

    It might make sense in VR, especially with eye tracking and sensible thresholds, but for 2.5D - Nope.
  • Joe Black
    The changes are too obvious. In the video my eyes kept jumping to the points in the scene where the shading resolution was changing, much like with good old mipmapping, where we are all used to seeing trees pop as we come closer. In the stills it was obvious just how badly quality is affected - that poor tree in the 3rd set of stills... Eish...
  • alextheblue
    746565 said:
    "Why not just take it all the way and make a vendor-specific API, call it Glide Two." You mean like Mantle? I sure hope you aren't saying that Nvidia would be doing something bad and AMD wouldn't. Nvidia's proprietary tech wouldn't be so popular if game devs had some competency and made their own similar tech.


    Most of AMD's stuff either starts open source or becomes open source. Mantle became Vulkan. Nvidia's proprietary tech remains closed source and often exclusive to Nvidia hardware or crippled on competing hardware. Also your assessment about game dev competency is only partially true. Nvidia actually pays developers to use GameWorks, for example. Small developers in particular enjoy the cash. So yeah, if you step back and look at the big picture, Nvidia does some questionable stuff on the software side all the time. Their hardware is generally good though.
  • huilun02
    I don't think this is a good idea even if limited to VR. We have eyeballs that move and can look all over the screen. They aren't pinned to the dead center of the screen where they will never notice the detail elsewhere. And as the final graph shows, MRS has a greater benefit the more underpowered your GPU is for the display resolution. "Omg a 27% increase in performance" yup if you're using a 1060 to play at 4k...
  • Dryparn
    This in combination with a eytracker like Tobii EyeX could be where this technology shines. If you could only render in high quality where you actually look you could save a lot of power or get higher frame rates.
  • renz496
    5190 said:
    746565 said:
    "Why not just take it all the way and make a vendor-specific API, call it Glide Two." You mean like Mantle? I sure hope you aren't saying that Nvidia would be doing something bad and AMD wouldn't. Nvidia's proprietary tech wouldn't be so popular if game devs had some competency and made their own similar tech.
    Most of AMD's stuff either starts open source or becomes open source. Mantle became Vulkan. Nvidia's proprietary tech remains closed source and often exclusive to Nvidia hardware or crippled on competing hardware. Also your assessment about game dev competency is only partially true. Nvidia actually pays developers to use GameWorks, for example. Small developers in particular enjoy the cash. So yeah, if you step back and look at the big picture, Nvidia does some questionable stuff on the software side all the time. Their hardware is generally good though.


    meh it happen on both sides actually. the only difference is AMD usually whining more when they were hit with bad performance.
  • alextheblue
    2392160 said:
    This in combination with a eytracker like Tobii EyeX could be where this technology shines. If you could only render in high quality where you actually look you could save a lot of power or get higher frame rates.
    Only if it works so fast you can't tell. You don't want a delay where blurry parts come into focus a moment after you shift your eyes around. That would be REALLY disconcerting. But the theory is interesting, and if they could pull it off it would essentially be free performance. I have my doubts, however.

    2392160 said:
    meh it happen on both sides actually. the only difference is AMD usually whining more when they were hit with bad performance.
    Example? At least Cool gave an example, though I disagree with him. The development of Mantle spurred MS into action and Khronos too (they absorbed and expanded Mantle into Vulkan so now it is open). The vast majority of these shenanigans are on Nvidia's side. Why? AMD's rendering techniques are open. Nvidia gets full access and can optimize for it to their hearts content. Nvidia's stuff is closed and proprietary. It either works on Nvidia only (such as this Multi-Res Shading and PhysX which runs in software mode on non-Nvidia) or it runs poorly on AMD hardware (GameWorks to varying degrees depending on what they implement). When Nvidia experiences poor performance, it's usually something about the developer's engine. Not some AMD proprietary middleware.
  • renz496
    5190 said:
    2392160 said:
    This in combination with a eytracker like Tobii EyeX could be where this technology shines. If you could only render in high quality where you actually look you could save a lot of power or get higher frame rates.
    Only if it works so fast you can't tell. You don't want a delay where blurry parts come into focus a moment after you shift your eyes around. That would be REALLY disconcerting. But the theory is interesting, and if they could pull it off it would essentially be free performance. I have my doubts, however.
    2392160 said:
    meh it happen on both sides actually. the only difference is AMD usually whining more when they were hit with bad performance.
    Example? At least Cool gave an example, though I disagree with him. The development of Mantle spurred MS into action and Khronos too (they absorbed and expanded Mantle into Vulkan so now it is open). The vast majority of these shenanigans are on Nvidia's side. Why? AMD's rendering techniques are open. Nvidia gets full access and can optimize for it to their hearts content. Nvidia's stuff is closed and proprietary. It either works on Nvidia only (such as this Multi-Res Shading and PhysX which runs in software mode on non-Nvidia) or it runs poorly on AMD hardware (GameWorks to varying degrees depending on what they implement). When Nvidia experiences poor performance, it's usually something about the developer's engine. Not some AMD proprietary middleware.


    There is this misconception that open source will make things run equally well on all hardware. That is not always the case. Take DX12 itself: why does async compute impact Nvidia hardware in a negative way? Because the feature is mainly built around the way it works on AMD hardware. No matter how Nvidia tries to optimize it, async will always have a negative impact on Kepler and Maxwell. The changes with Pascal are just so they don't take a performance hit from it, but in general they still don't benefit from it. Vulkan is a bit better on Nvidia, but in the end we still see the most improvement on AMD hardware. Why? Because Mantle, which Vulkan is based on, was first and foremost developed to cater to AMD hardware.

    And as I said earlier, both companies are pretty much doing the same thing; AMD are just more vocal about it when something bad happens to them, even though they have tried to do the same in the past. Take Dirt Showdown, for example. The game used AMD proprietary tech at the time, a forward engine used for global illumination in the game. If you look at the benchmarks from that time, the game simply ran terribly on Fermi and Kepler hardware. But you never saw Nvidia raging to the public about how AMD partnered with a developer to implement a feature specifically to sabotage their performance, like AMD accuses Nvidia of doing with CDPR. In the Dragon Age 2 case, for example, Nvidia was restricted from having access to the game until it officially launched, but Nvidia never blamed AMD for the lack of access, despite AMD's marketing cooperation with BioWare/EA on the game.

    People say Nvidia is being a jerk or anti-competitive by locking PhysX to their hardware only, but many people don't know that in the past Nvidia was pretty open to making PhysX work natively on AMD hardware. There was a group of people trying to make it happen, and Nvidia was ready to help them, but it was AMD that simply rejected the idea and didn't want to make it happen.

    https://www.techpowerup.com/64787/radeon-physx-creator-nvidia-offered-to-help-us-expected-more-from-amd

    http://www.ngohq.com/news/14254-physx-gpu-acceleration-on-radeon-update.html
  • norune
    Interesting findings.
    I'd personally not use this feature, since it definitely removes more value than it adds.
  • Jeff Fx
    It's a shame about the shortcomings. SW2 is a fun, weird game that suffers from some serious frame rate drops on a GTX 970. I'm sure a lot of people could benefit from this technology if they can improve it.
  • ohim
    Mantle was open and evolved into Vulkan, which is also open. Nvidia has GameWorks, which is not open in any way. With TressFX, AMD gave Nvidia access to the code as well; you can't say the same about PhysX or other stuff, so don't compare the two of them as if they were the same.
  • xapoc
    Good tech for lower-end systems. Also, if I can't hit 60 FPS on ultra at 4K, I would "consider" enabling MRS to get fluid gameplay.
  • Jorge Nascimento
    So failvidia keeps throwing software....
    Guess the non Asynchronous hardware support and dx12 FAILS, got them running after their green tails.....
    They need to get more game ready software stuff to fool the software makers and the consumers.....
  • thisisnotthereason
    If this could be improved some more, this could be a step closer towards rendering different groups of texture objects in different resolutions, instead of just assigning lower resolution to bars at the sides.

    However, it is not as bad as it appears in these images.
    On this site we are comparing pictures directly, by looking at screenshots,
    nitpicking at those still photos.
    But it is not meant to be compared like that.
    A moving picture already greatly reduces the awareness of the effect.
    When I stopped focusing on it and just started playing,
    I could barely notice it while running and jumping around.
    Slicing up monsters, I simply did not have time to even try to notice it anymore.

    It can increase FPS by a noticeable enough amount to almost ignore the small degradation that 60% has in Shadow Warrior 2.
    It might prove less useful in different kinds of games,
    but this game is meant to be played as a fast shooter / slicer,
    and I would like to see this option in more of these kinds of games during this generation of cards.

    They should, however, tweak it in such a way that the degradation at the center is completely fixed before it really works as intended.