Nvidia GameWorks VR Multi-Res Shading And Other Parlor Tricks

As the countdown continues to serious, first-generation, flagship-level virtual reality (see: HTC/Valve Vive, Q4 2015; Oculus Rift, Q1 2016), the companies making the serious graphics processing hardware to power it all are jockeying for pre-eminence, each taking turns laying claim to the rendering brawn inside the boxes conveniently hidden away during those mind-blowing VR demos, and then touting the sophisticated brains still required to shave off every last microsecond of latency. Better that minds are blown than chunks, as it were.

We know plenty about the Nvidia GeForce and AMD Radeon brawn. A little more now (see our Nvidia GeForce GTX 980 Ti review), and even more soon enough (Fiji bound, anyone?). As for the brains, late last year Nvidia provided details on VR Direct, early this year AMD rolled out Liquid VR, and it all sounded eerily similar, suggesting perhaps a strong bit of guidance from the likes of Oculus.

Now comes Nvidia's GameWorks VR, a collection of technology initiatives aimed at helping VR game developers and headset makers squeeze better performance out of their titles. It's a subset of the broader Nvidia GameWorks suite of development tools and libraries.

Nvidia has provided several details behind GameWorks VR, specifically related to image manipulation. If you've tried on a VR headset, you know that it is, for all practical purposes, a completely immersive experience with a very wide field of view, accomplished by lenses that warp light spherically onto your eyes and push the focal distance out to about 20 feet to ensure there's no eye strain.

A warped image cannot be rendered natively, so the GPU renders a conventional flat image and then warps it, squeezing the pixels at the edges while ballooning out the center. The result: the GPU gobbles up processing power shading edge pixels whose detail is largely squeezed away, compromising some of the fidelity of the original image geometry.
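If you want a concrete feel for that warp, the little sketch below applies a simple barrel-style distortion to normalized image coordinates. The formula and the coefficients are illustrative only, not pulled from any headset SDK, but they show why the edges of the rendered image get squeezed the hardest.

```python
# A minimal, illustrative barrel-style warp of the kind a VR compositor
# applies before scanout. The coefficients k1 and k2 are made-up values
# chosen for demonstration, not numbers from any actual headset SDK.

def barrel_warp(x, y, k1=0.22, k2=0.24):
    """Map a pixel from the flat rendered image toward the warped output.

    (x, y) are normalized coordinates in [-1, 1] relative to the lens center.
    Points far from the center get pulled inward the most, which is why
    detail rendered at the edges ends up compressed and partly wasted.
    """
    r2 = x * x + y * y                      # squared distance from lens center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2    # grows toward the edges
    return x / scale, y / scale             # edges squeeze, center barely moves

if __name__ == "__main__":
    for x in (0.0, 0.5, 1.0):
        print((x, 0.0), "->", barrel_warp(x, 0.0))
```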

To overcome that waste, Nvidia employs multi-resolution shading, which is another way of saying that GPU cycles aren't spent rendering pixels you'll never see anyway. At a high level, Nvidia's technology compresses the outer edges of the image. It does so by dividing the image into nine regions, called viewports, each of which is instructed to render at a particular resolution and to remember that scaling factor. The center region (the sweet spot) might be rendered 1:1, but as the scaling factor at the edges drops, the GPU does less work and performance rises. And you're none the wiser.
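To make the nine-region idea a bit more concrete, here's a rough sketch of how such a 3x3 viewport split might be laid out. The center fraction and edge scale factor are illustrative guesses, not Nvidia's actual defaults, and a real implementation would set these up through the graphics API rather than in plain Python.

```python
# Sketch of a nine-viewport (3x3) split like the one multi-resolution
# shading uses. The split points and per-region scale factors below are
# illustrative guesses, not Nvidia's defaults.

CENTER_FRACTION = 0.7   # the central region keeps this fraction of width/height
EDGE_SCALE = 0.5        # outer regions render at half resolution per axis

def build_viewports(width, height):
    """Return a list of (x, y, w, h, scale) regions covering the image."""
    cw, ch = int(width * CENTER_FRACTION), int(height * CENTER_FRACTION)
    x_edges = [0, (width - cw) // 2, (width + cw) // 2, width]
    y_edges = [0, (height - ch) // 2, (height + ch) // 2, height]
    viewports = []
    for row in range(3):
        for col in range(3):
            x, y = x_edges[col], y_edges[row]
            w, h = x_edges[col + 1] - x, y_edges[row + 1] - y
            # Only the center cell renders 1:1; the other eight are scaled down.
            scale = 1.0 if (row, col) == (1, 1) else EDGE_SCALE
            viewports.append((x, y, w, h, scale))
    return viewports

if __name__ == "__main__":
    for vp in build_viewports(1512, 1680):   # an arbitrary per-eye render target
        print(vp)
```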

For those who like to sound smart at parties, this bit of optimization using viewports makes use of the multi-projection acceleration in Maxwell's geometry hardware (specifically the second-generation GM2xx GPUs). The goal is to deliver just enough pixel density at the highest compression possible, stopping just short of the point where the warping becomes visible. Nvidia claimed up to twice the pixel shader performance by doing so, and that the quality difference is imperceptible.
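For a rough sense of where a number like that comes from, the snippet below estimates the shaded-pixel savings for the illustrative layout sketched above. The actual savings depend entirely on the split and scale factors a developer chooses, so treat this as back-of-the-envelope math rather than Nvidia's figure.

```python
# Back-of-the-envelope estimate of pixel-shading savings for a multi-res
# layout, using the same illustrative numbers as the sketch above.

def shaded_pixel_fraction(center_fraction=0.7, edge_scale=0.5):
    """Fraction of the original pixels actually shaded after multi-res scaling."""
    center_area = center_fraction ** 2                  # shaded at 1:1
    edge_area = 1.0 - center_area                       # the eight outer cells
    return center_area + edge_area * edge_scale ** 2    # edges shaded at scale^2

if __name__ == "__main__":
    frac = shaded_pixel_fraction()
    print(f"Pixels shaded: {frac:.0%} of the original "
          f"(about {1 / frac:.1f}x fewer pixel-shader invocations)")
```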

I was able to see some of this in action. Nvidia cranked up the compression on an image little by little, toggling it on and off for before-and-after views. I paid special attention to the edges of the image, and in most cases I really couldn't see any difference. It was only when Nvidia pushed compression past roughly 50 percent that I could see visible blur and shake at the outer edges, something Nvidia is continuing to improve upon.

Between now and the launch of products such as the HTC/Valve Vive and the Oculus Rift, expect to see more updates from both AMD and Nvidia. Nvidia's VR Direct, like AMD's Liquid VR, includes enhancements such as Asynchronous Timewarp, which is different from the warping that multi-resolution shading performs. Michael Antonov, chief software architect at Oculus, provided a pretty good explanation of it and its drawbacks here, and we discussed how it works in our overview of AMD's Liquid VR here.
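The core idea behind asynchronous timewarp, reduced to a toy: if a new frame isn't ready at vsync, re-project the last finished frame using the newest head orientation instead of showing it unchanged. Real implementations do this on the GPU across the whole image; the sketch below only handles yaw along one axis, and every name and number in it is illustrative.

```python
# Toy, yaw-only illustration of the reprojection at the heart of
# asynchronous timewarp. Everything here is a simplified stand-in for
# what real VR runtimes do on the GPU.

import math

def timewarp_yaw(u, rendered_yaw, latest_yaw, fov_h=math.radians(100)):
    """Given a horizontal display coordinate u in [-1, 1], return the
    coordinate in the previously rendered frame to sample from, based on
    how much the head yawed since that frame was rendered."""
    yaw_delta = latest_yaw - rendered_yaw
    # Convert the display coordinate to a view angle, shift by the head's
    # rotation since render time, then map back into image space.
    angle = math.atan(u * math.tan(fov_h / 2))
    return math.tan(angle + yaw_delta) / math.tan(fov_h / 2)

if __name__ == "__main__":
    # Suppose the head turned about 2 degrees after the frame was rendered.
    for u in (-0.5, 0.0, 0.5):
        print(u, "->", round(timewarp_yaw(u, 0.0, math.radians(2)), 3))
```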

Nvidia's VR Direct also includes VR SLI, which lets each GPU in a multi-GPU setup render one eye separately but in sync; that is, the frames aren't rendered in an alternating fashion, for hopefully obvious reasons. VR Direct also makes use of Maxwell's DSR (Dynamic Super Resolution) and MFAA (Multi-Frame sampled Anti-Aliasing). And all of these settings are delivered automatically with GeForce Experience.
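Here's a toy illustration of that split-per-eye idea, with two worker threads standing in for two GPUs. The function names are stand-ins, not anything from Nvidia's API; the point is just that both eyes belong to the same frame and get presented together, rather than whole frames alternating between GPUs.

```python
# Toy model of the VR SLI idea: two workers (standing in for two GPUs)
# render the left and right eye of the SAME frame in parallel, then the
# results are presented together. Names here are illustrative stand-ins.

from concurrent.futures import ThreadPoolExecutor

def render_eye(eye, frame_index):
    """Pretend to render one eye's view of a given frame."""
    return f"frame {frame_index}, {eye} eye image"

def render_frame_vr_sli(frame_index):
    with ThreadPoolExecutor(max_workers=2) as pool:
        left = pool.submit(render_eye, "left", frame_index)
        right = pool.submit(render_eye, "right", frame_index)
        # Both eyes come from the same frame and are presented in sync,
        # rather than alternating whole frames between GPUs.
        return left.result(), right.result()

if __name__ == "__main__":
    print(render_frame_vr_sli(0))
```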

When consumer headsets do ship, there will be game-ready drivers and optimized game settings, including SLI profiles for VR, all delivered through GeForce Experience. These settings are customized to a user's specific hardware.


Fritz Nelson
Fritz Nelson is Editor-at-Large of Tom's Hardware US.