One of Nvidia’s big announcements at GTC 2016 is its new Iray VR, which is essentially the VR-enabled version of its original Iray plugin for 3ds Max, Maya, and other 3D modeling software. The original plugin is what you’d use to render photorealistic images of your models, but a flat render isn’t sufficient for VR, where you want to be able to turn your head and look around. Iray VR enables you to render still scenes built in 3D modeling applications for viewing through a VR headset.
For GTC, Nvidia prepared demos of Iray VR, Iray VR Lite, and a real-time version of the Iray VR Lite demo.
The simplest version, Iray VR Lite, renders a spherical stereoscopic panorama that you can look all around from one specific point. The benefit of Iray VR Lite is that once the panorama is rendered, viewing it has relatively low hardware requirements. The catch is that you can only rotate your head--you cannot move it through 3D space.
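To give a rough idea of how a viewer could consume such a panorama, here is a minimal sketch (not Nvidia’s code; the helper names, the equirectangular layout, and the nearest-neighbor sampling are all assumptions) that maps a head orientation to the pixel each eye would see in a pre-rendered stereo panorama:

```python
import numpy as np

def direction_to_equirect_uv(direction):
    """Map a unit view direction to (u, v) coordinates in an equirectangular panorama.

    Hypothetical helper for illustration: longitude maps across the image width,
    latitude maps down the image height.
    """
    x, y, z = direction / np.linalg.norm(direction)
    lon = np.arctan2(x, -z)          # yaw around the vertical axis
    lat = np.arcsin(y)               # pitch above/below the horizon
    u = (lon / (2 * np.pi)) + 0.5    # [0, 1) across the width
    v = 0.5 - (lat / np.pi)          # [0, 1] down the height
    return u, v

def sample_stereo_panorama(left_img, right_img, direction):
    """Fetch the pixel each eye sees for a given head orientation (nearest neighbor)."""
    u, v = direction_to_equirect_uv(np.asarray(direction, dtype=float))
    h, w = left_img.shape[:2]
    px, py = int(u * (w - 1)), int(v * (h - 1))
    return left_img[py, px], right_img[py, px]
```

Because the panorama is already rendered, viewing it reduces to this kind of lookup, which is why the post-rendering requirements are comparatively modest.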
I got the chance to experience a demonstration of Iray VR Lite on an HTC Vive, in which Nvidia had set up a handful of predefined head positions. I could look around, and in the distance there would be a glowing green orb. Using the controller, I could aim at it and trigger a teleport. With that, I was able to maneuver through Nvidia's upcoming headquarters, Endeavor. It was quite an impressive demo, but lacking the ability to move my head around did feel, well, lacking.
I asked how big a frame buffer such a demo requires, and Nvidia explained that for this demo it is quite large, particularly because of the glowing orbs and the use of uncompressed images. By the time of the final release, a frame buffer of approximately 8 GB should be ample for most demos.
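For a sense of scale, here is a back-of-envelope estimate (the resolutions and viewpoint count below are my assumptions, not figures Nvidia provided) of how quickly uncompressed stereo panoramas add up:

```python
# Assumed numbers for illustration only: memory for one uncompressed
# stereoscopic equirectangular panorama.
width, height = 8192, 4096          # assumed per-eye resolution
bytes_per_pixel = 4                  # RGBA, 8 bits per channel
eyes = 2
panorama_bytes = width * height * bytes_per_pixel * eyes
print(f"One stereo panorama: {panorama_bytes / 2**20:.0f} MiB")   # ~256 MiB

# A demo that keeps several pre-rendered viewpoints resident adds up quickly.
viewpoints = 16                      # assumed number of teleport positions
print(f"{viewpoints} viewpoints: {viewpoints * panorama_bytes / 2**30:.1f} GiB")  # ~4 GiB
```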
Nvidia also showed a live demo of Iray VR Lite, although for all intents and purposes this one was very much of the “because we can” variety. The demo ran on Nvidia’s 32-node VCA (Visual Computing Appliance) cluster in Santa Clara, with the output transmitted to the convention center over a very fast Internet connection. The demo showed a BMW Z4, and I could change its color, interior detailing, rims, and the environment it was situated in. As in the pre-rendered Iray VR Lite demo, I could look around but not move my head through the scene. With the photorealistic graphics, it really did feel like I was sitting in a BMW Z4 when I selected the driver's view.
In the demo, it was clear that the system in Santa Clara was working very hard to render the scene. When the image first came up, it was quite fuzzy and only sharpened after a few seconds. As I looked around, I could also see the scene being built up around me, with areas not yet rendered shown simply as white space. Calling it "real-time" is a bit of a stretch.
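That fuzzy-then-sharp behavior is characteristic of progressive path tracing, where the renderer keeps averaging additional noisy passes into the image. Here is a minimal sketch of that accumulation loop, assuming a stand-in `render_one_pass` function (a hypothetical placeholder, not part of Iray):

```python
import numpy as np

def progressive_refine(render_one_pass, shape, num_passes):
    """Average successive noisy render passes; the running mean sharpens over time.

    `render_one_pass` is a hypothetical stand-in for one pass of a path tracer
    that returns a noisy image of the given shape.
    """
    accum = np.zeros(shape, dtype=np.float64)
    for i in range(1, num_passes + 1):
        accum += render_one_pass()
        yield accum / i        # current estimate after i passes

# Example with pure noise standing in for a renderer: noise in the estimate
# shrinks as more passes are averaged, just as the demo image cleaned up.
for estimate in progressive_refine(lambda: np.random.rand(4, 4), (4, 4), num_passes=100):
    pass
```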
The full version, Iray VR, is more capable, but it comes with a serious performance cost. It enables you to move your head around within the scope of what Nvidia calls a "light field." This demo has five degrees of freedom: You can move your head through three dimensions, and you can look up and down or side to side (the other two dimensions). Iray VR calculates the light rays you would see from each direction at every point in space within the light field. Additionally, for the demo, Nvidia also enabled tone mapping, so as I moved from looking at a dark part of the building's interior to looking out of a window, the lighting changed.
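The tone-mapping effect described above can be illustrated with a simple Reinhard-style operator that re-exposes the image based on the average luminance of whatever is currently in view. This is only an illustrative sketch under assumed inputs, not Iray VR's actual implementation:

```python
import numpy as np

def adaptive_tonemap(hdr_view, key=0.18):
    """Re-expose and compress an HDR image of the current view to displayable range.

    Assumes `hdr_view` is an (H, W, 3) array of linear radiance values. Looking at
    a dark interior yields a low average luminance (so the view brightens); looking
    out a window yields a high average (so the view darkens).
    """
    luminance = 0.2126 * hdr_view[..., 0] + 0.7152 * hdr_view[..., 1] + 0.0722 * hdr_view[..., 2]
    avg_log_lum = np.exp(np.mean(np.log(luminance + 1e-6)))   # geometric mean as "exposure"
    scaled = hdr_view * (key / avg_log_lum)                    # re-expose for this view
    return scaled / (1.0 + scaled)                             # compress to [0, 1)
```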
Of course, you can see where this gets difficult: Rendering a scene like this generates tremendous amounts of data. The demo therefore works best on a graphics card with a huge frame buffer, which is why Nvidia used the 24 GB Quadro M6000 for the demos. Fortunately, you do not need a large amount of GPU compute power to display a rendered scene--just the enormous frame buffer.
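A rough scaling illustration (all numbers below are assumptions, not Nvidia's figures) shows why a light field chews through memory so quickly: view data has to be stored for a whole grid of head positions, not just one point.

```python
# Assumed figures for illustration only: memory for a light field sampled on a
# regular grid of head positions inside the viewing volume.
positions_per_axis = 8               # assumed sampling density along each axis
bytes_per_position = 32 * 2**20      # assumed ~32 MiB of view data per position
total = positions_per_axis ** 3 * bytes_per_position
print(f"{positions_per_axis ** 3} positions -> {total / 2**30:.0f} GiB")   # ~16 GiB
```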
Iray VR Lite will be available as early as June. Availability for the full version of Iray VR, with five degrees of freedom, will be announced this spring.
(Note that except for the camera-shot image, the renders shown above are from the existing version of Iray, not Iray VR.)