AMD announced Liquid VR, a virtual reality initiative that takes the immediate form of a limited, alpha version SDK being seeded with select developers during GDC this week. The SDK provides access to a series of underlying processing advances (using AMD hardware, naturally) aimed at reducing latency, improving device compatibility, and raising content quality.
Raja Koduri, AMD's graphics CTO, called virtual reality "the next frontier for visual computing," saying today's announcements are really just the beginning, and that AMD wanted to drive virtual reality to full presence, or photorealism. Getting there, he said, will require "full sensory integration" along with scalable CPUs, GPUs, and hardware accelerators, which will be a crucial component. AMD to the rescue, then.
The road to VR, Koduri said, demands adherence to two key rules: don't break presence; and "if your CPU and GPU can't keep up, you throw up." The entire ecosystem, from the graphics stack to the driver to the peripherals and even the audio processing stack, must work harmoniously, he said. And to underscore this while announcing Liquid VR, AMD involved a variety of partners, including Oculus and Crytek.
While gaming stands to be the killer app, Koduri said that he was equally enamored with the possibilities in education, medicine, training and simulation, as well as big data visualization. In other words, AMD is investing some serious resources in virtual reality.
In particular, Liquid VR enables features including Latest Data Latch, Asynchronous Shaders, Affinity Multi-GPU, and Direct To Display. Much of this re-imagines the way frames are rendered, with the goal of shaving every last millisecond off the delay between head movement and visualization (or "motion to photons," to use AMD's phrasing).
For example, "latest data latch" ensures that when the GPU grabs or binds data, it's using the very last piece of head-tracking information before VSync begins. Layla Mah, AMD's head design engineer, said the old way of doing things meant grabbing whatever data was present, old or new, and rendering it; now developers can ensure the headset displays an image based on the most recent head motion.
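The effect is easy to quantify with a toy model. The sketch below is not the LiquidVR API; it's a hypothetical timing simulation (all numbers are illustrative assumptions) showing why sampling the head pose just before the GPU consumes it, rather than at the start of the frame, shrinks the error between the rendered view and where the head actually points when photons reach the eye:

```python
# Conceptual sketch, NOT the LiquidVR API: why "latest data latch"
# reduces motion-to-photon error. All timings are illustrative.

FRAME_MS = 11.1       # one refresh at ~90 Hz
GAME_LOGIC_MS = 6.0   # assumed CPU simulation time before the GPU starts

def head_yaw_deg(t_ms, speed_deg_per_ms=0.2):
    """Hypothetical head motion: a constant-rate turn."""
    return speed_deg_per_ms * t_ms

def frame_error(latch_at_ms, photon_at_ms):
    """Angular gap between the pose used for rendering and the pose
    at the instant photons actually hit the eye."""
    return head_yaw_deg(photon_at_ms) - head_yaw_deg(latch_at_ms)

photons = FRAME_MS  # photons emitted at the end of the frame

# Old way: pose grabbed at frame start, before game logic runs.
err_old = frame_error(latch_at_ms=0.0, photon_at_ms=photons)
# Latest data latch: pose grabbed right before the GPU binds it.
err_new = frame_error(latch_at_ms=GAME_LOGIC_MS, photon_at_ms=photons)

print(f"old error: {err_old:.2f} deg, latched error: {err_new:.2f} deg")
# → old error: 2.22 deg, latched error: 1.02 deg
```

Under these assumed numbers, latching late cuts the stale-pose error by more than half; the real win scales with how much CPU work sits between frame start and GPU submission.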
Shaders can now access asynchronous compute engines in GCN to process virtual reality images through the hardware in parallel with rendering. Because of this, Liquid VR enables asynchronous time warp, which is essentially the ability to re-project pixels to make them look the right way for your current head position. (It also lets you do things like ray tracing asynchronously.)
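To make the re-projection idea concrete, here is a deliberately simplified sketch, not AMD's GPU implementation: real time warp re-projects every pixel on the GPU, but a yaw-only correction on a single scanline captures the core operation of shifting an already-rendered image to match the head pose measured after rendering finished. The function name and parameters are illustrative assumptions:

```python
# Conceptual sketch of time warp (re-projection), NOT AMD's implementation.
# A 1-D horizontal shift stands in for a yaw-only pixel re-projection.

def timewarp_yaw(scanline, delta_yaw_deg, pixels_per_deg=10):
    """Shift one rendered scanline by the yaw change since render time.
    Pixels shifted in from the edge are unknown (None) here; in practice
    they would be covered by rendering a slightly oversized image."""
    shift = int(round(delta_yaw_deg * pixels_per_deg))
    if shift == 0:
        return list(scanline)
    if shift > 0:  # head turned right: the scene appears to move left
        return list(scanline[shift:]) + [None] * shift
    return [None] * (-shift) + list(scanline[:shift])

rendered = list(range(8))  # stand-in for one row of rendered pixels
warped = timewarp_yaw(rendered, delta_yaw_deg=0.2)
print(warped)  # → [2, 3, 4, 5, 6, 7, None, None]
```

The key point from the article is the "asynchronous" part: because GCN's asynchronous compute engines run this correction in parallel with normal rendering, the warp can use the freshest head pose without waiting for the main render queue.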
AMD's direct to display capability promises to give direct application control to the headset (through an AMD Radeon graphics card, of course), regardless of the headset provider, and outside of the headset SDK.
AMD's Koduri indicated that it was premature to commit to supporting the Razer-led OSVR initiative, but said the company had taken a look at it.