Intel announced Project Alloy, a self-contained virtual reality HMD that uses dual RealSense cameras for motion tracking and to incorporate real-world objects into VR, making this more augmented or mixed reality. (Or "merged reality" in Intel-speak. The company also calls Project Alloy "Premium All-In-One VR." Because why not invent new terms.)
What Intel demonstrated with Project Alloy here at IDF were experiences that are best described as conceptual, including the hardware. Intel's intention was to paint a vision for the near future, where you aren't just moving through a virtual world, or just incorporating the real world, but also interacting with both (see video above, especially at the 2:02 mark).
[Update: August 17, 10:57 a.m.: A video of the performance depicted above and described next has now been added following this paragraph, under the assumption that seeing it for yourself might better bring this experience to life.] Prior to the IDF keynote, Intel showcased digital DJs (far right, above) mixing on the company's Skull Canyon NUC. The DJ was joined in concert by an HMD-wearing drummer (far left, above) on stage, who was physically drumming in the air while, both to him in VR and to us on screen, he appeared to be playing a real drum kit. The trio was completed by a musician (middle, above) using Intel's Curie technology along with its RealSense cameras to virtually play both piano and cello simultaneously with incredible nuance and control.
All three played together, re-creating Herbie Hancock's "Rockit" in some kind of merged reality that you just had to see to believe. I never clap at these canned dog-and-pony shows, but I had to work hard to resist on this one.
During another on-stage demonstration, an Intel representative wandered around a virtual room (multiple virtual rooms, actually) and could see Brian Krzanich, Intel’s CEO, when he got too close to him, thanks to the Intel RealSense cameras on the HMD. Krzanich was in full, realistic (slightly pixelated) video view within the virtual world.
The demonstrator could see his own hands in front of him, manipulating objects in the virtual environment. At one point he took some dollar bills out of his pocket, placed them on a gold cylinder spinning on a lathe, and sculpted the virtual gold with the dollar bills in a nifty physics simulation--an example of that so-called merged reality and the natural interaction Intel thinks will be possible with its sensor technology.
Under The Hood And Untethered
Indeed, Project Alloy includes some impressive hardware advancements. For example, there are no cables, because it is a completely self-contained computing platform, with integrated CPU and GPU (it's a Skylake CPU, likely a Core i5 or i7, Intel VR director Kim Pallister said during a later session).
The sensors allow six degrees of freedom (translation along three perpendicular axes and rotation about three axes), and RealSense brings your hands into view such that they can become the controllers--in fact, Intel claimed that the dual RealSense cameras have full depth sensing and five-finger tracking. Intel didn't say which RealSense camera was being used, but Steven Bateman, an Intel engineer, said that when the dev kits start shipping, they will likely use the new RealSense 4.0 camera. Project Alloy will need only one of them.
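Six degrees of freedom simply means the headset tracks both where you are (translation) and which way you're facing (rotation). As a rough, hypothetical sketch--the class and field names are illustrative, not anything from Intel's SDK--a 6DoF pose can be represented like this:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Translation along the three perpendicular axes (meters)
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # Rotation about the three axes (radians)
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0

    def translate(self, dx: float, dy: float, dz: float) -> "Pose6DoF":
        """Return a new pose moved by (dx, dy, dz), orientation unchanged."""
        return Pose6DoF(self.x + dx, self.y + dy, self.z + dz,
                        self.roll, self.pitch, self.yaw)

# A head pose updated as the wearer steps half a meter forward
start = Pose6DoF()
moved = start.translate(0.0, 0.0, -0.5)
```

A tracked controller or hand would carry the same six numbers; the difference with inside-out tracking is that the headset's own cameras, rather than external base stations, estimate them.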
Although an Intel representative said the company wasn't revealing any details about other controller support, Bateman said in his session that Intel sees the need for all types of controllers, and envisions controllers and hands working together, depending on the application. He admitted that hand tracking introduces processing latency, which some applications are unlikely to tolerate.
Bateman was a bit more cagey on FOV targets, saying that it was possible to go pretty far, but that Intel was still figuring out the balance between weight, battery life, performance, and quality experience.
The Middle Ground
Intel's sensors are a big part of Project Alloy. The company, in a bit of humble gauntlet throwing, called what it could do "multi-room scale movement and tracking" (as opposed to HTC Vive's simple "room-scale tracking"). The HMD's sensors use a technology called visual-inertial odometry and depth (VIO-D), and the headset can also use relocalization to recover the position of objects in an environment it has already mapped.
The RealSense cameras are also useful for creating physics and occlusion for augmented applications. In one demonstration, after using the RealSense camera to map and build out the details of a real table placed into the virtual world, the user dropped virtual blocks onto that "real" table rendering, and the blocks reacted as you'd expect them to, bouncing briefly and settling into place.
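The block demo boils down to ordinary rigid-body physics, with the collision surface supplied by the depth camera rather than authored by hand. Here's a minimal toy sketch of that idea in one dimension; all the constants are purely illustrative, with the fixed table height standing in for a surface recovered from a depth scan:

```python
# Toy 1D simulation: a virtual block dropped onto a surface whose height
# would, in a real system, come from a depth-camera scan of the room.
GRAVITY = -9.8        # m/s^2
RESTITUTION = 0.3     # fraction of speed kept after each bounce
TABLE_HEIGHT = 0.75   # m, standing in for the scanned surface height
DT = 0.01             # s, simulation time step

def settle(height: float, velocity: float = 0.0, steps: int = 2000) -> float:
    """Integrate until the block comes to rest on the scanned surface."""
    for _ in range(steps):
        velocity += GRAVITY * DT
        height += velocity * DT
        if height <= TABLE_HEIGHT:              # collision with the "real" table
            height = TABLE_HEIGHT
            velocity = -velocity * RESTITUTION  # brief bounce
            if abs(velocity) < 0.05:            # too slow to bounce: settle
                velocity = 0.0
    return height

final = settle(1.5)   # block dropped from 1.5 m lands on the 0.75 m table
```

The point of the demo is that the collision geometry isn't a hand-built asset: the same depth data that maps the room also gives virtual objects something real to bounce off and be occluded by.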
Like any project of this sort, Alloy is more of a reference kit/developer platform. It is physically huge, as you might imagine a PC strapped to your head would be. (We were not allowed to try it.) Intel calls these "all-in-one HMDs." For now, its function seems to be more about showcasing what RealSense cameras and a host of other Intel sensors can bring to the VR table, but Pallister spent his session outlining the case for mainstream VR--that is, not the entry-level VR we see with the likes of Samsung Gear VR, nor the high-end or enthusiast-level VR we've got with Oculus Rift and HTC Vive, but something that performs well and is available to the masses. This is what Pallister called "premium AIO" VR.
From Pallister's presentation, it would be fair to assume that Intel is tackling how to take some of the more processor-intensive tasks from the PC (and in particular the GPU) and offload them to the HMD. He talked about all of the work being done by Valve, Oculus, and their graphics partners with techniques like barrel lens distortion, chromatic aberration correction, and asynchronous timewarp, the context swaps they require, and all of the latency those techniques introduce. He speculated that much of that could potentially happen on the HMD instead.
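Barrel distortion correction is a good example of such a per-frame step: the renderer pre-distorts the image radially so that the HMD's lenses cancel the distortion out. A simplified sketch of the standard radial-polynomial model follows; the coefficients are made-up placeholders, not calibration values for Alloy or any real headset:

```python
def barrel_predistort(x: float, y: float, k1: float = 0.22, k2: float = 0.24):
    """Pre-distort a normalized image coordinate (origin at the lens center)
    using the common radial polynomial r' = r * (1 + k1*r^2 + k2*r^4).
    k1 and k2 are illustrative, not real calibration coefficients."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Points near the lens center barely move; points near the edge move a lot.
cx, cy = barrel_predistort(0.05, 0.0)   # near center
ex, ey = barrel_predistort(0.9, 0.0)    # near edge
```

Because this warp (and timewarp-style reprojection) runs on every frame for both eyes, doing it on silicon inside the headset rather than on the PC's GPU is exactly the kind of offloading Pallister described.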
He showed performance models using the Intel Skull Canyon performing much of this offloading. Even then, the math still doesn't quite add up, so there's still a great deal of work to be done, but it's good to know that Intel is investing in bringing VR requirements downstream a bit.
Intel has also been talking with Microsoft, which has its own augmented reality solution (HoloLens). Next year Microsoft will roll out a Windows 10 update that will include the Windows Holographic shell, which will let you connect an HMD and run holographic experiences, according to Terry Myerson, VP of Microsoft's Windows and Devices Group. With that, you'll be able to interact with standard 2D Windows universal apps, but also VR apps. Naturally this would be done with a HoloLens, but also Project Alloy.
In one particular video example played during the IDF keynote, an HMD wearer interacted with her calendar for an upcoming trip, and was able to yank out a VR module into holographic/VR space and walk around her destination (in this case, ancient ruins).
Myerson said Microsoft is working on a common mixed reality specification for the Windows Holographic experience, and it would be revealed at WinHEC in Shenzhen in December.
Intel said that in the second half of 2017, it would open source the Alloy hardware and also open up the RealSense APIs so that third parties can create platforms around Alloy, RealSense and Windows Holographic.
Finally, Intel is also inserting itself into the content creation side, and demonstrated some examples of its work in sports stadiums with 3D cameras mapping the entire space for 3D replay technology, whereby you can change your perspective on any scene of a sporting event. The company is also creating a production studio in Los Angeles, called Tech Experience Labs (TXL), built for creators, technologists and producers; its remit is to create entertainment experiences, and it will open in 2017, Intel said.