GTC 2013: Hands-On with zSpace 3D System

During GTC 2013 in San Jose, I got a chance to stroll through the exhibit hall during a dead spot between sessions. One of the booths I visited belonged to zSpace Inc., which was showcasing its incredible zStation monitor. It caught my eye because the promotional material I received prior to my scheduled visit promised holographic 3D objects I could manipulate and pull straight up to my face.

It did not disappoint.

The zStation system comprises three components: the monitor itself, a tethered stylus, and a pair of passive polarized glasses. It requires at least 4 GB of RAM, an Nvidia Quadro 4000 or ATI FirePro GPU (or equivalent), and Windows XP (32/64-bit) or Windows 7 (32/64-bit). On the CPU front, it needs an Intel Core i7 clocked at 2.2 GHz or greater, or a Xeon E3 or E5. If AMD is more your taste, an Opteron 4200 series, FX-8xxx series, or Phenom II X6 or greater CPU is recommended.
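Why a professional GPU? As zSpace's Barton Fiske confirms in the comments below, the key requirement is delivering a quad-buffered stereo signal over DVI or DisplayPort, something consumer cards of this era generally don't expose in their drivers. For the curious, here's a minimal sketch of what quad-buffered stereo rendering looks like in OpenGL (using GLUT for brevity; the per-eye shift is a crude stand-in for proper stereo projections, and none of this is zSpace's actual code):

```cpp
// Minimal sketch of OpenGL quad-buffered stereo, the capability that
// separates pro cards from consumer ones for systems like the zStation.
#include <GL/glut.h>

// Render the scene from one eye. The simple horizontal shift here is a
// placeholder; a real system would build full per-eye projections.
void drawScene(float eyeOffset)
{
    glLoadIdentity();
    glTranslatef(-eyeOffset, 0.0f, 0.0f);
    glutWireTeapot(0.5);
}

void display()
{
    const float halfIPD = 0.032f;  // half the interpupillary distance

    glDrawBuffer(GL_BACK_LEFT);    // target the left-eye back buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(-halfIPD);

    glDrawBuffer(GL_BACK_RIGHT);   // target the right-eye back buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(+halfIPD);

    glutSwapBuffers();             // swaps both back buffers at once
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    // GLUT_STEREO requests the quad-buffered (left/right x front/back)
    // format; it only succeeds on GPUs/drivers exposing stereo formats.
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
    glutCreateWindow("quad-buffer stereo sketch");
    glutDisplayFunc(display);
    glutMainLoop();
}
```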

The monitor itself is a 120 Hz, 23.6-inch Full HD screen with a response time between 2.5 ms and 5.5 ms. A sensor is mounted in each top corner to track where the pen resides between you and the screen, and two cameras mounted along the top edge keep track of your face. These cameras follow trackable markers built into the super-lightweight glasses so the 3D scenery remains consistent no matter how you view the virtual objects.

"[It features] full motion parallax," the company states, "with sensors tracking the viewing angle, enabling the user to look around objects and visualize multiple perspectives with simple head movements."

As for the laser stylus, it's connected directly to the monitor and sports three physical buttons, an integrated infrared LED, and an accelerometer. This is the only way you can interact with the holographic objects: point the stylus at an object, click a button, and you can grab it as if you'd stuck a fork right into it.
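Mechanically, that grab is classic ray picking: cast a ray from the tracked pen tip along its pointing direction, find the nearest object it hits, then keep the object attached to the pen's pose while the button is held. Here's a hypothetical sketch; the types and names are mine for illustration, not the zSpace SDK's:

```cpp
#include <cstddef>
#include <optional>
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Hypothetical tracked-pen state: tip position, normalized pointing
// direction, and whether the grab button is held.
struct StylusPose { Vec3 tip; Vec3 dir; bool grabButton; };

struct Object { Vec3 center; float radius; };  // bounding sphere per object

// Return the index of the nearest object whose bounding sphere the pen's
// ray intersects, or nothing if the ray misses everything.
std::optional<std::size_t> pick(const StylusPose& pen,
                                const std::vector<Object>& scene)
{
    std::optional<std::size_t> best;
    float bestT = 1e30f;
    for (std::size_t i = 0; i < scene.size(); ++i) {
        Vec3 oc = sub(scene[i].center, pen.tip);
        float t = dot(oc, pen.dir);            // distance along the ray
        if (t < 0.0f) continue;                // object is behind the pen
        float d2 = dot(oc, oc) - t * t;        // squared miss distance
        if (d2 <= scene[i].radius * scene[i].radius && t < bestT) {
            bestT = t;
            best = i;
        }
    }
    return best;
}
// While pen.grabButton stays down, the app re-applies the pen's pose (plus
// the grab-time offset) to the picked object each frame, which is what
// makes the heart feel skewered on the end of the stylus.
```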

The demo blew my socks off, to say the least. In one demonstration, zSpace CTO Dave Chavez loaded up a heart and told me to pick it up. It sat in the virtual space, beating and pushing virtual blood through its virtual arteries like a piece of red, pumping meat shifting on a white plate. I grabbed it with the stylus and literally picked it up and out into my own personal space. Whoa.

As one who has suffered heart issues since birth, this was a little… jolting. Here was a virtual heart beating literally in front of my face, and I could actually feel it pumping through the stylus. If I clicked on a texture covering the front of a chamber, I could see inside. I could even twist my hand as if the heart were mounted on a stick, orbiting the organ to get a good look at all the valves from inside and out.

It's this type of technology that we need in doctors' offices. Being told what's wrong, and how the problem will be fixed, would no longer be such a mystery. And this information could be passed from office to office and hospital to hospital, viewed on multiple stations simultaneously by different experts jointly trying to solve an issue across great distances. What an incredible advancement.

In addition to the heart, I was allowed to check out a number of other demos. In one instance, I was tasked with stacking up a set of blocks. In another, I could design a house and view each individual room in real time by gripping a small virtual camera and pushing it through doors and windows. I could even make modifications to a car, pick up a ruler and examine both sides, and more. Unfortunately, the visuals weren't completely realistic, but they were believable enough that the 3D took over and made you forget the simplistic textures used in the models.

That said, Chavez loaded up a game using the Unity engine, and that looked amazing. Again, the 3D effect won me over because it offered a level of immersion that stayed true to my field of view no matter how I was sitting or standing. In one scene, for example, the character moved outside onto a platform with a banister, and I could stand on my tiptoes and look over it, or squat down a little and peer under it, out into the landscape beyond. It was incredible.

Building on that, with the heart demo I could hold the pen and organ perfectly still but move my body around, orbiting the virtual meat and blood as if it actually hovered in one spot in midair. I'm not sure I'm explaining this correctly, but the head-tracking tech does an excellent job of keeping the object fixed and realistic in z-space.

For 3D animators and modelers, this would be a worthwhile purchase. I would even consider buying the system for home schooling. Heck, this would be an excellent addition to any classroom from middle school on up. It would make science class a whole new experience, or give chip developers a better way of visualizing and designing their PCB layouts.

"zSpace provides an immersive environment for professionals in many industries driven by the desire to create and visualize objects with realism in stereoscopic 3D," the company said during the conference. "Designed for individuals seeking the most responsive tools that go beyond conventional displays and input devices, zSpace allows users to complete complex tasks in a natural and intuitive manner."

The heart demo was part of the company's Anatomy application, which offers additional organs to juggle. The company also demonstrated Nvidia's Particles demo, which visualizes a large set of particles and simulates their physical interactions. On the zStation, you could literally pick up the ball and create collisions with the particles in real time.

zSpace is holding its zCon developer conference next month, April 22 to 23, in Mountain View, California. It's billed as an "educational environment that stimulates communication and collaboration among developers of 3D applications, hardware and content". The company has also released an SDK for those who want to create applications for the system, as well as a Maya plug-in for licensed Autodesk Maya users.

"I can see many use cases for zSpace that fit into Autodesk’s breadth of solutions and am looking forward to working on zSpace," said Brian Pene, Sr. Principal Researcher of Autodesk.

As for the price of the system, I believe Chavez said it was just over $1,000 – I can't remember the exact number, but it was in the same ballpark as Google Glass. Keep in mind it doesn't ship with a PC: you get the monitor, stylus, glasses and software. For more information about "holographic computing" with the zStation, head to zSpace's website.


  • teaser
    So you need a developer's GPU to use this?
  • renz496
    teaser: "So you need a developer's GPU to use this?"
    For what this device is intended for, I think you're going to need a professional GPU. But if you use it for learning purposes (such as in a science class), like Kevin mentions above, a consumer-grade GPU might suffice.
  • jhansonxi
    "...or give chip developers a better way of visualizing and designing their PCB layouts."
    Chip developers don't do much in the way of PCB design, and PCB designers generally don't work on chip designs. They're similar skills, use similar CAD software (sometimes from the same CAD companies), and have a large degree of interaction, but they are different engineering fields. The zSpace would be useful for solid modeling of PCB assemblies, and likewise within mechanical assemblies, which is a very common need in embedded systems.

    Regarding the heart demo, this would be useful for modeling implants or planning surgeries.

    The Unity engine is becoming very popular. The Unreal engine could use some competition.
  • Barton Fiske
    Regarding the GPU questions, here is a list of zSpace-verified graphics cards:

    https://support.zspace.com/entries/21282368-zSpace-Supported-Graphics-Cards

    The key functionality is the ability to deliver a quad-buffer stereo visual via DVI or DisplayPort.

    Barton Fiske, zSpace
    bfiske@zspace.com