The Meta 2 Augmented Reality Dev Kit, Hands On

This year’s Game Developers Conference (GDC) was brimming with virtual reality devices and games, but there was room for some augmented reality (AR) companies to make their mark at the show, too. One such company was Meta, which showed off its Meta 2 AR HMD.

The demo was held in a dark room at the W Hotel, right across the street from the Moscone Center. The HMD was initially hidden underneath a thin black cloth, which Ryan Pamplin, Meta’s vice president of sales and partnerships, whisked away like a magician to reveal the device. After I had worn it for a few minutes of calibration, Pamplin walked me through some of the device’s features.

Things To See And Hold

The demo was split into a series of short segments. To start, I saw a large, 3D moving image of the Earth, which showed off the HMD’s 2560x1440 resolution. For the most part, the image was crystal clear, and I could see the details of the various clouds moving across the planet. However, the border in the middle of the HMD’s screen broke the illusion, as certain parts of the planet didn’t align in the center of my view.

Moving on, I also tried my hand (pun intended) at playing with digital objects with my physical hand. A virtual basketball was placed in front of me, and I put my hand underneath it. As I pulled my hand away, the ball fell onto the physical table and bounced a few times before I caught it again. I was surprised by the low latency when I stopped the basketball in mid-air. However, the hand tracking wasn't completely accurate, as a portion of the ball sank into my hand.

The Meta 2 also allows you to watch movies and videos. A short video played in front of me while Pamplin joked that other TV companies are jealous of Meta’s viewing capabilities because of the lack of bezel on the virtual screen. Indeed, the video screen was akin to a borderless window. I could also look at it from different angles, so you could (in theory) share the same screen with multiple HMDs at the same time.

A Productive Workspace

The developers also believe that the HMD could be a potential replacement for your office workspace. Another demo showed a series of tabs on a web browser. By clicking on one page, I could open it and “grab” it in space with both of my hands. I could also use my hands on either corner of the open window to expand its size. In essence, I could have multiple pages open and around me at the same time. With the help of a Bluetooth-connected keyboard, I could also type in one of the pages, which had Google Docs loaded.

Online shopping could also improve in AR. Pamplin showed me an Amazon page for a pair of Nike shoes. By “clicking” on the image with my hand, the shoe popped out of the page and turned into a 3D model, which I could manipulate with my hands. Obviously, in order for this to work with other brands, Meta will need to work with throngs of Amazon vendors.

Perhaps the most interesting demo was video calling. Pamplin showed me a direct stream of one of his colleagues in another room in the hotel. However, Pamplin took the feature one step further and brought his colleague into my virtual world, at least from the shoulders up to his head. The image quality wasn’t the best (I could barely make out his facial features due to connection issues), but it was astounding to see that video calling was possible in virtual space.

The final demo showed a detailed 3D diagram of one of SpaceX’s satellites. Pamplin then told me to grab the satellite with my hands and turn around. In front of me was the moving image of Earth again. I placed the satellite in orbit and watched it move around our little blue planet while the “Sunrise” fanfare from “Also sprach Zarathustra” played in the background.

Quite The Experience

I took off the Meta 2, astounded by what I saw in the various demos. My only experience with AR prior to the Meta 2 was Microsoft’s HoloLens at last year’s E3. However, it seems that Meta is a step or two ahead of Microsoft in terms of development.

The main problem I had with HoloLens was its minuscule field of view. Pamplin told me that the Meta 2's field of view was increased to 90 degrees. (This was an oft-requested change by most of the users who tried the first Meta prototype.)

Even though it’s considered to be a dev kit, you can pre-order it for $949. The high price, however, seems somewhat justified, as the Meta 2 comes with an HD camera, a depth camera, a six-axis inertial measurement unit (IMU), four speakers and a sensor array for hand tracking.

Just like virtual reality in the past few years, augmented reality is still in the developmental stage. However, devices like the Meta 2 show that AR isn’t just a figment of the imagination. It’s actually coming to life.

Follow Rexly Peñaflorida II @Heirdeux.

  • naturesninja
    This is why I'm not jumping on the VR train just yet.
    Reply
  • beetlejuicegr
The VR train is still running on coal; I will definitely jump on when it's like those super-fast trains in Japan, hehehe
    Reply
  • kyle382
naturesninja said:
    This is why I'm not jumping on the VR train just yet.

    lol oook...think it might be a little hardcore to wait for refined AR and the software to match?
    Reply
  • naturesninja
kyle382 said:
    lol oook...think it might be a little hardcore to wait for refined AR and the software to match?

    I've been around the industry a LONG time. VR is not ready for mainstream, and is being pushed out too early in my opinion. It is far from a mature technology, and I doubt it will last long in its current form before it gets replaced with something that smart developers will focus their time on. AR just makes more sense in the real world. HUD tech has already paved the way for AR for many years, and has allowed people like myself to develop applications for such technology. So no, I don't have to wait.
    Reply
  • DonGateley
Did the resolution of the overlay look any different from any other 1440-line display?
    Reply
  • joneb
The only way AR will make VR redundant is if it can deliver a fully immersive world experience as good as or better than VR's, including a much wider field of view and no leakage of the real-world environment, such as light and even sound. That is the whole point of immersion in a VR environment. Is that going to happen anytime soon?
    Reply
  • braneman
I'm really looking forward to AR stuff. I've seen it done on the 3DS with those AR cards, so the technology is mostly there (I really liked that you could create a hole in something via one of those QR-code lookalikes whose name I can't remember). I also think this will require a lot less power than VR does, because it doesn't need to render 1080p for each eye at 90 FPS to prevent motion sickness.

    What I think it would be best for is some kind of virtual sandbox game: not an actual sandbox, but a literal sandbox with sand that you just AR onto a desk. Or maybe something like Warhammer; it would be so awesome to set down some pieces on a table and AR in a game of Warhammer around your office furniture or something.
    Reply
  • kyle382
naturesninja said:
    I've been around the industry a LONG time. VR is not ready for mainstream, and is being pushed out too early in my opinion. It is far from a mature technology, and I doubt it will last long in its current form before it gets replaced with something that smart developers will focus their time on. AR just makes more sense in the real world. HUD tech has already paved the way for AR for many years, and has allowed people like myself to develop applications for such technology. So no, I don't have to wait.

I don't think many people will question the statement "VR is being pushed out a bit early" or that "it is far from a mature technology". If you know of some affordable AR hardware with fun applications, I hope you'll share it with us. I am not being sarcastic; please share.

    Reply
  • bit_user
    What I really want to know is whether the synthetic content shown by the Meta 2 HMD has the same depth of field as objects in real space at the same distance.

Maybe an easier question to answer, in retrospect, would be about the range of depths at which the synthetic content in the Meta 2 HMD appeared. Was it a fairly narrow range, or did they demo things ranging from a couple of feet all the way to 100 feet?

    If the device has a fixed depth of field, as I'm guessing, then they'll probably restrict the depth of synthetic content to a fairly narrow range. This will limit the sort of applications it can support.
    Reply