Xreal's $700 Air 2 Ultra AR glasses put Apple Vision Pro and Meta Quest 3 in their crosshairs

Xreal Air 2 Ultra
(Image credit: Xreal)

I’ve reviewed several of Xreal’s augmented reality glasses over the past year, the most recent being the Air 2. However, the company is back again with a more feature-packed entry: the Air 2 Ultra. While the Air and Air 2 focused primarily on allowing you to enjoy content on a simulated 100+ inch virtual screen while still seeing your environment around you, the Air 2 Ultra adds six degrees of freedom (6DoF) tracking. 

Xreal first introduced 6DoF tracking with the Nreal Light, but the Air 2 Ultra puts more powerful hardware into a sleeker, more stylish titanium frame. Despite the lightweight frame, the Air 2 Ultra is a bit heavier at 80 grams, compared to 72 grams for the Air 2. The Air 2 Ultra includes two new environmental sensors (cameras) embedded in the glasses-style frame (which Xreal claims are the smallest in the industry), allowing real-time tracking of the user’s position within a 3D space and even enabling hand tracking. This should allow for a more immersive AR experience that mixes virtual content with real-world environments, similar to what's possible with the Meta Quest 3.

The rest of the hardware is similar to what we’ve seen with the Air 2, which means you get a Full HD OLED display per eye, offering a maximum refresh rate of 120Hz and maximum brightness of 500 nits. However, you now get a 52-degree field-of-view, up from 46 degrees with the Air 2. When viewing content (such as games or movies) on the glasses, it’s like looking at a 154-inch screen at a distance of 13 feet. Trust me, it’s a trippy experience when you first try it. 
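The quoted screen size follows from basic trigonometry: a flat screen filling a 52-degree diagonal field of view at 13 feet works out to roughly 150 inches. A quick sketch (the function name is mine, not Xreal's):

```python
import math

def virtual_screen_diagonal(fov_deg: float, distance_in: float) -> float:
    """Diagonal (inches) of a flat screen that fills a given
    diagonal field of view at a given viewing distance."""
    return 2 * distance_in * math.tan(math.radians(fov_deg) / 2)

# 52-degree diagonal FOV at a virtual distance of 13 feet (156 inches):
print(round(virtual_screen_diagonal(52, 13 * 12)))  # ~152 inches
```

The small gap to Xreal's quoted 154-inch figure presumably comes from rounding in the marketing numbers.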

Xreal Air 2 Ultra

(Image credit: Xreal)

Other features include speakers integrated into the frames with directional audio, allowing you to listen to music, movies, or games without blasting those in your vicinity with unwanted noise. The glasses connect to your smartphone, tablet, or PC with a USB-C cable, which means the Air 2 Ultra is natively supported by most Android devices and Apple's latest iPhone 15 family. Older iPhones can be used with the Air 2 Ultra by purchasing a separate adapter.

The company points to the $499 Quest 3 and Apple's $3,499 Vision Pro as direct competitors. Both of those devices offer more powerful hardware and higher-resolution displays. However, they provide a passthrough digital view of the outside world rather than the direct optical view the Air 2 Ultra offers. Another big difference is that the Air 2 Ultra looks more like a regular pair of sunglasses, albeit a bit bulkier, while the Vision Pro and Quest 3 quickly draw attention with their unorthodox designs.

Xreal is targeting developers and general consumers with the Air 2 Ultra, and the glasses will be available in March for $699. However, the company will offer a $100 discount for owners of the Nreal Light who preorder the AR glasses.

Brandon Hill

Brandon Hill is a senior editor at Tom's Hardware. He has written about PC and Mac tech since the late 1990s with bylines at AnandTech, DailyTech, and Hot Hardware. When he is not consuming copious amounts of tech news, he can be found enjoying the NC mountains or the beach with his wife and two sons.

  • bit_user
    Any comparisons between this and Apple's Vision Pro are superficial, at best.

    What truly sets apart Vision Pro is the sophistication & refinement of its algorithms. It can place virtual objects somewhat convincingly in the real world, which is extremely hard to do well. It not only requires accurate tracking and depth extraction, but also light source estimation. Furthermore, that requires a lot of compute power, which Vision Pro gets from its M2 SoC + R1 ASIC.
  • hotaru251
    this article is just a mess.

    these are not competing with Quest 3. Quest 3 is a VR headset.

    these are more AR.

    and compared to the Apple headset... it likely isn't even in the same ballpark in AR for that.
  • DavidLejdar
    hotaru251 said:
    this article is just a mess.

    these are not competing with Quest 3. Quest 3 is a VR headset.

    these are more AR.

    and compared to the Apple headset... it likely isn't even in the same ballpark in AR for that.
    The article just states what the company is aiming at.

    And the Meta Quest 3 does have AR as well. So someone looking for AR only, or primarily, may be interested. E.g., to me personally, such glasses would be more appealing for use while traveling (on a train), i.e. to watch a movie or to browse (if the text is clear enough).

    And the Vision Pro may be better, but also a lot more expensive. And depending on what one wants to use it for, it may be way over the top. I.e. again personally, I got me a Pico 4 last week for VR. The resolution is just about the same as with Meta Quest 3, and I use my PC with that for stuff beyond e.g. Ultimechs and Moon Rider XYZ. So if I also go AR, I wouldn't need it to be able to run everything (on its own), and other factors would be relevant, such as the usability while traveling (and from the price difference I could easily get me a top GPU, and also haptic gloves).
  • hotaru251
    DavidLejdar said:
    And Meta Quest 3 does have AR as well.
    i never said it didn't. Just that the focus of a VR headset is drastically different from an AR headset.

    there's an entire market for JUST AR glasses.

    Not one of em would say their competition is Quest 3.
  • kealii123
    I just want an AR monitor that floats above my laptop screen. From all accounts, a static AR display will give you motion sickness, but is 6DoF necessary? Can it work with 3? Is the FoV on this enough? Anyone know?

    Either way, this 1080p per eye probably isn't sufficient with 6DoF.
  • A296
    Never mind the AR sensors or the processing power or anything else on the Xreal glasses. The first thing they need in order to be competitive in ANY way is to widen their FOV. I have a pair of the first-gen Nreal Air, and the fact that it maxes out at the equivalent of a 27-inch monitor 2 ft away from your face in terms of FOV makes it impossible to put things in the extreme periphery of vision. The Q3 has a 100-degree FOV; the Vision Pro is estimated to be about the same, iirc. The new glasses are increased to 52 degrees, which I assume also increases the vertical FOV if they keep the aspect ratio of the screens the same. This is still going to require elements to be closer to center than on either of the headsets they supposedly target.

    I tried using their sidescreen mirroring and it was just too close to center. This is of course personal preference but I just don't see it being a good experience until they can widen the fov so things don't interrupt the front and center view.
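As a sanity check on that monitor equivalence, the diagonal angle a flat screen subtends is straightforward to compute (illustrative numbers; the function name is mine):

```python
import math

def monitor_fov_deg(diagonal_in: float, distance_in: float) -> float:
    """Diagonal angle (degrees) that a flat screen subtends
    at a given viewing distance."""
    return math.degrees(2 * math.atan(diagonal_in / (2 * distance_in)))

# A 27-inch monitor at 2 ft subtends roughly 59 degrees diagonally,
# so even the Air 2 Ultra's 52-degree FOV falls somewhat short of it.
print(round(monitor_fov_deg(27, 24), 1))
```

The exact equivalence depends on how you measure, but it confirms that these glasses cover only about half the angular area of the Quest 3's ~100-degree FOV.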
  • abufrejoval
    Warning: If you own a PC, an Android phone, or a non-Asian male head, you may not want to buy anything from Xreal! With that out of the way, let's go deeper.

    I got curious enough from an earlier article like this to buy an Air² Pro, and I was seriously disappointed. I really should have returned them, but didn't, because I run a kind of curiosity cabinet as part of my job, which includes near misses... some of which aren't even that close.

    And Brandon: "crosshairs" implies at least a remote possibility of hitting, and that is clearly not the case here. Your headlines for Xreal are misleading.

    So here are the details:

    Software
    PC software drivers are part of the product, but not included.

    Windows drivers have been promised, promised again "very soon now", and published as betas, but have failed to arrive or work. I've tried with a vast variety of hosts, from laptops with Alt-DP/USB3/4 ports to an RTX 2080 Ti, one of the very few cards that supports an accelerated Alt-DP+USB port (see below for the longer story), and the Air² just fails at the setup phase.

    The only operational mode that works with PCs is using the glasses as a monitor, in one of two modes:
    1. THD (1080p) at up to 120Hz, with the displays for both eyes getting a mirror image
    2. 32:9 at up to 60Hz, with the display split in the middle so each eye gets half

    That doesn't require any software drivers to work, but is a far cry from anything "augmented reality".

    That can be a useful thing to own and operate e.g. in a train, but that's not what Xreal is advertising.

    What exactly they are advertising is actually a little hard to fathom, because the material is mostly extreme-detail renders and beautified smiles, with very few concrete features or explanations.

    But it seems to entail or at least include the ability to create a wall of displays in a virtual space around you with the ability to move between them by turning your head.

    That facility does not exist on a PC for lack of working software, even if it is advertised.

    That is supposed to change "real soon now" but never has, judging from the reports on the Xreal forum, which I can only recommend you dive into before buying: not doing that beforehand was my main mistake.

    Hardware
    The first issue is IPD: its design point seems to be 60mm eye-to-eye and mine are 68mm apart, so the outer edges on both eyes are fuzzy. It may be ok for a movie, but it precludes desktop work, reading, writing, coding, browsing etc. because of the eye strain and the need to basically switch eyes on every line.

    IPD cannot be changed on either side and there is only one version of the glasses that, according to the data I was able to gather, would work well with Asian males and Western females: Asian females and Western males would require either two additional sets of glasses or adjustability for ergonomic desktop-augmentation/replacement use.

    What is adjustable is the riding height on your nose, but that may also not be enough for your head: that part worked for me, once I discovered how to adjust the angle of the temples.

    The display within the glasses covers the upper central portion of your vision, and that area is far from transparent when the displays are off: before reality is augmented in any way, quite a lot of it is subtracted first. Demanding 100% transparency in the active display area may be impossible at the budget of these devices, but the level of blocking is too high to walk around safely with the glasses on.

    Unfortunately that continues with the lower part of the glasses, which in theory should allow all of reality to pass through, unless you decide to turn it off (in three levels) with the "Pro" variant. Simply having nothing at all there, or glass that is really transparent, would save some use cases.

    As things are, reality has little chance to pass through at the zero-blockage setting, nor does it get entirely blocked at the 100% setting: Xreal delivers an extra cover for a reason, which (unlike your typical shades) isn't something you'd mount or dismount while wearing the glasses.

    That pretty much eliminates one of the prime use cases I had in mind when I bought the glasses: using them as a secondary (and private) display during laptop work, or in fact while dealing with paper on a desk: far too little gets through and keeping the glasses clean would be a challenge with extended use.

    PC platform issues
    The Air requires, over a single cable, a DisplayPort feed for its displays and a USB back-channel to send sensor (gyro/compass) data back to the host.

    And in active mode, the host is responsible for computing the spatial projection.

    It turns out there are practically no qualifying PCs that offer both such a port and the graphics power.

    Only one of my many GPUs, an RTX 2080ti, offers a combined USB-(Alt)DP port, which should work with Xreal's beta drivers (it doesn't because the latest BETA drivers from June 2023 don't support the Air² "yet").

    None of the older and none of the newer dGPUs offer USB-C display outputs (which include the USB inputs), so Xreal doesn't get the sensor data it needs.

    The laptops that do offer Alt-DP and USB on their USB-C ports (which are often also Thunderbolt ports) have the required interfaces, but then fail to deliver the graphics power necessary to create the virtual screen projections at the required performance.

    It's not that giant performance seems to be required, because a current smartphone should be good enough. But from what I could gather from the sparse comments the (single?) Xreal developer posted in their forum, the typical Intel HD iGPUs do not qualify; so far they recommend an Nvidia GPU... which tends not to have USB outputs.

    Android platform issues

    Google!

    Google's drive to turn the open source Android ecosystem into a locked down Apple clone is closing the doors on things like functioning display port outputs on Android phones.

    When Xreal started with Android 11, many modern Snapdragon based phones had both sufficient power and a working DP+USB3 port.

    Since then new Android releases and newer generation hardware have closed doors that Xreal depends on.

    There are no signs of that trend reversing, and Android is dysfunctional here by design, another detail that Xreal fails to mention.
    That situation would require regulatory pressure and custom ROMs to improve.

    My personal judgement
    Xreal's Air² Pro, just like all prior generations, is not a product today. And pushing out "new" generations of what is sold as a product, when they may at best be judged extra iterations of a beta, doesn't bode well for any of those beta iterations ever becoming useful at a product level.

    Xreal are heroes for trying to push the envelope, who have become the innocent victims of giants pursuing an ever more narrow and closed future for "their" platforms. But when they continue to sell consumers hardware that simply cannot perform at the level of their dreams, they turn from victim to villain simply by overselling.

    The usefulness of Xreal is extremely limited today. If that niche is big enough for you, go ahead, but make sure you have a return option to protect you.

    Otherwise I can only recommend you dive deep into the forum or stay away until they can deliver what you think they advertise.
  • abufrejoval
    kealii123 said:
    I just want an AR monitor that floats above my laptop screen. From all accounts, a static AR display will give you motion sickness, but is 6DoF necessary? Can it work with 3? Is the FoV on this enough? Anyone know?

    Either way, this 1080p per eye probably isn't sufficient with 6DoF.
    I have the Air² Pro glasses (and quite a few VR headsets).

    Motion sickness AFAIK is caused by your in-ear balance sensors and your visuals not being aligned sufficiently well.

    In passive mode, where the Air simply acts as an extra display, that's not an issue as your brain pretty much treats it like dirt on your glasses that follows your head movement and is static relative to eye movement.

    In active mode, where the projected perspective should change with your head movement, the fact that you can still see the rest of your environment in the areas the glasses do not cover should keep your brain happy: your laptop screen would ground you.

    But even with the Air fully blocking outside inputs (with the extra cover) I've not experienced any issue there, because the sensors are good enough not to require inside-out cameras just for orientation.

    I did not try to walk around in full block mode...

    The main issue with the Air for the use case you describe is the fact that they are not transparent enough in the lower part for a laptop display to get through: perhaps if your screen can throw 1,000 nits it might work, but with all of my laptops (which can otherwise be far too bright at max settings), the same max setting that is uncomfortable without the Air is too dark with it.

    And that is for black-on-white word processing or browsing.

    And then there is the issue of software: the floating displays require a GPU to perform the projection into the virtual space. From what little Xreal posts about why the Windows drivers are delayed, laptop iGPUs are typically not powerful enough to do that job at sufficient performance.

    If you use the extra Beam box you'll only get a single display, and at 70Hz refresh, but it shows that the quality of the sensors in the Air is good enough to provide perfect orientation without any perceivable drift: if the hardware driving the glasses is fast enough, motion sickness shouldn't be a concern.

    I've walked around with the Beam connected to my phone to feed video and spatial orientation of the screen was extremely stable even with (relatively) rapid head movements.

    The main issue is that the glasses simply subtract too much reality before they do any augmentation: they are too opaque where they should be fully transparent.
  • abufrejoval
    Xreal glasses are purely passive displays and sensors, without local processing or electrical power.
    They act either as two side-by-side screens at THD@60Hz or as one mirrored THD screen at up to 120Hz, driven from a DisplayPort input on a USB-C connector.
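Those two modes trade refresh rate for width at what appears to be the same pixel throughput (assuming the 120Hz figure from the article), consistent with a fixed link bandwidth budget. A quick check:

```python
# Pixel throughput of the two passive PC modes is identical, which is
# why doubling the horizontal resolution halves the refresh rate.
mirrored = 1920 * 1080 * 120  # THD mirrored to both eyes at 120Hz
split = 3840 * 1080 * 60      # 32:9 split between the eyes at 60Hz
print(mirrored == split)  # True
```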

    They also include gyro/magnetic-field/acceleration sensors for spatial orientation (Air/Air²), and the Ultras and the Nreal Light additionally include cameras: all of these require a USB3-capable back-channel on the same USB-C port whence they receive display output.

    The host is responsible for everything, from electrical power to all the magic you see on screen. So potentially the abilities are quite huge and capable of growing over time, ...as you replace the host.

    For all the other problems and limitations please see my previous post, which is about the Air² which lacks the camera inputs.

    I won't believe Xreal's gesture recognition claim until I've seen it work and measured just how much host power it consumes.

    I own a Magic Leap, which uses a discrete set of infra-red time-of-flight sensors to generate a 3D digital twin of your hands for the ultimate real-time 3D interactivity.

    Somewhat similar to LIDAR on cars, it creates a 3D point-cloud input, which lets a skeletal machine-learning model derive gesture data at a much higher level of abstraction than when you attempt the same from flat 2D inputs, even with stereo vision. And yet it took quite a bit of host processing power to make that work, to what was still a rather limited degree.

    Magic Leap needs a good view of your fingers to create a matching digital twin at good quality. So when they offered a special holder to combine them with my Oculus DK2 headset I was very enthusiastic... until I realized that the top-down view from the headset onto my hands mostly obscured the fingers: there aren't terribly many jobs you do with your fingers facing you...

    For a long time the industry believed that you'd need both 3D and 2D cameras, plus sensor fusion, to generate quality digital twins of your environment for orientation, gesture recognition, etc., and I don't know just how many sensors and cameras of each type Apple wound up putting into their "glasses".

    But you can be sure that Xreal requires at least that much computing power, perhaps more if they need to use ML to infer sensor data they don't physically have. And that requires a huge, well-integrated software stack on top of hardware that most laptops (the only PCs with the proper ports an Air requires) don't have.

    So if working gesture recognition at a certain level of detail and precision is your expected use case, make sure you wait until they can demo it working before you buy.
  • kealii123
    It seems xReal's real business plan is to sell B2B

    https://www.engadget.com/the-asus-airvision-m1-is-a-wearable-display-for-multi-taskers-060237509.html