Tobii Focuses On Foveated Rendering, More Games Get Eye-Tracking Support

Tobii continues to make headlines with its advancements in eye-tracking technology. Two months ago, the company released a new eye tracker, the Tobii Eye Tracker 4C, with a new chip that reduced the peripheral’s overall CPU load and power consumption. At CES this year, the company didn’t release a new device, but rather announced a few other things, such as more games that will support eye tracking, its first steps into integration with mobile devices, and the next step of combining eye tracking technology with virtual reality.

New Directions In VR And Mobile

The next challenge in virtual reality is all about power--specifically, how to use less of it and still provide the best experience in a virtual environment. Last year, we learned about foveated rendering, which uses eye tracking to render in full detail only the small region your eyes are actually focused on, while lowering detail in your peripheral vision. This would save precious resources and reduce the overall load on the GPU.
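The core idea can be sketched in a few lines: shade pixels at full rate near the gaze point and taper the work off with distance from it. This is a minimal, hypothetical illustration--the radii, falloff, and quality floor below are made-up parameters, not anything Tobii has published.

```python
import math

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius=200, falloff=600):
    """Return a per-pixel shading rate in [0.25, 1.0]:
    1.0 = full detail inside the foveal region around the gaze point,
    tapering linearly down to a coarse floor of 0.25 in the periphery.
    All distances are in pixels; the constants are illustrative only."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return 1.0
    # Linear falloff beyond the fovea, clamped to a minimum rate
    t = min((dist - fovea_radius) / falloff, 1.0)
    return max(0.25, 1.0 - 0.75 * t)

# A pixel at the gaze point gets full shading work...
print(shading_rate(960, 540, 960, 540))  # 1.0
# ...while a far-periphery corner pixel gets a quarter of it.
print(shading_rate(0, 0, 960, 540))      # 0.25
```

A real implementation would feed the tracker's gaze coordinates into the GPU each frame and apply the rate per tile rather than per pixel, but the savings come from the same falloff idea.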

This is the next stage in Tobii’s ongoing research with eye-tracking technology. If it can find a way to use eye-tracking to reduce the GPU load, it can lead to other technologies that would further optimize the VR experience. In addition to VR, the company also wants to bring the same technology to traditional PC gaming.

However, Tobii was a bit vague on its plans for foveated rendering, and rightfully so. Other companies, such as Fove, QiVARI, and SensoMotoric Instruments, are all working on bringing eye tracking to VR. More recently, Oculus acquired Copenhagen-based company The Eye Tribe, yet another eye-tracking technology group. In a way, it’s a race to see which company can provide a definitive foveated rendering experience, and Tobii believes it can make waves in the area this year.

In addition to VR, Tobii also made some headway on the mobile front, integrating its software into the Huawei Honor Magic smartphone last month. The exact functions of Tobii’s integrated technology aren’t clear, but we do know that the smartphone uses Tobii’s algorithms as well as an infrared camera. The result is that the “device can better understand the user’s current state, has greater insight into the user’s intention, and is better able to accommodate their actions.” It’s the first step toward the marriage of eye tracking and mobile devices, and it won’t be the last time we hear of Tobii in both the VR and mobile spaces.

More Games

However, the main attraction for Tobii’s eye-tracking technology is still PC games. Last November, Ubisoft’s Watch Dogs 2 and Steep were added to Tobii’s list of supported titles. Now, two more games will join the club: Crystal Dynamics’ Rise of the Tomb Raider and Techland’s Dying Light. The addition of these games brings the total number of titles supporting the eye-tracking technology to 45.

Even though there are advancements to be made in VR and mobile, it seems that Tobii’s bread and butter lies with PC gaming. More games are bound to get eye-tracking support in the coming months, and with yet another eye tracker available to the public, Tobii is slowly but surely becoming a household name.

  • Jim90
    Definitely a tech, once perfected, that will become the norm.
    Simply focus on one character on the screen and you immediately realise how much PSU, CPU and GFX power could be saved. Hugely significant!! More savings?... let's see.... reduced video memory requirements. Now multiply that across all the personal display screens on the planet. If 'spectators' are close enough to be captured then they, too, could be added in.
  • JeffKang
    >The exact functions of Tobii’s integrated technology aren’t clear, but we do know that the smartphone uses Tobii’s algorithms as well as an infrared camera.

    What is the cost of adding the infrared camera?

    There might be a better approach that doesn't require a special camera.
    An MIT research group is trying to bring eye-tracking control to everyone using basic hardware (tablets, smartphones, webcams).

    They created a deep learning app that crowdsources eye data:


    >Eye Tracking for Everyone
    >K. Krafka*, A. Khosla*, P. Kellnhofer, H. Kannan, S. Bhandarkar, W. Matusik and A. Torralba
    >IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016

    The app session only takes about a minute to complete.
    Look at the dots that randomly appear.
    Tap the left side of the screen if you see an L, and the right side of the screen if you see an R.

    The current accuracy with 1500 people is about 1 centimeter on a mobile phone, and 1.7 centimeters on a tablet.
    If the researchers can get data from 10,000 people, they think that they can reduce the error rate to 0.5 centimeters, which should be good enough for many eye-tracking applications.

    Also, infrared lights might not be needed at all.

    >“The field is kind of stuck in this chicken-and-egg loop,” says Aditya Khosla, an MIT graduate student in electrical engineering and computer science and co-first author on the paper.
    >“Since few people have the external devices, there’s no big incentive to develop applications for them.
    >Since there are no applications, there’s no incentive for people to buy the devices.
    >We thought we should break this circle and try to make an eye tracker that works on a single mobile device, using just your front-facing camera.”
  • Jeff Fx
    Right now PC gaming seems like the best use for this, so they're smart to focus on that.
    This will be useful for VR once lenses improve, but right now you could just fully render the sweet spot in VR and drop quality everywhere else, since we already have to move our heads instead of our eyes to see things clearly in VR.