Tobii Demos SteelSeries Sentry Eye Tracker At CES

Tobii has been working with SteelSeries to develop the Sentry Eye Tracker for some time, and at CES Unveiled we had the opportunity to see it in action. Even though the demo was short and offered only a small glimpse of what the device can do, it certainly has the potential to take off in the gaming peripherals market.

Our demo featured a man walking through a desert. With the Sentry Eye Tracker, players press a single button to activate the device, then use their eyes to pick up an object, such as a rock, and crush it to bits. The lag between eye movement and the corresponding action on screen was almost nonexistent.

The Sentry Eye Tracker can also control the game's camera. Players can look left, right, up, or down from their character to simulate the feeling of actually looking around instead of moving the mouse across a surface. Tobii also has plans to use the device to interact with NPCs in the game. By making eye contact with them, players can begin a conversation. Or start a fight. Whatever you're in the mood for.

The company originally stated that the eye tracker would be available after CES, but it showed us a ready-to-ship product that will cost gamers $199. Considering what we saw in the demo, Tobii is definitely on the right track with this peripheral. While it may not replace the mouse in the near future, it's certainly an interesting device that adds another layer of immersion to gameplay.

Follow Rexly Peñaflorida II @Heirdeux. Follow us @tomshardware, on Facebook and on Google+.

  • ldun
    I wonder if this could be used to replace a Track-IR setup for flight sim games. Might be really neat if it's that good.
    Reply
  • JeffKang
    >While it may not replace the mouse in the near future, it's certainly an interesting device

    Eye-tracking won’t have the precision of a mouse for some time.
    However, you can combine eye-tracking with other inputs.

    Eye-tracking companies already offer features that let mouse control and eye control work in conjunction.
    E.g. eye-tracking is used to initially teleport your cursor near your target, and then you use the mouse to place the cursor precisely.

    *Mouse-cursor-teleport user setting: time that the mouse-controlled cursor must be at rest before eye control is involved again (mouse precision still in use)*

    Tobii has a time setting that determines how quickly a teleport-to-point-of-gaze-upon-movement-of-mouse will occur: you set how long the mouse-controlled cursor has to sit still before moving the mouse again causes a teleport.
    The return of eye control can mean either that gaze controls the cursor directly again, or that the next movement of the mouse warps/teleports the cursor to the point-of-gaze.
    It’s for saying, “wait, I’m still using the mouse for stability and precision; the mouse-controlled cursor is still working in this area”.

    *Mouse-cursor-teleport user setting: point-of-gaze must be a certain distance from the mouse-controlled cursor before eye control is involved again (eye-tracking is activated for larger cursor jumps)*

    Another setting is the distance that the point-of-gaze has to be from the mouse-controlled cursor before gaze-teleporting is involved.
    It’s for saying, “some of the targets are close enough, so I can just use the mouse;
    I’ll save eye teleporting for when the distance is large”.
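
    A minimal sketch of how the two settings above could work together, assuming a simple event loop; the names (GazeAssistedCursor, REST_TIME_S, MIN_JUMP_PX) are made up for illustration and are not Tobii’s actual settings API:

    ```python
    import math
    import time

    # Illustrative thresholds; real values would come from the user settings described above.
    REST_TIME_S = 0.4    # how long the mouse must sit still before the next move can teleport
    MIN_JUMP_PX = 150    # how far the gaze point must be from the cursor to bother teleporting

    class GazeAssistedCursor:
        def __init__(self):
            self.cursor = (0.0, 0.0)
            self.gaze = (0.0, 0.0)
            self.last_mouse_move = time.monotonic()

        def on_gaze(self, gx, gy):
            # The eye tracker only records the latest point-of-gaze; it never moves the cursor by itself.
            self.gaze = (gx, gy)

        def on_mouse_move(self, dx, dy):
            now = time.monotonic()
            rested = (now - self.last_mouse_move) >= REST_TIME_S
            far_enough = math.hypot(self.gaze[0] - self.cursor[0],
                                    self.gaze[1] - self.cursor[1]) >= MIN_JUMP_PX
            if rested and far_enough:
                # Coarse jump: warp to where the user is looking, then let the mouse refine.
                self.cursor = self.gaze
            else:
                # Still working in this area: plain mouse movement keeps precise control.
                self.cursor = (self.cursor[0] + dx, self.cursor[1] + dy)
            self.last_mouse_move = now
            return self.cursor
    ```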

    Eye-tracking + keyboard: eye-tracking doesn’t have the precision of a mouse, but if an interface element and its hit state are large enough, a “click-where-I’m-looking” keyboard button will work.

    Eye-tracking + keyboard two-step process: some eye-tracking features let an eye-controlled cursor snap, zoom, etc. to a smaller target element, or project smaller elements into larger ones.
    Because that makes it a two-step process, even if you can instantly teleport the cursor, a “both-hands-on-keyboard + eye-tracking two-step process” may not be suitable in certain situations.
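
    A minimal sketch of a “click-where-I’m-looking” key, assuming hypothetical helpers (hit_test, click_at) supplied by the host application; none of these names come from Tobii or SteelSeries, and the size check just marks where a two-step snap/zoom fallback would kick in:

    ```python
    from collections import namedtuple

    # Hypothetical target description; a real application would expose its own hit-testing.
    Target = namedtuple("Target", "x y width height")

    MIN_TARGET_PX = 48   # assumed minimum hit-state size for reliable raw-gaze clicking

    def on_gaze_click_key(gaze_x, gaze_y, hit_test, click_at):
        """Fired by a dedicated key while both hands stay on the keyboard."""
        target = hit_test(gaze_x, gaze_y)                 # element under the point-of-gaze
        if target and min(target.width, target.height) >= MIN_TARGET_PX:
            click_at(gaze_x, gaze_y)                      # hit state is large enough: click directly
            return True
        # Too small for raw gaze accuracy; fall back to a two-step snap/zoom refinement.
        return False

    # Example: pretend the user is looking at a 200 x 80 px button near (500, 300).
    if __name__ == "__main__":
        demo_button = Target(500, 300, 200, 80)
        clicked = on_gaze_click_key(
            510, 320,
            hit_test=lambda x, y: demo_button,            # stub: always "finds" the demo button
            click_at=lambda x, y: print(f"click at ({x}, {y})"),
        )
        print("clicked:", clicked)
    ```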

    Eye-tracking teleport + mouse and keyboard: however, whenever you do need the mouse, eye-tracking will still be there to provide an initial cursor teleport.

    Without eye-tracking: if you have both hands on the keyboard, you lose time moving one hand to the mouse and bringing it back to the keyboard.
    You’re usually choosing between both hands on the keyboard or one hand on the mouse.

    With eye-tracking: it can be used either with both hands on the keyboard (a click-what-I’m-looking-at keyboard button) or with one hand on the mouse (an initial cursor teleport, then the mouse for precision).
    You never have to forgo anything to use eye-tracking; it’s always ready to make normal computer interaction faster.

    *Eye-tracking can make on-screen buttons, and thus macros, more prevalent*

    Eye-tracking can make macros more popular because it allows for easier activation, and thus more use, of custom widgets and on-screen buttons.
    A collection of custom on-screen macro buttons with recognizable, self-documenting text labels is easier to maintain than a collection of Control + Alt + Shift + <whatever> keyboard shortcuts for activating macros.
    In other words, Tasker macros on mobile have a better chance of adoption than AutoHotkey or AutoIt shortcuts on the desktop.
    Reply