In the near future you could be controlling certain applications with nothing but your eyes. Students at Imperial College London have shown that such a device can be built for less than $30.
The idea is less exotic than it sounds and relies on commonly available hardware. The students used game console cameras capable of tracking 3D objects, such as Microsoft's Kinect. Instead of tracking an entire body, however, two cameras were mounted on a head-worn rig "outside of the user's field of vision" and recalibrated to track the wearer's pupils. According to the students working on the project, the concept worked well, even if the technology was only demonstrated with a simple Pelota-like game.
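The students have not published the details of their tracking pipeline, but the core idea of camera-based pupil tracking can be illustrated with a toy sketch: in a close-up eye image the pupil is usually the darkest region, so thresholding the frame and taking the centroid of the dark pixels gives a rough gaze point. Everything below (the synthetic frame, the threshold value) is an illustrative assumption, not their actual method.

```python
import numpy as np

def track_pupil(frame, threshold=50):
    """Estimate the pupil centre as the centroid of the darkest pixels.

    A minimal stand-in for a camera-based pupil tracker: `frame` is a
    grayscale close-up of one eye; pixels darker than `threshold` are
    assumed to belong to the pupil.
    """
    ys, xs = np.nonzero(frame < threshold)
    if len(xs) == 0:
        return None  # no dark region found in this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 "eye" frame: bright sclera, dark pupil centred at (60, 40).
frame = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
frame[(xx - 60) ** 2 + (yy - 40) ** 2 <= 8 ** 2] = 10

cx, cy = track_pupil(frame)
print(round(cx), round(cy))  # → 60 40
```

A real system would add calibration (mapping pupil position to screen coordinates) and filtering to smooth out jitter between frames.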
A commercial product would need "appropriate" funding and about three more years of development, the researchers said.
True, but then again the cursor in a game is always at the center of the screen, so whenever you look at anything else the cursor will frantically try to jump to that part of the screen hundreds of times a second.
Basically, you would be spinning around uncontrollably in-game.
I had this issue when I tried to use other motion technology before; the game has to be designed to use it.
We actually experimented with that back in the late 70s!
Think of the Wii. Call of Duty on the Wii gave you free aim of your gun all over the screen and only turned the camera when you pushed into certain deadzones. It was clunky, but I was still able to become pretty accurate after playing for a while. Now think of PC gaming where your eyes control where your gun is pointed on the screen, and your mouse moves where your camera is looking. That actually sounds awesome to me, much better than the virtual reality stuff everybody else is talking about.
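The decoupled scheme described in that comment (eyes aim the gun, mouse turns the camera) can be sketched as a simple input-mapping loop. Every name and number here is a hypothetical illustration of the idea, not any shipping game's input code.

```python
from dataclasses import dataclass

@dataclass
class ControlState:
    """Hypothetical decoupled FPS controls: gaze aims, mouse steers."""
    cam_yaw: float = 0.0    # camera heading, degrees
    cam_pitch: float = 0.0  # camera elevation, degrees
    aim_x: float = 0.5      # gun's on-screen aim point, normalised 0..1
    aim_y: float = 0.5

def apply_input(state, gaze_xy, mouse_dx, mouse_dy, sensitivity=0.1):
    # Gaze directly sets where the gun points on screen...
    state.aim_x, state.aim_y = gaze_xy
    # ...while mouse deltas independently rotate the camera,
    # with pitch clamped so the view can't flip over.
    state.cam_yaw += mouse_dx * sensitivity
    state.cam_pitch = max(-89.0, min(89.0, state.cam_pitch - mouse_dy * sensitivity))
    return state

s = apply_input(ControlState(), gaze_xy=(0.8, 0.3), mouse_dx=50, mouse_dy=-20)
print(s.aim_x, s.cam_yaw, s.cam_pitch)  # → 0.8 5.0 2.0
```

The design point is that the two input streams never fight each other, which avoids the "spinning around uncontrollably" problem raised earlier in the thread.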
way too expensive and "useless" lol