So, why hasn't Microsoft come up with this one?
Microsoft's Windows 8 Metro UI may be pretty, but it is not well suited to today's hardware and existing user expectations. There is considerable doubt whether users will be willing to give up the mouse as their main input device and reach across a keyboard to swipe through potentially multiple screens on a bouncing display with their finger. Microsoft could have eliminated some of that doubt by supporting Kinect integration in PC hardware, skipping touch control, and moving directly to gesture control.
Of course, that has not happened yet and may take a few tech iterations of Kinect. How such gesture control could in fact work one day is being demonstrated by EyeSight. The company already offers EyeCan to support the Metro UI, as well as EyePoint to support pointing devices for more granular control. Now the company is adding EyeKeys, which allows users to map gestures to shortcuts and create their own gesture controls for certain actions. EyeSight says that its technology "works even with cameras of low or varying quality." Of course, it would be foolish to expect fine-grained control under those conditions.
We have no idea whether EyeSight works as advertised, but the idea is plain common sense, and it is rather perplexing that such technology did not have a more prominent place in the Windows 8 launch spectacle. Perhaps Microsoft should have spent less time on Surface and more on gesture control?