UPDATED 15:28 EDT / OCTOBER 01 2013

NEWS

Microsoft is Getting Ready for Kinect-Controlled Windows PCs

Microsoft continues to support an SDK for developing applications that connect the Kinect to a standard PC, and now the company has revealed a way to use the sensor for gesture-based navigation in Windows.

Microsoft Research’s lab in Cambridge demoed a prototype gesture-controlled PC using an augmented version of its Kinect motion-sensing system. The rig detects 16 gestures and can be used to navigate Windows 8. The technology dramatically improves the Kinect’s ability to detect and track hands, letting users perform gestures such as those made familiar by touchscreen tablets and smartphones.

Users can, for example, move both hands apart or together to zoom, or sweep them to scroll through screens, performing functions that previously required a touchpad or mouse. However, Microsoft said the gesture control is not designed to replace the mouse and keyboard.
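Microsoft has not published the prototype’s code, but the idea of turning tracked hand positions into familiar touch-style commands can be illustrated with a short sketch. Everything below is hypothetical: the coordinates, thresholds and helper names are assumptions, and a real implementation would read hand positions from the Kinect’s depth or skeletal data rather than from hard-coded values.

```python
# Illustrative sketch only: maps two tracked hand positions to zoom/scroll
# commands in the spirit of the gestures described in the article.
# Coordinates, thresholds and names are assumptions, not Microsoft's code.
from dataclasses import dataclass


@dataclass
class Hand:
    x: float  # metres, left/right relative to the keyboard centre
    y: float  # metres, towards/away from the screen
    z: float  # metres, height above the keys


def interpret(prev_left: Hand, prev_right: Hand,
              left: Hand, right: Hand) -> str:
    """Return a coarse command based on how both hands moved between frames."""
    prev_spread = abs(prev_right.x - prev_left.x)
    spread = abs(right.x - left.x)
    # Hands moving apart or together -> zoom, like pinch-zoom on a tablet.
    if spread - prev_spread > 0.05:
        return "zoom in"
    if prev_spread - spread > 0.05:
        return "zoom out"
    # Both hands sweeping the same way -> scroll.
    dy = ((left.y - prev_left.y) + (right.y - prev_right.y)) / 2
    if dy > 0.05:
        return "scroll up"
    if dy < -0.05:
        return "scroll down"
    return "no action"


if __name__ == "__main__":
    before_l, before_r = Hand(-0.10, 0.0, 0.15), Hand(0.10, 0.0, 0.15)
    after_l, after_r = Hand(-0.20, 0.0, 0.15), Hand(0.20, 0.0, 0.15)
    print(interpret(before_l, before_r, after_l, after_r))  # -> "zoom in"
```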

“What we don’t want here is Tom Cruise in Minority Report,” Abigail Sellen, a Microsoft principal researcher, told The Verge. “What gestures are good at are being very casual, expressive. What they’re not good at is being precise. Touch is good at that, mouse pointer is good at that. We don’t need to do that with gestures.”

Gesture Control for Future PCs

If gesture control technology is integrated into future PCs, Windows users could control audio and video playback, switch from one program to another, or perform other actions with a simple movement of the arms or hands. The gestures include swipes, clasps and pinches, and they are detected via a Kinect positioned above the keyboard, which monitors hand movement above the keys and captures hand position in 3D space.

For Microsoft, it is essential that actions performed more easily without gestures continue to be carried out in the traditional way. By clenching a fist to grab a window and then opening the hand while moving it towards the top of the keyboard, a user can maximize the window; reversing the motion minimizes it. Moving a hand to the left or right edge of the keyboard snaps the window to the left or right edge of the screen. Similarly, one can scroll through web pages without needing an extra trackpad.
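The window-management mapping described above can be sketched as a small decision routine. Again, this is only an illustration under assumed inputs: the distances, the edge threshold and the way hand state is reported are invented stand-ins for whatever the augmented Kinect tracking actually provides.

```python
# Illustrative sketch only: one-frame mapping for the window-management
# gestures described in the article (grab with a clenched fist, open the
# hand while moving up to maximize, reverse to minimize, move to an edge
# to snap). All thresholds and inputs are assumptions.

EDGE = 0.25  # metres from the keyboard centre treated as "at the edge" (assumed)


def window_command(grabbed: bool, hand_open: bool, x: float, dz: float) -> str:
    """Map one frame of hand state to a window action.

    grabbed   -- a window was previously grabbed with a clenched fist
    hand_open -- the hand is now open
    x         -- horizontal position relative to the keyboard centre (metres)
    dz        -- change in height above the keys since the grab (metres)
    """
    if not grabbed:
        return "none"
    if hand_open and dz > 0.10:
        return "maximize"      # open hand moving towards the top of the keyboard
    if hand_open and dz < -0.10:
        return "minimize"      # the same motion reversed
    if x < -EDGE:
        return "snap left"     # hand at the left edge of the keyboard
    if x > EDGE:
        return "snap right"    # hand at the right edge of the keyboard
    return "drag"


if __name__ == "__main__":
    print(window_command(True, True, 0.0, 0.15))   # -> "maximize"
    print(window_command(True, False, 0.30, 0.0))  # -> "snap right"
```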

Microsoft aims to improve the overall PC experience with these gestures rather than replace the keyboard and mouse. The device can detect 16 different kinds of hand movement.

“We’re not doing anything precise, and nothing that requires you to hold your hands up for a long time,” explains Sellen.

 

