UPDATED 16:07 EDT / OCTOBER 01 2013

NEWS

Microsoft Kinect PC Support Promises Enhancements to Augmented Reality, Human-Machine Communication

The Microsoft Kinect system heralded a whole new way to control gaming consoles, but it became immediately obvious that it was much more than that. Within months of its release, software engineers of every stripe saw its potential as a system for turning the human body into a controller and permitting more emotive reactions in the realm of machine-human communication.

It did take Microsoft a few months to warm up to the idea (with a bit of a stumble along the way), but the company seemed to realize that it was better to work with the product's fans than against them. With that, a new era of Kinect use was born, and Microsoft even opened a research center dedicated to augmented reality and to using the Kinect to produce a better machine-human landscape.

On that same theme, Microsoft has continued its dedication with further support for the Kinect PC SDK and integration into Windows 8 and desktop navigation.

“We have a skeletal model of the human hand and we will literally just say ‘Here’s the base pose,’ now perturb it. It just uses randomization on all of the joints, knowing how the joints are allowed to move, moves them around in space and generates as many images as we need,” said Chris O’Prey, a Microsoft senior development engineer and one of the builders of the system, in a quote to ZDNet.

“It generates images that look like they’ve come from a Kinect, with the right noise, depth and image aspects. That’s what we use to build our machine learning processes and because we can do that we can build a classifier [for a gesture] in the space of about a day.”
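To make that concrete, the following is a minimal Python sketch of the perturb-and-render loop O’Prey describes, using a drastically simplified hand model with made-up joint names and limits. The real pipeline renders full depth images with Kinect-like noise, depth and image aspects; the toy feature vector here only stands in for that output.

```python
import random

# Hypothetical joint limits (radians) for a simplified skeletal hand model.
# The real system uses a full kinematic hand model and a depth-image renderer;
# this sketch only illustrates "take the base pose and perturb it within the
# range each joint is allowed to move".
JOINT_LIMITS = {
    "wrist_pitch": (-1.0, 1.0),
    "wrist_yaw":   (-0.8, 0.8),
    "thumb_flex":  (0.0, 1.2),
    "index_flex":  (0.0, 1.6),
    "middle_flex": (0.0, 1.6),
    "ring_flex":   (0.0, 1.6),
    "little_flex": (0.0, 1.6),
}

BASE_POSE = {joint: 0.0 for joint in JOINT_LIMITS}  # "Here's the base pose"

def perturb(pose, scale=0.3):
    """Randomly move every joint, clamped to what that joint is allowed to do."""
    new_pose = {}
    for joint, angle in pose.items():
        lo, hi = JOINT_LIMITS[joint]
        new_pose[joint] = min(hi, max(lo, angle + random.gauss(0.0, scale)))
    return new_pose

def render_depth_features(pose):
    """Placeholder for rendering a Kinect-like depth image (with sensor noise)
    from the posed hand model; the actual renderer is not public."""
    return [pose[joint] + random.gauss(0.0, 0.01) for joint in sorted(pose)]

def generate_training_set(n_samples, label):
    """Generate as many labelled samples as the gesture classifier needs."""
    return [(render_depth_features(perturb(BASE_POSE)), label)
            for _ in range(n_samples)]

if __name__ == "__main__":
    samples = generate_training_set(10_000, label="fingers_splayed")
    print(len(samples), "synthetic training samples generated")
```

In the real workflow the placeholder renderer would be replaced by posed depth images, and those images feed the machine learning step that builds a gesture classifier in about a day.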

Microsoft has also added software that tracks the hand’s position relative to the desktop and the keyboard 70 to 90 percent of the time, higher than the usual machine learning bracket of 60 to 65 percent. It builds on the Kinect’s highly developed algorithms, but the researchers have been augmenting the process further.

“We designed a way of looking at body parts, as well as body poses, at the same time. Rather than just saying ‘Here’s your arm and here’s your hand,’ what it’s saying is ‘Here’s your arm, here’s your hand, and by the way the hand’s fingers are splayed and pointing downwards.’”
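One way to picture that combined part-and-pose output is a classifier that returns both labels for a region at once. The sketch below is only illustrative; the class, field and label names are assumptions, not part of Microsoft's system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PartPoseLabel:
    part: str   # e.g. "right_hand", "left_forearm" (illustrative names)
    pose: str   # e.g. "fingers_splayed_down", "fist", "open_palm"

def classify_region(depth_patch):
    """Stand-in for the trained model: the real classifier is learned from the
    synthetic depth images described above and runs on Kinect depth data."""
    # ... model inference would go here ...
    return PartPoseLabel(part="right_hand", pose="fingers_splayed_down")

label = classify_region(depth_patch=None)
print(f"part: {label.part}, pose: {label.pose}")
```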

In addition to switching between applications with gestures of the body, hands and arms, the Kinect technology could also make it possible to build interaction systems for people with disabilities.

The future of augmented reality and machine-human communications

Gesture control is part of user experience (or UX), which is an essential element of user interface design. Microsoft researchers have previously said that they do not want to develop something like the Minority Report interface (precise, in-air holographic gesture control) but instead want to produce something that augments other forms of control. The keyboard and mouse would remain major components of computing, but a user might scroll, change windows, zoom, or do other simple tasks with a simple gesture without having to hit a hotkey or reach for the mouse.
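A hedged sketch of what that augmentation could look like in practice: recognized gestures are mapped to a handful of desktop actions while the keyboard and mouse stay primary. The gesture names and action callbacks below are assumptions for illustration, not the Kinect SDK’s API.

```python
# Hypothetical gesture names mapped to simple desktop actions (assumed, not a
# real SDK mapping). Anything unrecognized is ignored, so ordinary keyboard and
# mouse input remains the main way of working.

def scroll(lines):
    print(f"scroll {lines:+d} lines")

def switch_window():
    print("switch to next window")

def zoom(factor):
    print(f"zoom x{factor}")

GESTURE_ACTIONS = {
    "swipe_up":   lambda: scroll(+3),
    "swipe_down": lambda: scroll(-3),
    "swipe_left": switch_window,
    "pinch_out":  lambda: zoom(1.25),
    "pinch_in":   lambda: zoom(0.8),
}

def on_gesture(name):
    """Dispatch a recognized gesture to its desktop action, if any."""
    action = GESTURE_ACTIONS.get(name)
    if action is not None:
        action()

# Example event stream from a (hypothetical) gesture recognizer:
for gesture in ("swipe_up", "pinch_out", "wave"):
    on_gesture(gesture)
```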

Hands-free control has other applications as well, and third-party developers could move into them readily, especially since Microsoft has gotten out of the way with the SDK and its acceptance of the maker community.

One such area is the user experience of people with disabilities. Keyboards, mice, and other controllers that require fine finger movements can be a problem for individuals with carpal tunnel syndrome. A gesture control that takes gross movements of the hands and arms and translates them into commands could greatly reduce the strain computing puts on people whose hands should not suffer further repetitive stress injuries.

Individuals without much control over their limbs could also use a Kinect interface to turn head movements into control of computers or even other machines around the house.
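As a rough illustration of that idea, the sketch below maps coarse measurements (a head tilt or a raised hand, no fine finger motion) to a tiny set of commands. The thresholds and command names are assumptions, not derived from any shipping accessibility product.

```python
# Assumed thresholds for coarse movements, in metres; tuned values would come
# from real skeletal tracking data rather than these illustrative numbers.
HEAD_TILT_THRESHOLD = 0.15   # lateral head offset from resting position
HAND_RAISE_THRESHOLD = 0.25  # hand height above the elbow

def interpret(head_x_offset, hand_height_above_elbow):
    """Translate gross body measurements into a small command vocabulary."""
    if hand_height_above_elbow > HAND_RAISE_THRESHOLD:
        return "confirm"
    if head_x_offset > HEAD_TILT_THRESHOLD:
        return "next_item"
    if head_x_offset < -HEAD_TILT_THRESHOLD:
        return "previous_item"
    return None  # no deliberate movement detected

print(interpret(head_x_offset=0.20, hand_height_above_elbow=0.05))  # next_item
print(interpret(head_x_offset=0.00, hand_height_above_elbow=0.30))  # confirm
```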

The potential is almost limitless, and the community exploring the capabilities of the Kinect SDK continues to mature.

Contributors: Kyt Dotson and Saroj Kar

