Big Things Planned for Microsoft’s Kinect for Windows 1.5


May 2012 will see the release of a new version of the Kinect software for Windows, and this week Microsoft’s Craig Eisler published a sneak peek at what we can expect from the update. Looking at the array of SDK options, the functionality will be formidable: four new languages for speech recognition, support for regional accents and pronunciations, “10-joint” skeletal tracking, and even a “near mode” for close-range depth sensing.
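For developers, the “10-joint” mode means tracking only the upper body, which matters when a user is seated at a desk and the lower body is hidden. As a rough illustration (not the SDK’s actual API — the joint names below merely mirror its conventions, and the positions are made up), filtering a full skeleton down to that upper-body set might look like this:

```python
# Sketch: reduce a full 20-joint skeleton to the 10 upper-body joints
# that seated tracking reports. Joint names mirror the Kinect SDK's
# JointType conventions; the (x, y, z) positions are illustrative, not
# real sensor data.

SEATED_JOINTS = {
    "Head", "ShoulderCenter",
    "ShoulderLeft", "ElbowLeft", "WristLeft", "HandLeft",
    "ShoulderRight", "ElbowRight", "WristRight", "HandRight",
}

def seated_skeleton(full_skeleton):
    """Keep only the joints that seated (upper-body) mode tracks."""
    return {name: pos for name, pos in full_skeleton.items()
            if name in SEATED_JOINTS}

# A toy 20-joint skeleton, lower body included:
full = {name: (0.0, 0.0, 1.5) for name in [
    "Head", "ShoulderCenter", "ShoulderLeft", "ElbowLeft", "WristLeft",
    "HandLeft", "ShoulderRight", "ElbowRight", "WristRight", "HandRight",
    "Spine", "HipCenter", "HipLeft", "KneeLeft", "AnkleLeft", "FootLeft",
    "HipRight", "KneeRight", "AnkleRight", "FootRight",
]}

seated = seated_skeleton(full)
print(len(full), len(seated))  # 20 joints in, 10 out
```

The payoff of a reduced joint set is robustness: the tracker never has to guess at legs it can’t see.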

The May release of the SDK is expected to reach Hong Kong, South Korea, and Taiwan, with other regions following a month behind: June will bring Kinect for Windows 1.5 to Austria, Belgium, Brazil, Denmark, Finland, India, the Netherlands, Norway, Portugal, Russia, Saudi Arabia, Singapore, South Africa, Sweden, Switzerland and the United Arab Emirates. This release stacks atop the software’s initial February 1st release and brings it into its next era.

“In a future blog post, I’ll discuss the features and capabilities we are releasing in more detail,” Eisler continued. “We are excited by the enthusiasm for Kinect for Windows, and will continue to work on bringing Kinect for Windows to more countries, supporting more languages with our speech engine, and continuing to evolve our human tracking capabilities.”

Ever since the Kinect’s initial release, hackers have descended upon it with gusto, drawn to how effective and amazing the technology is. The Kinect’s gesture recognition has springboarded it out of the realm of video-game controller gimmick and into a wide variety of applications: augmented-reality projection with the “Beamatron”; Whole Foods’ Kinect-guided shopping cart that follows and assists shoppers; even a waldo interface for a NAO robot (used here to pet a cat). The applications for this technology are simply staggering.

Now that Microsoft has released a licensed SDK and first-party Windows software, developers will have even greater access to this amazing technology’s possibilities.

We’ve been seeing it incorporated into a multitude of other places, including a crazy luxury Mustang wired up with Windows 8 and a Microsoft Kinect:

“A concept car that’s been created to inspire developers to think about building apps and automotive technologies of the future. Connected-device scenarios featured in the car incorporate Kinect for Xbox 360, Xbox 360, Windows 8, Windows Phone, Windows Azure, Bing, Ford Sync and more.”

To create Project Detroit, a 2012 Ford Mustang with a 1967 fastback body, Microsoft teamed up with Ryan Friedlinghaus, an award-winning automotive designer based in Corona, Calif., and star of the Discovery Channel’s Velocity network reality TV series “Inside West Coast Customs.”

While I certainly don’t expect cars to be high on the list for gesture recognition, they’re a good place for a hands-free UI. Tie that in with smart glass, and the driver (and passengers) could use a single hand to manipulate the environment within the car without ever reaching far from the wheel, and potentially without taking their eyes off the road to make adjustments. The Kinect could even be used to deploy airbags optimally, tracking the skeletal position of the driver and passengers during a collision to prevent serious injuries.

We’ve even seen this gesture-recognition technology put to good use in operating theaters, where scrubbed-in surgeons shouldn’t be touching x-ray films or other printouts; with a simple wave of the hand at a large screen, they can pivot, twist, zoom, and change perspective on medical imaging:

Gesture support for manipulating images is only the first useful application of the Kinect here. Eventually we might see full 3D imaging brought to bear: multiple Kinect cameras looking down at a patient, producing their own 3D imagery, comparing the surgeon’s movements against models generated via CT, and offering advice or warnings to the surgeon.
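At its core, the hands-free manipulation described above reduces to mapping tracked hand positions onto image transforms. Here is a minimal sketch of that mapping — all function names, scale factors, and coordinate conventions are illustrative assumptions, not any real SDK’s API:

```python
# Sketch: turn tracked hand positions (in metres, as a skeletal tracker
# might report them) into pan and zoom operations on a medical image.
# The scale factors and thresholds are illustrative assumptions.

def pan_delta(prev_hand, cur_hand, px_per_metre=1000.0):
    """Convert hand movement between frames into a pixel pan offset."""
    dx = (cur_hand[0] - prev_hand[0]) * px_per_metre
    dy = (cur_hand[1] - prev_hand[1]) * px_per_metre
    return dx, dy

def zoom_factor(left_hand, right_hand, ref_spread=0.5):
    """Hands spread wider than ref_spread zoom in; closer zooms out."""
    spread = abs(right_hand[0] - left_hand[0])
    return spread / ref_spread

# A hand drifting 10 cm right and 5 cm up pans roughly (100, 50) pixels:
pan = pan_delta((0.00, 0.00), (0.10, 0.05))

# Hands moving from 0.5 m apart to 1.0 m apart doubles the zoom:
zoom = zoom_factor((-0.50, 0.0), (0.50, 0.0))
print(pan, zoom)
```

The appeal for sterile environments is that the whole pipeline is touchless: the only “input device” the surgeon handles is their own hands.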

With Kinect for Windows heading into its next upgrade, the sky is still the limit on how deeply this technology will penetrate our everyday lives, and how useful it will be.