

In what some might call an unholy tandem, a Microsoft Kinect and a Nintendo Wii have been hybridized to let a man, in this case software engineer Taylor Veltrop, remotely brush his cat via a 23-inch-tall robotic surrogate.
To make this happen, Veltrop hacked together a Kinect, two Wiimotes, a head-mounted display (HMD) visor, a treadmill, and a NAO robot. Tiny cameras on the NAO broadcast what the robot sees back to the HMD, while the hacked Kinect and Wiimotes give Veltrop fine control over the motions of the robot's arms. Combined with the treadmill, he could walk up to the cat, grasp the brush, and gently brush the cat.
“This is the culmination of my last year’s work,” writes Veltrop. “I control the robot’s arms through the Kinect and Wii remotes. I control the robot’s navigation through the Kinect and treadmill. I control the robot’s head through the head mounted display (HMD). I also see through the robot’s eyes with the HMD.”
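Veltrop hasn't published his code in this article, but as a rough illustration of what the arm-mapping half of such a rig might look like, here is a minimal sketch using Aldebaran's NAOqi Python SDK. The Kinect skeleton feed, the robot address, and the helper functions below are hypothetical stand-ins, not Veltrop's actual implementation.

```python
# Hypothetical sketch: stream tracked arm angles onto NAO's left-arm joints
# via Aldebaran's NAOqi Python SDK. get_kinect_arm_angles() is a stand-in for
# whatever skeleton tracker (OpenNI, libfreenect, etc.) actually supplies poses.
from naoqi import ALProxy

NAO_IP, NAO_PORT = "nao.local", 9559  # placeholder robot address

motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
motion.setStiffnesses("LArm", 1.0)  # enable the arm motors before sending angles

def get_kinect_arm_angles():
    """Stand-in for a real skeleton tracker: left-arm joint angles in radians."""
    return {"LShoulderPitch": 0.4, "LShoulderRoll": 0.2,
            "LElbowYaw": -1.0, "LElbowRoll": -0.5}

def clamp(angle, lo, hi):
    """Keep commanded angles inside the joint's mechanical range."""
    return max(lo, min(hi, angle))

# Approximate joint limits (radians) for NAO's left arm, per Aldebaran's specs.
LIMITS = {"LShoulderPitch": (-2.08, 2.08), "LShoulderRoll": (-0.31, 1.32),
          "LElbowYaw": (-2.08, 2.08), "LElbowRoll": (-1.54, -0.03)}

def stream_arm_pose():
    angles = get_kinect_arm_angles()
    names = list(angles.keys())
    values = [clamp(angles[n], *LIMITS[n]) for n in names]
    # setAngles is non-blocking; 0.2 caps speed at 20% of maximum for gentle motion.
    motion.setAngles(names, values, 0.2)

if __name__ == "__main__":
    stream_arm_pose()
```

In a real teleoperation loop this mapping would run continuously at the tracker's frame rate, with smoothing and safety limits on top of the simple clamping shown here.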
His project wasn’t without its trial and error, he adds: “This was the second try accomplishing brushing the cat. On the first try we discovered that the positioning of the brush in NAO’s hand needed to be flipped 180 degrees from the intuitive direction.”
I’ve long argued that augmented reality can cut both ways: for the everyday person, aug-reality is all about offloading processing power from the brain into the cloud via mobile devices and adding context to reality, but there’s also the surrogacy angle.
The NAO robot has an excellent interface and has been eyed by many Kinect engineers and developers as a platform for this kind of integration. In fact, Veltrop began his one-year journey in 2010 by announcing his project pairing the Aldebaran NAO robot with the Kinect for this very demonstration. Shortly thereafter, another engineer submitted a video showing the robot picking up a ball; but by that time Veltrop had already gotten the NAO to chop food in the kitchen with enhanced hand coordination [VIDEO].
With this sort of technology, human dexterity could be applied remotely to manipulate objects in hostile environments. Fukushima and other nuclear sites could benefit from remote manipulators, for example, and search-and-rescue teams could deploy telepresence robots alongside human responders; it’s a surprise that NASA isn’t better known for developing this sort of technology for space EVA. I know quite a few volcanologists who would love devices like this for fine manipulation of molten lava while studying volcanoes.
You can learn more about Taylor Veltrop and his dreams of electric sheep from his blog. For the rest of us attention-span addled technologists, there’s the video of him controlling a robot to pet a cat named Lotus.