uSens unlocks $20M to pioneer natural mobile VR human interaction
Augmented reality and virtual reality (ARVR) pioneer uSens, Inc. today announced a $20 million Series A funding round led by Fosun Kinzon Capital. The company intends to use the new investment to launch its 26DOF (degrees of freedom) hand tracking and 6DOF head tracking software library. The uSens solution enables developers to write naturally interactive mobile ARVR apps without relying on peripherals.
Additionally, returning investor Maison Capital rejoined this funding round alongside new investors Great Capital, Fortune Capital, Oriental Fortune Capital, iResearch Capital and Chord Capital. To date, uSens has raised over $26 million in venture capital to produce reference hardware and software for head and gesture tracking, the next generation of user interaction.
This year the ARVR market saw the launch of the first consumer entry-level headsets and viewers, along with increased interest in mobile VR. In 2016, the market has seen the release of the Oculus Rift and HTC Vive and awaits the arrival of the PlayStation VR headset. The mobile VR market, expected to dominate VR experiences (accounting for 87 percent of the predicted $865 million in 2016), has seen huge interest in Google Cardboard and the Gear VR viewer, both of which rely entirely on the phone's sensors for head tracking but cannot "see" user gestures or hand movements.
On the horizon is Daydream, Google's mobile VR platform, which will use a wireless peripheral similar to Nintendo Co., Ltd.'s Wiimote to detect user gestures.
“Tracking is critical to ARVR experiences,” said Anli He, CEO and co-founder of uSens. “As ARVR display technologies approach mass adoption, we’re excited to bring great interactive solutions to help ARVR platforms, hardware makers, and especially content developers overcome the complicated challenges of hand and position tracking.”
The technology developed by uSens will be demonstrated live this week at Augmented World Expo 2016 (AWE) in Santa Clara, CA.
Anli He went on to tell SiliconANGLE the vision behind uSens’s approach to the market. “Right now the display technology for ARVR is developing quite well, but interaction is quite a problem,” she explained. “It’s hard for users to interact and play with these applications. So we are focused on solving this problem.”
As mentioned above, the mobile ARVR market is expected to dominate, and that's where uSens is aiming its solutions: middleware offered to developers to enable head and hand tracking. The mobile market poses particular challenges when it comes to controls. Just as it took a while for smartphones to come into their own with the touchscreen UI, ARVR is still searching for its killer app in controls.
In practice, mobile ARVR users will find immersion a problem if the only controls available are handheld. By tracking hand movements to enable natural human interaction, uSens hopes to do away with the need for controllers. Controllers are also cumbersome in a mobile context and add yet another item that can be misplaced or lost. A mobile ARVR viewer with the ability to recognize natural human gestures for UI interaction would become a put-on-and-go experience for mobile users.
Positional tracking hardware and software for the mobile ARVR experience
Much of science fiction has promised that virtual reality won't just be immersive but a near-total escape from the living room or office into an entirely different world. In both fiction and reality, it has become understood this may be difficult if users must rely on peripherals (such as wands or air mice) to interact with virtual objects.
Reaching out and grabbing a virtual reality object feels more natural than touching it with a peripheral wand or pressing a button.
The flagship product of San Jose-headquartered uSens is supported by a reference prototype: a low-power hardware board capable of six-degree-of-freedom head tracking and, via mounted sensors, 26-degree-of-freedom detection of hand motions and gestures. It provides a 90-degree field of view (similar in practice to the HoloLens) at 60 frames per second with sub-millimeter accuracy. It is also capable of head tracking and gesture recognition within a five-meter movement range, far beyond typical arm's-length needs.
uSens produces the reference prototype so that mobile hardware developers can build their own camera systems that meet the specifications needed for optimal head and gesture tracking.
For developers, uSens provides a full SDK and documentation compatible with existing mobile VR apps built with Unity, the Google Cardboard SDK and other tools. To ease developers' transition to ARVR applications, uSens also offers Pi UI, a 3D user interface for browsing apps and playing media content.
The documentation currently on the website is still being populated by the company, but a full SDK reference for the supported platforms will follow shortly. The uSens website also contains further details on the software libraries, the reference hardware and the company's vision. Hardware and software developers can contact uSens (via the form at the bottom of the homepage) to learn more about integrating its solutions.
Featured image credit: uSens, Inc.