UPDATED 22:05 EST / MAY 16 2024


Google’s Project Gameface lets people control Android devices using gestures and facial expressions

Google LLC is bringing a host of new accessibility upgrades to Android, with the headline feature being Project Gameface, which uses artificial intelligence to translate facial expressions and gestures into on-screen actions.

Project Gameface was first conceived as a kind of hands-free mouse for gamers, but Google quickly realized that the technology behind it could be applied to many more use cases, notably increasing accessibility for disabled users. The company is now making Project Gameface available to Android developers, who will be able to incorporate facial gestures into their applications.

The news was announced today, following Google I/O, the company’s annual developer conference. In a blog post, Eve Andersson, senior director of Products for All at Google, said the technology works by using the device’s camera alongside the MediaPipe Face Landmarks Detection application programming interface to track users’ facial expressions and gestures and translate them into cursor movements.
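For a sense of what that pipeline looks like in practice, here is a minimal Python sketch using MediaPipe’s face landmarker task with blendshape output, the per-expression scoring the API exposes. The model file name, the single-frame input and the smile threshold are illustrative assumptions, not details from Google’s implementation:

```python
# Minimal sketch: score facial expressions ("blendshapes") with MediaPipe's
# face landmarker task. The model path, input frame and 0.6 threshold are
# assumptions for illustration, not values from Project Gameface itself.
import mediapipe as mp
from mediapipe.tasks import python
from mediapipe.tasks.python import vision

options = vision.FaceLandmarkerOptions(
    base_options=python.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # per-expression scores, e.g. mouthSmileLeft
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

image = mp.Image.create_from_file("frame.jpg")  # one camera frame
result = landmarker.detect(image)

if result.face_blendshapes:
    # Each blendshape is a category with a name and a 0..1 confidence score.
    scores = {c.category_name: c.score for c in result.face_blendshapes[0]}
    if scores.get("mouthSmileLeft", 0.0) > 0.6:
        print("smile detected -> could trigger a click")
```

In a live app, the same detection would run per video frame, with head movement from the landmarks driving the cursor and expression scores driving discrete actions.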

Gameface is actually an open-source project, and Andersson said the technology will be especially useful for people with disabilities, enhancing their ability to interact with digital environments. For instance, users can set it up to recognize a gesture such as a smile or raised eyebrows to perform actions such as clicking and dragging.

The version being made available to developers comes with new enhancements. It can now recognize a total of 52 different facial gestures, which users can assign to various commands, meaning they’ll be able to interact with their devices in countless ways simply by changing their expression. A hypothetical set of bindings is sketched below.
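How those gestures get bound to commands is up to each app. The following sketch shows one hypothetical way to express such bindings: the gesture names match MediaPipe’s blendshape categories, but the thresholds and action names are invented for illustration:

```python
# Hypothetical gesture-to-command bindings. The gesture names come from
# MediaPipe's 52 blendshape categories, but these thresholds and actions
# are invented; a real app would expose them as user-configurable settings.
GESTURE_BINDINGS = {
    "mouthSmileLeft": (0.6, "click"),
    "browInnerUp":    (0.5, "drag_start"),
    "jawOpen":        (0.7, "scroll_down"),
}

def dispatch(scores: dict[str, float]) -> list[str]:
    """Return the commands whose gesture score crosses its threshold."""
    return [action
            for name, (threshold, action) in GESTURE_BINDINGS.items()
            if scores.get(name, 0.0) >= threshold]

# e.g. dispatch({"mouthSmileLeft": 0.8}) -> ["click"]
```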

Google didn’t showcase any applications that are currently using the technology, but it said it’s partnering with the Indian social enterprise Incluzza to explore how it can be used in offices, schools, social gatherings and other settings.

Lookout app enhancements

Google announced various other accessibility enhancements besides Project Gameface, including an update to the Lookout app, which is designed to help blind and low-vision users interact with their Android devices. The update adds a “Find mode,” now available in beta, which helps users locate specific physical objects and rooms nearby.

The way it works is that users open their camera and pan it across the room. The app will then identify what it sees and help users find objects such as the TV remote, empty seating or an empty table in a restaurant, locate public bathrooms and so on.

In a second update to Lookout, Google is building on the generative AI feature it rolled out earlier this year that provides detailed descriptions of images uploaded by the user. In addition to uploading photos, users will be able to snap photos directly within the app, the company said.

Additional accessibility updates

The Look to Speak app, which enables people to use their eyes to select prewritten, customizable phrases that are then spoken aloud, is being updated with a text-free mode. With this, users will be able to select and personalize emojis, photos and symbols, expanding the app’s capabilities to overcome language barriers and literacy challenges, the company said.

Meanwhile, Lens in Maps, which uses AI and augmented reality to locate restaurants, train stations, ATMs and other places in someone’s immediate vicinity, is adding detailed voice guidance through TalkBack, Android’s built-in screen reader. Lens in Maps will not only describe in a natural voice how the user can get to where they’re going, but also provide additional information such as opening hours, ratings and so on. In addition, it will assist users by letting them know if they need to cross a busy intersection, or navigate some kind of obstacle or barrier that’s blocking their path.

Maps is also getting more accessibility information with the addition of a new wheelchair icon that denotes wheelchair-accessible places. It will describe just how accessible each place is, with information on its bathrooms, parking and seating, for example. The feature is coming not only to Android and iOS, but also to the desktop version of Maps. Users will also be able to filter search results in Maps based on wheelchair accessibility.

Finally, Google announced that business owners will be able to add the Auracast attribute to their business profiles in Google Search and Maps, so people in need of hearing assistance will be able to find them more easily. Auracast is a technology that enables public venues such as theaters, gymnasiums, museums and places of worship to cast audio descriptions and assistance to people via Bluetooth-enabled hearing aids, earbuds and headphones.

Image: Google
