

To make American Sign Language learning more accessible, Nvidia Corp. today announced the launch of a new artificial intelligence-powered platform that offers instruction around the clock.
Called Signs, the teaching platform was developed by Nvidia in partnership with the American Society for Deaf Children and the creative agency Hello Monday. It's an interactive web platform that uses a camera to give AI-enabled feedback on a learner's signing and a voiced 3D avatar to demonstrate signs.
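Nvidia hasn't published Signs' internals, but camera-based sign feedback of this kind typically begins with hand-landmark detection. The sketch below is an assumption for illustration only, using the open-source MediaPipe Hands library to pull per-frame fingertip positions from a webcam feed; Signs' actual pipeline may differ.

```python
# Minimal sketch: per-frame hand-landmark extraction with MediaPipe Hands.
# Illustrative assumption only -- Nvidia has not published Signs' pipeline.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # Each detected hand yields 21 normalized (x, y, z) landmarks;
                # a tutor like Signs would feed these into a sign classifier.
                tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
                print(f"index fingertip: ({tip.x:.2f}, {tip.y:.2f})")
cap.release()
```

Feeding sequences of such landmarks into a classifier trained on validated sign clips is one common way to score a learner's gesture against a reference sign.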
ASL is the third most-used language in the United States, yet it's underrepresented in AI datasets compared with spoken and written languages such as English and Spanish. To close that gap, Nvidia said, it needed to build and validate a dataset of its own.
The company said it trained the AI, which instructs users by reading their gestures and providing a visual guide, on a dataset it aims to grow to more than 400,000 video clips representing about 1,000 signed words. Each sign will be validated by fluent ASL users and interpreters to ensure accuracy.
“Most deaf children are born to hearing parents,” said Cheri Dowling, executive director of the American Society for Deaf Children. “Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old.”
Along with teaching sign language, the platform allows signers of any skill level to contribute by signing specific words to add to Nvidia's growing open-source ASL video dataset. The company said the dataset is planned for release later this year.
Although Signs currently focuses on the user's hand movements and finger positions for signing, ASL also incorporates facial expressions to convey meaning. For example, in the sign for "hot," the signer places clawed fingers near their mouth and then thrusts them away as if removing something hot from their mouth, accompanied by a facial expression as if they're exhaling.
Raised eyebrows incorporated into signs can signal questions, and gaze can direct attention to a subject or action. Changing facial expressions during a sentence can convey intensity or stress on a certain part of a statement.
The team behind the app said it's exploring how to track these signals and integrate them into future versions. The team is also exploring how regional dialects, slang terms and other variations can be represented in Signs.
ASL, like American English and Spanish, is a living language with a diverse vocabulary that can vary between communities. It's a distinct language that is continuously changing, with new signs created to convey concepts as the need arises.
“Improving ASL accessibility is an ongoing effort,” said Anders Jessen, founding partner of Hello Monday. “Signs can serve the need for advanced AI tools that help transcend communication barriers between the deaf and hearing communities.”