Talking Hands

The full video can be viewed here

About

Talking Hands is an augmented reality app prototype that lets the user type using the American Sign Language (ASL) alphabet. Aaron McLean, Rachel Tojio, and I built this project for an ICS 486 XR/AR assignment using the Meta XR SDKs for Unity and the Meta Quest 3/3S. The project also used the XR Hands package, which provided the hand model, the hand joint tracker (the UI with the bars), and the ability to trigger events from hand gestures. Using XR Hands, I created the gestures for most of the alphabet (all letters except J and Z, which require motion) and wired each one to write its letter to the text box in front of the user. I also implemented the “palm up” menu for deleting letters or adding a space.
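To give a sense of how the typing works, here is a minimal C# sketch in the spirit of the project. It assumes each letter’s gesture (for example, an XR Hands StaticHandGesture from the package’s gestures sample) fires a UnityEvent that is wired in the Inspector to AppendLetter with its letter; the class name, field names, and debounce value are illustrative assumptions rather than the project’s actual code.

using TMPro;
using UnityEngine;

// Illustrative sketch, not the project's actual code: collects letters typed via
// hand gestures into a world-space text box, plus the palm-up menu's actions.
public class TalkingHandsKeyboard : MonoBehaviour
{
    [SerializeField] TMP_Text textBox;          // world-space text box in front of the user
    [SerializeField] float repeatDelay = 0.75f; // debounce so a held pose types one letter, not many

    float lastTypedTime = float.NegativeInfinity;

    // Wire each letter gesture's "performed" event to this, passing its letter.
    public void AppendLetter(string letter)
    {
        if (Time.time - lastTypedTime < repeatDelay)
            return;

        textBox.text += letter;
        lastTypedTime = Time.time;
    }

    // Wired to the palm-up menu's delete button.
    public void DeleteLastLetter()
    {
        if (textBox.text.Length > 0)
            textBox.text = textBox.text.Substring(0, textBox.text.Length - 1);
    }

    // Wired to the palm-up menu's space button.
    public void AddSpace() => AppendLetter(" ");
}

In the actual project the gesture detection itself comes from XR Hands’ hand-shape assets; this sketch only covers what could happen once a gesture event fires.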

The limited hand tracking of the Quest 3/3S made it difficult to implement certain letters of the ASL alphabet, such as R, U, M, and N. The headset could not track the crossed fingers for R, which made it look like a U. We resolved this by having a collider appear behind the hand so the user can clarify which letter they are signing (see video below). The headset also could not track fingers obscured by the rest of the hand, which especially affected M and N. My group and I considered tweaking those gestures into something that could be tracked more reliably while still being recognizable as M or N, and decided to use the old signs for M and N instead, which tracked marginally better.

Modern signs for M and N:

Old signs for M and N:
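As a rough illustration of the R/U fix described above, the sketch below assumes the disambiguation collider is a trigger volume shown behind the hand while the ambiguous U/R pose is held, and that poking it with a fingertip swaps the just-typed U for an R. The component name, the “Fingertip” tag, and the direct text-box manipulation are assumptions for the example, not the project’s implementation.

using TMPro;
using UnityEngine;

// Illustrative sketch, not the project's actual code: a trigger volume placed behind
// the hand while the ambiguous U/R pose is held. Touching it with a fingertip
// (assumed here to be tagged "Fingertip") tells the app the user meant R, not U.
// Note: Unity trigger callbacks require a Rigidbody on at least one collider involved.
[RequireComponent(typeof(Collider))]
public class LetterDisambiguator : MonoBehaviour
{
    [SerializeField] TMP_Text textBox; // same world-space text box the gestures write to

    void Awake()
    {
        GetComponent<Collider>().isTrigger = true;
    }

    // Wire these to the U/R gesture's performed/ended events to show and hide the prompt.
    public void Show() => gameObject.SetActive(true);
    public void Hide() => gameObject.SetActive(false);

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Fingertip"))
            return;

        // Swap the trailing U (typed by the ambiguous pose) for an R.
        if (textBox.text.EndsWith("U"))
            textBox.text = textBox.text.Substring(0, textBox.text.Length - 1) + "R";
    }
}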

Experience

This was my first time programming for virtual/augmented reality and my first time using a VR headset. Unlike on a 2D screen, I could see how close or far away an object was and physically move around it. That made setting up the UI interesting, since it could not sit too close to the user’s face but still had to appear in front of other objects in the scene. The Quest 3/3S hand tracking was also interesting to work with: instead of key presses and mouse clicks, I could use hand poses and motion to interact with the application.