Best Android app

Gaze Link

Helps Amyotrophic Lateral Sclerosis (ALS) patients communicate with their eyes

What it does

Amyotrophic Lateral Sclerosis (ALS) is a devastating disease that takes away a patient’s ability to move and speak. Volunteering with ALS associations during my high-school summers, I saw that some patients can communicate only with their eyes and assistive technology, which is often expensive and inefficient. Powered by the Google Gemini API, my free multi-language app “Gaze Link” helps ALS patients communicate with their eyes independently, accurately, and efficiently.

First, the app detects the user’s face and eyes with Google ML Kit and OpenCV. After a 30-second calibration and a few setting adjustments, the user can begin typing words on Gaze Link’s multi-language keyboard using 6 eye gestures. Eye-typing, however, can be very slow for long sentences.
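To give a sense of how the eye-gesture input could be wired up, here is a minimal Kotlin sketch using ML Kit’s Face Detection API. It is not the actual Gaze Link code: the gesture names and thresholds are illustrative, and the real pipeline also uses OpenCV gaze tracking and the calibration data.

```kotlin
// Sketch only: read per-eye open probabilities from ML Kit and map them to
// illustrative gestures. Gaze Link's real classifier covers 6 gestures and
// uses calibrated gaze direction as well.
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

val options = FaceDetectorOptions.Builder()
    .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_ACCURATE)
    .setLandmarkMode(FaceDetectorOptions.LANDMARK_MODE_ALL)              // eye landmarks
    .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)  // eye-open probabilities
    .build()

val detector = FaceDetection.getClient(options)

fun detectEyeGesture(image: InputImage, onGesture: (String) -> Unit) {
    detector.process(image)
        .addOnSuccessListener { faces ->
            val face = faces.firstOrNull() ?: return@addOnSuccessListener
            val left = face.leftEyeOpenProbability ?: return@addOnSuccessListener
            val right = face.rightEyeOpenProbability ?: return@addOnSuccessListener
            // Illustrative thresholds, not the app's calibrated values.
            when {
                left < 0.3f && right < 0.3f -> onGesture("blink")
                left < 0.3f && right > 0.7f -> onGesture("left wink")
                right < 0.3f && left > 0.7f -> onGesture("right wink")
            }
        }
}
```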

To improve the text-entry rate, I used a Gemini 1.5 Flash model to generate the patient’s intended sentence from a few keywords and the conversational context. First, Gaze Link transcribes the caretaker’s voice into text, such as “Is the room temperature ok?”. The patient then eye-types keywords such as “hot, AC, two”. The Gemini model uses this information to generate a suitable sentence like “I am hot, can you turn the AC down by 2 degrees?” in under a second. The model and keyboard also work with Spanish and Chinese. Experiments with 30 people show that the model can save up to 85% of keystrokes, making Gaze Link up to 7x more efficient than a traditional E-tran (eye-transfer) board.
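Below is a minimal Kotlin sketch of this sentence-generation step, assuming the Google AI client SDK for Android. The prompt wording, the BuildConfig key name, and the helper function are my assumptions for illustration, not the app’s actual code.

```kotlin
// Sketch only: ask Gemini 1.5 Flash to expand eye-typed keywords into a full
// first-person sentence, conditioned on the caretaker's transcribed question.
import com.google.ai.client.generativeai.GenerativeModel

val model = GenerativeModel(
    modelName = "gemini-1.5-flash",
    apiKey = BuildConfig.GEMINI_API_KEY  // assumption: API key injected via Gradle BuildConfig
)

suspend fun expandKeywords(context: String, keywords: String): String? {
    val prompt = """
        The caretaker just said: "$context"
        The patient eye-typed these keywords: $keywords
        Reply with one short first-person sentence the patient most likely intends.
    """.trimIndent()
    return model.generateContent(prompt).text
}

// Example from the description:
// expandKeywords("Is the room temperature ok?", "hot, AC, two")
// might return "I am hot, can you turn the AC down by 2 degrees?"
```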

Built with

  • Android
  • Firebase
  • Google ML Kit

Team

By

Xiangzhou Sun

From

United States