Flowchart of eye gesture recognition algorithm. Credit: Xiaoyi Zhang et al.

(Tech Xplore)—ALS is a neurodegenerative illness that causes a person to lose motor control. The condition can progress to what is known as locked-in syndrome, in which patients retain cognitive function but cannot speak or write.

So how can they communicate? Through eye movement. Sam Dean of The Telegraph noted that ALS is a type of motor neurone disease (MND). ALS stands for amyotrophic lateral sclerosis.

He said that eye movements can become the only way to communicate for people with motor neurone disease, which causes muscle wastage through nerve damage. ALS is the most common form, involving both upper and lower motor neurones, according to the Motor Neurone Disease Association.

And now an app has been designed to help people with ALS speak using just their eyes. The researchers' solution uses a smartphone to capture eye gestures and interpret them.

They tested their eye gesture recognition on recent models of the iPhone and iPad.

A paper describing the system and the study results has been written.

"Smartphone-based gaze gesture communication for people with motor disabilities," is by Xiaoyi Zhang, Karish Kulkarni, Meredith Morris.

Zhang is from the University of Washington; the other two authors are affiliated with the Enable team at Microsoft Research.

To appreciate what they have done, it helps to consider the limitations of past attempts to support people suffering from this condition.

Solutions using dedicated eye-tracking hardware have been pricey. Also, traditional solutions relying on infrared cameras do not perform well in conditions such as bright sunlight.

The authors said that their algorithm "works in a variety of lighting conditions, including indoors and outdoors. Since it uses an RGB rather than IR camera, its performance is unlikely to be degraded under sunlight."

Another existing solution is the low-cost, low-tech eye-gaze transfer (e-tran) board, where a caregiver holds up a transparent board with groupings of letters and the person with ALS performs eye gestures to select a letter. The limitations here are that the process is slow, and interpreting the gaze patterns requires practice and skill on the part of the caregiver.
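To make that manual process concrete, here is a minimal sketch assuming a hypothetical two-step scheme, in which one gesture picks a letter group and a second picks a position within it; the grouping shown is made up, and real e-tran boards use their own layouts.

```python
# Illustrative two-step selection, loosely modeled on e-tran-style boards.
# The letter grouping below is hypothetical, not a real board layout.

GROUPS = [
    list("ABCDEF"),
    list("GHIJKL"),
    list("MNOPQR"),
    list("STUVWX"),
    list("YZ.,?!"),
]

def select_letter(group_gesture: int, position_gesture: int) -> str:
    """First gesture picks a letter group, second picks a position in it."""
    return GROUPS[group_gesture][position_gesture]

# Example: the speaker gestures toward group 1, then toward position 2 -> "I"
print(select_letter(1, 2))
```

In practice the caregiver does this decoding mentally, which is exactly the skill-and-speed bottleneck the app is meant to remove.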

The mobile application that this team devised automates that eye-gaze e-tran experience. It can be considered a low-cost alternative communication tool, or a supplement to other eye-tracking systems, and it is easier and faster to use than the physical boards.

How it works: The interpreter holds the phone and points the back camera at the person communicating.

A printed key taped to the phone's case provides a visual indication of the letter groupings for the speaker. As Dean explained, when the person wanting to speak looks in a certain direction, the app registers which group of letters is being looked at.
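As an illustration of that step, the following sketch assumes the app reduces an estimated gaze offset to one of four coarse directions and looks that direction up against the printed key. The direction-to-group mapping and the group contents here are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of the direction-to-group step described above.
# Group contents and the four-way mapping are illustrative only.

LETTER_GROUPS = {
    "up":    "ABCDEF",
    "right": "GHIJKL",
    "down":  "MNOPQR",
    "left":  "STUVWXYZ",
}

def group_for_gaze(dx: float, dy: float) -> str:
    """Map a normalized gaze offset (pupil position relative to the eye
    center) to one of four coarse directions, then to its letter group."""
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return LETTER_GROUPS[direction]

# Example: pupil shifted mostly to the left of the eye center
print(group_for_gaze(-0.4, 0.1))  # -> "STUVWXYZ"
```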

Calibration is simple and easy to perform. The GazeSpeak app has three major components: eye gesture recognition, a word prediction engine and text entry interfaces.
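The word prediction engine matters because each gesture only narrows a letter down to a group. Below is a hedged sketch of how such an engine could disambiguate a sequence of group selections; the groups and the tiny word list are illustrative placeholders, not GazeSpeak's actual layout or dictionary.

```python
# Sketch: rank dictionary words consistent with a sequence of group selections.
# Groups and frequencies below are hypothetical, for illustration only.

GROUPS = ["abcdef", "ghijkl", "mnopqr", "stuvwxyz"]

def group_index(letter: str) -> int:
    """Return the index of the group containing the letter."""
    return next(i for i, g in enumerate(GROUPS) if letter in g)

# word -> relative frequency (made-up values)
DICTIONARY = {"hello": 120, "help": 300, "good": 90, "gone": 40}

def candidates(selected_groups: list[int]) -> list[str]:
    """Return dictionary words whose letters fall in the selected groups,
    most frequent first."""
    matches = [
        w for w in DICTIONARY
        if len(w) == len(selected_groups)
        and all(group_index(c) == g for c, g in zip(w, selected_groups))
    ]
    return sorted(matches, key=lambda w: -DICTIONARY[w])

# Example: four gestures selecting groups 1, 0, 1, 2 could mean "help"
print(candidates([1, 0, 1, 2]))
```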

Study results found that the app significantly improved communication time over standard e-tran boards. According to the authors, "Our user studies show that GazeSpeak surpasses e-tran boards (a commonly-used low-tech solution) in both communication speed and usability, with a low rate of wrong recognition."

A video about their research was published on Jan. 16 by Meredith Morris, a co-author.

According to reports, the app will be presented in May at the Conference on Human Factors in Computing Systems.

More information: Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities (PDF)