The first application capable of recognizing and interpreting the Spanish sign language alphabet
The University of Alicante (UA) Robotics and Three-Dimensional Vision Group (RoViT) has designed the first application capable of recognizing and interpreting the Spanish sign language alphabet (known by the acronym LSE) in real time.
The app, called Sign4all, is a breakthrough that helps break down communication barriers between deaf and hearing people in everyday situations, such as visiting a doctor's surgery or eating in a restaurant. According to the latest Survey on Disability, Personal Autonomy and Dependency Situations by the Spanish National Statistics Institute (INE), there are 1,230,000 people in Spain with hearing disabilities of varying types and degrees. Of these, 27,300 use sign language to communicate.
Using a range of computer vision and deep learning techniques, Ph.D. computer engineer and UA researcher Ester Martínez, together with Ph.D. student Francisco Morillas, has developed this low-cost tool to assist deaf people when no interpreter is available.
After extracting the skeleton of the arms and hands, Sign4all colour-codes the left side of the body in blue and the right side in red, preserving the user's anonymity at all times. From there, the application translates the deaf person's signs in real time; in the opposite direction, it uses a virtual avatar to sign Spanish words typed by the hearing person. The idea is that this whole process can be carried out simply by downloading an application and using the camera of a mobile phone or tablet, so that it can be used anywhere, as explained by Martínez.
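The anonymisation step described above can be sketched as follows. This is a minimal illustration, not the project's actual code: it assumes per-frame (x, y) landmarks have already been obtained from some pose-estimation library (the article does not name one), and the landmark names and the left/right split are hypothetical. Only arm and hand landmarks survive, each tagged blue or red by body side, so no identifying imagery remains.

```python
# Illustrative left/right landmark sets; the real schema is an assumption.
LEFT = {"left_shoulder", "left_elbow", "left_wrist", "left_hand"}
RIGHT = {"right_shoulder", "right_elbow", "right_wrist", "right_hand"}

BLUE = (255, 0, 0)  # OpenCV-style BGR colour tuples
RED = (0, 0, 255)


def side_colour(name: str):
    """Return the drawing colour for a landmark, or None to discard it."""
    if name in LEFT:
        return BLUE
    if name in RIGHT:
        return RED
    return None  # face, torso, etc. are dropped for anonymity


def anonymised_points(landmarks: dict):
    """Keep only arm/hand landmarks, each tagged with its side colour.

    `landmarks` maps a landmark name to an (x, y) pixel coordinate.
    Everything not on an arm or hand is discarded, so the rendered
    skeleton contains no identifying features.
    """
    out = []
    for name, (x, y) in landmarks.items():
        colour = side_colour(name)
        if colour is not None:
            out.append((name, (x, y), colour))
    return out


# Example frame: the facial landmark is removed, the wrists are kept.
frame = {
    "left_wrist": (120, 340),
    "right_wrist": (280, 330),
    "nose": (200, 100),
}
print(anonymised_points(frame))
```

A real pipeline would then draw these coloured points and the connecting bones onto a blank canvas each frame, feeding that rendering (rather than the raw camera image) to the recognition model.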
According to the UA researcher, after extensive testing, Sign4all interprets and recognizes the LSE alphabet with an accuracy of 80%. Although this result applies to the dactylological (fingerspelling) alphabet, the team is working on a version with a vocabulary specific to everyday tasks, capable of interpreting complete sentences.
The UA team has been training the new system for months, introducing more and more signs. To this end, it has begun collaborating with the University of Vigo Spanish Language and Signed Languages Research Group (GRILES), a team with extensive experience in studying this language and its use across different territories. The University of Vigo is collecting images with interpreters while the UA processes the data, allowing the vocabulary of the LSE recognition and interpretation system to be improved and expanded much more quickly.
The research is published in the journal Computational Intelligence and Neuroscience.
More information: Ester Martinez-Martin et al, Deep Learning Techniques for Spanish Sign Language Interpretation, Computational Intelligence and Neuroscience (2021). DOI: 10.1155/2021/5532580