New wearable tech translates sign language into text
Wed 25 Nov 2015
A new wearable technology developed by a team of biomedical engineers at Texas A&M University seeks to aid communication between deaf people who use sign language and those who do not understand it.
The arm device contains a network of sensors which track hand movements, as well as the electromyography (EMG) signals generated by wrist muscles when they are electrically or neurologically activated.
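The article does not detail how the device's processing works, but a common first step for this kind of multi-sensor wearable is to fuse the motion and EMG streams into one feature vector per time window before recognition. The sketch below is purely illustrative (window length, channel counts, and chosen statistics are assumptions, not the team's actual pipeline):

```python
import numpy as np

def window_features(accel, emg, window=50):
    """Split synchronized motion and EMG signals into fixed-length windows
    and extract simple per-channel statistics (mean and standard deviation),
    concatenated into one feature vector per window."""
    features = []
    n = min(len(accel), len(emg)) // window
    for i in range(n):
        a = accel[i * window:(i + 1) * window]   # shape (window, 3 axes)
        e = emg[i * window:(i + 1) * window]     # shape (window, n EMG channels)
        feat = np.concatenate([
            a.mean(axis=0), a.std(axis=0),       # hand-movement statistics
            e.mean(axis=0), e.std(axis=0),       # muscle-activity statistics
        ])
        features.append(feat)
    return np.array(features)

# Example with simulated data: 200 samples of 3-axis motion and 2-channel EMG.
rng = np.random.default_rng(0)
accel = rng.normal(size=(200, 3))
emg = rng.normal(size=(200, 2))
X = window_features(accel, emg)
print(X.shape)  # 4 windows, each a 10-dimensional feature vector: (4, 10)
```

Each row of `X` could then be fed to a classifier that maps it to a sign; the real device presumably uses far richer features and signal conditioning.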
Roozbeh Jafari, associate professor of biomedical engineering at Texas A&M, explained that the researchers record the muscle activity captured at the wrist, and indirectly from the fingers, then process and translate the different signals into text in real time.
The technology requires sophisticated algorithms. Because no two people sign in exactly the same way, the algorithms have been designed to learn from the users themselves. “When you wear the system for the first time the system operates with some level of accuracy. But as you start using the system more often, the system learns from your behaviour and it will adapt its own learning models to fit you,” said Jafari.
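The per-user adaptation Jafari describes can be illustrated with a deliberately simple scheme (an assumption for illustration only; the article does not specify the team's learning models): start from generic class prototypes and nudge them toward each confirmed example from the wearer, so recognition gradually fits that person's signing style.

```python
import numpy as np

class AdaptiveRecognizer:
    """Toy nearest-centroid recognizer whose per-sign prototypes drift
    toward confirmed examples from the wearer over time."""

    def __init__(self, prototypes, rate=0.1):
        # prototypes: dict mapping sign label -> initial feature vector
        self.prototypes = {k: np.asarray(v, float) for k, v in prototypes.items()}
        self.rate = rate  # how quickly prototypes move toward the user

    def predict(self, x):
        # Pick the sign whose prototype is nearest to the feature vector.
        x = np.asarray(x, float)
        return min(self.prototypes,
                   key=lambda k: np.linalg.norm(x - self.prototypes[k]))

    def adapt(self, x, label):
        # Nudge the confirmed sign's prototype toward the observed features.
        p = self.prototypes[label]
        self.prototypes[label] = p + self.rate * (np.asarray(x, float) - p)

# Generic starting prototypes (made-up 2-D features for two signs).
rec = AdaptiveRecognizer({"hello": [0.0, 0.0], "thanks": [1.0, 1.0]})

# This wearer's "hello" consistently lands near [0.3, 0.2]; repeated use
# pulls the generic prototype toward their personal style.
for _ in range(20):
    rec.adapt([0.3, 0.2], "hello")
print(rec.predict([0.3, 0.2]))  # "hello"
```

The real system would operate on much higher-dimensional sensor features, but the principle is the same: the model starts generic and converges toward the individual user.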
The prototype currently uses Bluetooth to send the translated sign language to a computer or smartphone. In the future, the research team hopes to reduce the size of the system so that it is less intrusive on an individual’s arm. They are also looking to build its language proficiency, to translate sentences and phrases instead of words alone. Incorporating a synthetic voice speaker is another area of development, which could help give deaf people a new voice.
There are already several digital apps and services dedicated to helping people with disabilities. The Be My Eyes app, launched earlier this year, is a Danish not-for-profit service for the visually impaired. It connects users with volunteers who can support them via live video calls with everyday tasks, such as reading labels or locating objects.