Remember Word Lens, the iPhone app that lets English speakers translate other languages in real time?
Now an analogous app exists to facilitate communication between deaf people and those who don’t understand sign language. From Popular Science:
Students at the University of Houston designed a device called MyVoice, which uses a video camera to capture a person’s sign language movements. It also contains a small video monitor, a microphone and a speaker. Software processes the images, determines what was signed, and then speaks the matching word or phrase aloud through an electronic voice.
The team said the biggest challenge was amassing the images that the app would “read” – 200 to 300 for each sign.
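The article doesn’t describe MyVoice’s internals, but the pipeline it outlines — match captured frames against a library of reference images for each sign, then hand the recognized word to a speech engine — can be sketched in a toy form. Everything below is illustrative: the function names, the pixel-string "frames", and the similarity metric are all invented stand-ins, not the team’s actual recognition method.

```python
def similarity(frame, reference):
    """Placeholder metric: fraction of matching positions.
    A real system would use actual image-recognition techniques."""
    matches = sum(a == b for a, b in zip(frame, reference))
    return matches / max(len(frame), 1)

def match_sign(frame, reference_library):
    """Return the sign label whose reference images best match the frame.
    Per the article, MyVoice needed 200-300 reference images per sign."""
    best_label, best_score = None, -1.0
    for label, references in reference_library.items():
        score = max(similarity(frame, ref) for ref in references)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

def sign_to_speech(frames, reference_library, speak):
    """Recognize each captured frame, then voice the resulting phrase."""
    words = [match_sign(f, reference_library) for f in frames]
    phrase = " ".join(words)
    speak(phrase)  # hand off to a text-to-speech engine (hypothetical hook)
    return phrase

# Tiny demo with 4-"pixel" frames standing in for camera images.
library = {"hello": ["0011"], "world": ["1100"]}
spoken = []
sign_to_speech(["0011", "1101"], library, spoken.append)
```

The noisy second frame `"1101"` still resolves to `"world"` because matching is best-score rather than exact, which is why a large reference set per sign matters: more examples per sign make a noisy capture likelier to land on the right label.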
This article was originally published on TurnstyleNews.com on June 11, 2012.