Translating Sign Language For The ASL-Illiterate
Remember Word Lens, the iPhone app that lets English speakers translate printed text in other languages in real time?
Now an analogous app exists to facilitate communication between deaf people and those who don’t understand sign language. From Popular Science:
> Students at the University of Houston designed a device called MyVoice, which uses a video camera to capture a person's sign language movements. It also contains a small video monitor, a microphone and a speaker. Software processes the images, determines which sign was made, and then translates the word or phrase into speech, which is transmitted through an electronic voice.
>
> The team said the biggest challenge was amassing the images that the app would "read" – 200 to 300 for each sign.
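The pipeline described above – match a captured sign against a bank of reference images, then voice the matched word – can be sketched in miniature. Everything below is illustrative: the feature vectors, the reference data, and the function names are assumptions, since MyVoice's actual implementation is not public. Each "image" is stood in for by a small feature vector, and matching is done by nearest neighbor.

```python
# Hypothetical sketch of a MyVoice-style pipeline (not the real system):
# classify a captured sign against stored reference vectors, then hand
# the matched word to a text-to-speech stage.

from math import dist

# Stand-in for the 200-300 reference images per sign, reduced here to
# tiny illustrative feature vectors.
REFERENCE_SIGNS = {
    "hello": [(0.9, 0.1), (0.8, 0.2)],
    "thanks": [(0.1, 0.9), (0.2, 0.8)],
}

def classify_sign(features):
    """Return the sign whose reference vectors lie nearest the input."""
    best_sign, best_d = None, float("inf")
    for sign, refs in REFERENCE_SIGNS.items():
        d = min(dist(features, r) for r in refs)
        if d < best_d:
            best_sign, best_d = sign, d
    return best_sign

def sign_to_speech(features):
    """Full pipeline step: classify, then pass the word to speech output."""
    word = classify_sign(features)
    return f"speak: {word}"  # stand-in for the device's electronic voice

print(sign_to_speech((0.85, 0.15)))  # → speak: hello
```

A real classifier would work on hundreds of reference images per sign and a far richer feature representation, but the control flow – capture, match, speak – is the same shape.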
This article was originally published on TurnstyleNews.com on June 11, 2012.