We believe that communication is a basic right: the better we communicate and share ideas, the faster we progress as a society. This project aims to bridge the gap between people who are speech-impaired and people who do not know sign language.
Hand gestures, as defined in American Sign Language (ASL), are converted to speech output based on readings from an Inertial Measurement Unit (IMU) and flex sensors.
A three-layer Artificial Neural Network classifies the gestures; it is implemented with Keras as the front end and Theano as the back end. The processing is done on a BeagleBone Black.
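A network of this kind might be sketched as follows with the Keras Sequential API. The exact sensor count, class count, and layer widths are assumptions for illustration (here, 5 flex-sensor readings plus 6 IMU values feeding 26 gesture classes), not the project's actual configuration:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Input, Dense

# Illustrative dimensions (assumed, not from the project):
# 5 flex sensors + 6 IMU values (3-axis accel + 3-axis gyro) per sample,
# one output class per ASL letter.
N_FEATURES = 11
N_CLASSES = 26

def build_model():
    # Three fully connected layers: two hidden ReLU layers
    # and a softmax output over the gesture classes.
    model = Sequential([
        Input(shape=(N_FEATURES,)),
        Dense(64, activation="relu"),
        Dense(32, activation="relu"),
        Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
# One gesture sample: a raw sensor vector (random here, for illustration).
sample = np.random.rand(1, N_FEATURES).astype("float32")
probs = model.predict(sample, verbose=0)
print(probs.shape)  # one probability per gesture class
```

After training on labeled glove readings, the predicted class index would be mapped to a word or letter and passed to a text-to-speech engine for the spoken output.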
The project is an effort to assist the roughly 70 million people worldwide who cannot communicate vocally.