Basic Bangla Sign Language recognition and sentence building using Microsoft Kinect
Abstract
How do deaf or mute people communicate with each other? Or, an even bigger question, how
do they make other people understand their messages? The answer is sign language.
However, it is not that simple: most people do not know sign languages. To improve this situation,
much research has been conducted on translating sign language into spoken language. The
main purpose of our thesis was to build such a system for the Bangla language.
When signing, a person mostly uses the hands and head; facial expression is also
important. In our system, we tracked the signer's skeleton with the help of the Kinect. While a
gesture is being made, the system performs hand segmentation, identifies the fingers, and counts
the number of fingers the signer has used with the K-curvature algorithm. To determine the
movement, the skeleton joint information is checked against the data set for each gesture. Finally,
the system shows the output whose conditions are matched. Thus, the system recognizes
sign language gestures.
Keywords: Bangla Sign Language, Kinect, Skeleton Tracking, Finger Identification, K-curvature Algorithm.
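
As an illustration of the finger-counting step described above, the following sketch shows one common way the K-curvature test can be applied to a hand contour. It is a minimal, hypothetical example rather than the thesis implementation: the contour is assumed to have already been segmented (for example, from the Kinect depth map), and the window size k and the angle threshold are illustrative parameter choices.

import math

def k_curvature_fingertips(contour, k=20, angle_threshold_deg=60.0):
    # contour: ordered list of (x, y) points along the hand boundary,
    # assumed to be already segmented from the depth image.
    # k and angle_threshold_deg are illustrative, hypothetical parameters.
    n = len(contour)
    candidates = []
    for i in range(n):
        p = contour[i]
        a = contour[(i - k) % n]   # point k steps behind on the contour
        b = contour[(i + k) % n]   # point k steps ahead on the contour
        v1 = (a[0] - p[0], a[1] - p[1])
        v2 = (b[0] - p[0], b[1] - p[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(v1[0], v1[1]) * math.hypot(v2[0], v2[1])
        if norm == 0:
            continue
        # Angle between the two vectors; a sharp (small) angle marks a peak.
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
        if angle < angle_threshold_deg:
            candidates.append(i)
    return candidates

Note that sharp valleys between the fingers also pass this angle test, so practical implementations usually separate peaks from valleys (for example, by checking convexity with respect to the palm centre) and cluster adjacent candidate indices into a single fingertip before counting the fingers.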