Bengali hand sign language recognition using convolutional neural networks
Abstract
The deaf and mute population is growing steadily worldwide. Bangladesh alone has
around 2.6 million individuals who cannot communicate with society through spoken
language. In countries such as Bangladesh, these individuals are often harshly
ostracized; building a system that lets them communicate with anyone, regardless of
whether the other person knows sign language, is therefore a goal worth pursuing.
Our system makes use of
convolutional neural networks (CNN) to learn from the images in our dataset and
detect hand signs from input images. We trained our system with the Inception V3
and VGG16 image recognition models, both with and without ImageNet weights. Because
the initial accuracy was poor, we set a checkpoint to save the best weights obtained
during training, which improved accuracy. Inputs are taken from a live video feed,
from which frames are extracted for recognition. The system then segments the hand
sign from each frame and passes it to the model, which predicts the corresponding
Bangla letter. After training and testing the model on our dataset, we achieved an
average accuracy of 99%. We hope to improve the system further so that communication
between the deaf/mute community and the rest of society becomes as effortless as
possible.
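The transfer-learning and checkpointing setup described above can be sketched in Keras. This is a minimal illustration, not the paper's actual code: the class count, input size, and head layers are assumptions, and the base is built with `weights=None` (the abstract also reports runs initialized with ImageNet weights, which would use `weights="imagenet"` instead).

```python
# Sketch of the described setup: VGG16 base, a small classification head,
# and a ModelCheckpoint that keeps only the best weights seen so far.
# NUM_CLASSES and layer sizes are illustrative assumptions, not the paper's.
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, Flatten, Input
from tensorflow.keras.models import Model
from tensorflow.keras.callbacks import ModelCheckpoint

NUM_CLASSES = 38  # assumed number of Bangla sign classes; adjust to the dataset

# Base network without its original top; weights=None trains from scratch
# (use weights="imagenet" for the ImageNet-initialized runs).
base = VGG16(weights=None, include_top=False,
             input_tensor=Input(shape=(224, 224, 3)))
x = Flatten()(base.output)
x = Dense(256, activation="relu")(x)
out = Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(inputs=base.input, outputs=out)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])

# Save only the weights with the best validation accuracy, as in the abstract.
checkpoint = ModelCheckpoint("best_weights.h5", monitor="val_accuracy",
                             save_best_only=True, mode="max")
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels),
#           epochs=30, callbacks=[checkpoint])
```

With `save_best_only=True`, the checkpoint overwrites the saved file only when validation accuracy improves, so the final file always holds the best weights from the run rather than the last epoch's.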