A novel lightweight CNN approach for Bangladeshi sign language gesture recognition
Abstract
Speech and hearing impairments affect 6.9% of Bangladesh’s population. People with these conditions cannot communicate vocally with others or
hear what is said to them, so they rely on nonverbal means of communication. For such persons, sign language is a common mode of communication in
which they convey meaning through various hand gestures and motions. The biggest problem is that not everyone understands sign language. Many
people cannot converse using sign language, making communication between them
problematic. Even though translators and interpreters are available to assist with
communication, a more straightforward method is required. To close this gap, we propose a method
that combines deep learning with computer vision techniques to detect
and classify Bangla sign language. Our custom-made CNN model
can recognize and classify Bangla sign language characters from the Ishara-Lipi
dataset with a testing accuracy of 99.21%. To recognize the precise shape of a
hand gesture and interpret its meaning, we trained our model on sufficient
samples by preprocessing the Ishara-Lipi dataset and expanding it with various
data augmentation techniques.
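The augmentation step mentioned above can be sketched as follows. This is an illustrative example only: the specific transforms shown (small pixel shifts, 90-degree rotation, mild Gaussian noise) are assumptions, not the paper's exact pipeline, and the function name `augment` is hypothetical.

```python
import numpy as np

def augment(image: np.ndarray, rng: np.random.Generator) -> list:
    """Return simple augmented variants of a grayscale gesture image.

    Illustrative transforms only; a real pipeline for sign-language
    images would be tuned so that augmentations do not change the
    meaning of the gesture (e.g. horizontal flips are avoided here,
    since mirroring can alter a sign).
    """
    variants = [image]
    # shift the image a few pixels down and right
    variants.append(np.roll(image, shift=(3, 3), axis=(0, 1)))
    # rotate by 90 degrees
    variants.append(np.rot90(image))
    # add mild Gaussian pixel noise, clipped back to valid range
    noisy = image + rng.normal(0.0, 5.0, size=image.shape)
    variants.append(np.clip(noisy, 0.0, 255.0))
    return variants

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)
aug = augment(img, rng)
print(len(aug))  # 4 variants per input image
```

Each input image yields several variants, multiplying the effective size of a small dataset such as Ishara-Lipi before training.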