Image translation of Bangla and English sign language to written language using convolutional neural network
Abstract
One particular trait that differentiates humans from other species is the ability to
communicate. To communicate with one another, humans invented languages. There
are about 6,500 languages in the world, allowing people of different regions to
communicate with each other. Among them, English has been established as a global
language. As Bangladeshis, we use Bengali as our mother tongue and primary language
to express our thoughts and feelings. However, many people with hearing or speech
impairments cannot express themselves through spoken language. For them,
sign language was developed. Expressing oneself through signs is a form of
nonverbal communication performed mainly with body movements, the hands in
particular. Like English, our mother language Bengali has its own sign language,
consisting of 36 alphabet symbols with its own grammar and lexicon.
To enable two-way communication and better mutual understanding through sign
language, this thesis uses real-world pictures of Bangladeshi Sign Language to
train an algorithm that converts sign language into written language using a
sequential convolutional neural network. The system can detect both ASL and BdSL
against any background, with accuracies of 95.23% and 98.45%, respectively.
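The classifier described above can be sketched in outline. This is a minimal illustration of a sequential CNN for 36-class sign-image classification, not the thesis's reported architecture: the layer counts, filter sizes, and 64x64 input resolution are assumptions for the sketch.

```python
# Minimal sketch (assumed architecture, NOT the thesis's exact model):
# a sequential CNN that maps hand-sign images to the 36 BdSL alphabet
# classes mentioned in the abstract.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 36          # BdSL alphabet symbols (per the abstract)
IMG_SHAPE = (64, 64, 3)   # assumed input resolution

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=IMG_SHAPE),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# A forward pass on a dummy image yields one probability per class.
probs = model.predict(np.random.rand(1, *IMG_SHAPE).astype("float32"))
print(probs.shape)  # (1, 36)
```

In practice the model would be trained with `model.fit` on labeled BdSL/ASL image datasets; the softmax output gives a probability for each alphabet symbol, and the argmax is taken as the predicted letter.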