A smart avatar tutor for mimicking characters of Bangla sign language
Abstract
This paper presents a novel approach to developing a smart avatar tutor for
teaching Bangla Sign Language (BSL). The avatar is designed to mimic the
numerals and letters used in BSL, providing an interactive and engaging tool for
learners. The system employs image processing and machine learning techniques to
identify and analyze hand gestures and signs in a dataset, generating corresponding
animated sequences that the avatar uses to replicate them. We collected hand
gestures and signs and, after preliminary pre-processing of the dataset, used them
to train the system.
system. The avatar’s user interface (UI) is designed to be intuitive and user-friendly,
allowing learners to practice physically to engage their sensorimotor skills, making
the learning more efficient, and receiving instant feedback on their performance
using cosine similarity. The results of a user study, conducted with 10 participants
from the National Federation of the Deaf, showed that the avatar was effective in
improving learners’ BSL skills and was perceived positively in terms of usability
and engagement. Although some research exists on avatar tutors and learning
systems for American Sign Language (ASL), to our knowledge no prior work has
addressed a smart avatar tutor for teaching BSL. The proposed system has the
potential to make BSL learning more accessible
and enjoyable for a wider audience.
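The cosine-similarity feedback mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature vectors, example values, and the 0.9 threshold are all assumptions, standing in for whatever gesture representation (e.g. flattened hand-landmark coordinates) the system actually compares.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors; 1.0 means
    # identical direction, 0.0 means orthogonal (no match).
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical example: a reference sign's feature vector vs. a learner's attempt.
reference = [0.12, 0.85, 0.33, 0.47]
attempt = [0.10, 0.80, 0.35, 0.50]

score = cosine_similarity(reference, attempt)

# A score near 1.0 indicates a close match; a threshold (assumed here
# to be 0.9) could drive the instant "correct / try again" feedback.
feedback = "correct" if score > 0.9 else "try again"
```

In practice the threshold and the choice of feature vector would be tuned on the collected gesture dataset.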