Show simple item record

dc.contributor.advisor	Reza, Md Tanzim
dc.contributor.author	Rakshit, Rakesh
dc.contributor.author	Tusher, Mohammed Fackruddin
dc.contributor.author	Moyen, Mobashir Mahmud Faisal
dc.contributor.author	Tahmid, MD. Rahik
dc.date.accessioned	2025-01-14T05:12:49Z
dc.date.available	2025-01-14T05:12:49Z
dc.date.copyright	©2024
dc.date.issued	2024-10
dc.identifier.other	ID 19101588
dc.identifier.other	ID 19301120
dc.identifier.other	ID 19301116
dc.identifier.other	ID 19301269
dc.identifier.uri	http://hdl.handle.net/10361/25153
dc.description	This thesis is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Computer Science, 2024.	en_US
dc.description	Cataloged from PDF version of thesis.
dc.description	Includes bibliographical references (pages 41-44).
dc.description.abstract	This research explores the fusion of deep learning and thermal imaging for sign language recognition, focusing on advancing communication systems for the hearing impaired. Sign language recognition is a crucial technology for improving accessibility, and the use of thermal images opens up new possibilities for gesture recognition under varied lighting conditions. To address the challenges of recognizing Bangla sign language, we collected and built a novel thermal image dataset using the ABF Astron Infrared Camera F3.20. Additionally, we created 49 distinct classes of Bangla sign gestures using a colormap-filtered version of the dataset to enhance feature visibility. The study employed transfer learning to retrain pre-existing neural networks on both the colormap and thermal datasets. Three prominent deep learning models, ResNet, DenseNet, and Inception, were selected for this task due to their proven effectiveness in image classification. These models were first trained on the colormap dataset; we then tested them on the original thermal dataset, using transfer learning to refine the learned features. This process showed the potential for improved gesture recognition when utilizing both thermal and color-mapped images. The results highlight the importance of combining transfer learning with advanced neural networks to enhance recognition systems. With accuracies of 95%, 90%, and 98% across the different models, the findings demonstrate the promising application of thermal imaging in improving the reliability and accessibility of sign language recognition technologies. This approach could offer more robust solutions for real-time sign language translation, particularly in challenging environmental conditions.	en_US
dc.description.statementofresponsibility	Rakesh Rakshit
dc.description.statementofresponsibility	Mohammed Fackruddin Tusher
dc.description.statementofresponsibility	Mobashir Mahmud Faisal Moyen
dc.description.statementofresponsibility	MD. Rahik Tahmid
dc.format.extent	56 pages
dc.language.iso	en	en_US
dc.publisher	Brac University	en_US
dc.rights	Brac University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission.
dc.subject	Sign language	en_US
dc.subject	Thermal imaging	en_US
dc.subject	CNN	en_US
dc.subject	Convolutional neural network	en_US
dc.subject	Bengali language	en_US
dc.subject	Transfer learning
dc.subject.lcsh	Sign language--Bengali language.
dc.subject.lcsh	Neural networks (Computer science).
dc.subject.lcsh	Human-computer interaction.
dc.title	Improving Bangla sign language detection from thermal imagery: leveraging thermal heatmap-induced transfer learning and deep neural networks	en_US
dc.type	Thesis	en_US
dc.contributor.department	Department of Computer Science and Engineering, Brac University
dc.description.degree	B.Sc. in Computer Science

