dc.contributor.advisor | Uddin, Jia | |
dc.contributor.author | Tan, Tamkin Mahmud | |
dc.contributor.author | Mondol, Anna Mary | |
dc.contributor.author | Nawal, Noshin | |
dc.contributor.author | Ahmed, Sabbir | |
dc.date.accessioned | 2020-01-20T05:23:14Z | |
dc.date.available | 2020-01-20T05:23:14Z | |
dc.date.copyright | 2019 | |
dc.date.issued | 2019-08 | |
dc.identifier.other | ID 15301040 | |
dc.identifier.other | ID 15301056 | |
dc.identifier.other | ID 15301077 | |
dc.identifier.other | ID 15301079 | |
dc.identifier.uri | http://hdl.handle.net/10361/13638 | |
dc.description | This thesis is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Computer Science, 2019. | en_US |
dc.description | Cataloged from PDF version of thesis. | |
dc.description | Includes bibliographical references (pages 34-36). | |
dc.description.abstract | Sign language is used by hearing- and speech-impaired people to convey their
messages to others, but this gesture-based language is difficult for people unfamiliar
with it to understand. Instantaneous feedback on sign language can significantly
improve comprehension. In this paper, we propose a system that detects Bangla Sign
Language using a digital motion sensor, the Leap Motion Controller. This device can
detect the 3D motion of hands, fingers and finger-like objects without any physical
contact. A sign language recognition system must be designed to recognize hand
gestures. In such a system, gestures are defined as specific patterns or movements of
the hands that convey an expression. The system maintains a library of gesture
datasets against which user-given gestures are matched. The sequences of data
captured by the Leap Motion Controller are compared with these datasets to find the
optimal match, which forms the output. The result is then displayed as text on the
screen. For our system, we chose the $P Point-Cloud Recognizer algorithm to match
the input data with our datasets. This recognition algorithm was designed for rapid
prototyping of gesture-based user interfaces and delivers an average accuracy of over
99% in user-dependent testing. Our proposed model is designed so that hearing- and
speech-impaired people can communicate easily and efficiently with others. | en_US |
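The abstract names the $P Point-Cloud Recognizer and greedy cloud matching but gives no implementation details, so the following is a minimal, hypothetical Python sketch of $P-style matching as described in the original $P recognizer literature, not the thesis authors' code. The function names (`resample`, `normalize`, `greedy_cloud_match`, `recognize`), the 2D point representation, and the 32-point resampling size are assumptions; Leap Motion frames are 3D, so real input would need to be projected or the distance generalized to 3D.

```python
import math

N = 32  # assumed number of points each gesture cloud is resampled to


def resample(points, n=N):
    """Resample a stroke (list of (x, y) tuples) into n roughly evenly spaced points."""
    pts = list(points)
    path_len = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    interval = path_len / (n - 1) if path_len > 0 else 1.0
    resampled, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)   # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(resampled) < n:  # guard against rounding shortfall
        resampled.append(pts[-1])
    return resampled[:n]


def normalize(points):
    """Scale the cloud by its bounding box and translate its centroid to the origin."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]


def cloud_distance(a, b, start):
    """Greedy alignment cost from cloud a to cloud b, beginning at index `start`."""
    matched = [False] * len(b)
    total, i = 0.0, start
    for _ in range(len(a)):
        # greedily pair the current point of a with its nearest unmatched point of b
        best_j, best_d = -1, float("inf")
        for j, used in enumerate(matched):
            if not used:
                d = math.dist(a[i], b[j])
                if d < best_d:
                    best_d, best_j = d, j
        matched[best_j] = True
        weight = 1 - ((i - start + len(a)) % len(a)) / len(a)  # earlier pairings weigh more
        total += weight * best_d
        i = (i + 1) % len(a)
    return total


def greedy_cloud_match(a, b):
    """$P-style distance: try several start indices and both matching directions."""
    step = max(1, int(len(a) ** 0.5))
    return min(min(cloud_distance(a, b, s), cloud_distance(b, a, s))
               for s in range(0, len(a), step))


def recognize(candidate, templates):
    """Return the label of the stored template cloud closest to the candidate gesture."""
    c = normalize(resample(candidate))
    return min(templates, key=lambda label: greedy_cloud_match(c, templates[label]))
```

In this sketch `templates` would be a dictionary mapping each sign label to an already resampled and normalized point cloud built from the gesture library, and `candidate` would be the point sequence captured from a Leap Motion frame; the recognized label could then be rendered as text, as the abstract describes.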
dc.description.statementofresponsibility | Tamkin Mahmud Tan | |
dc.description.statementofresponsibility | Anna Mary Mondol | |
dc.description.statementofresponsibility | Noshin Nawal | |
dc.description.statementofresponsibility | Sabbir Ahmed | |
dc.format.extent | 36 pages | |
dc.language.iso | en | en_US |
dc.publisher | Brac University | en_US |
dc.rights | Brac University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. | |
dc.subject | Bangla sign language | en_US |
dc.subject | Leap motion controller | en_US |
dc.subject | Machine learning | en_US |
dc.subject | HCI | en_US |
dc.subject | Greedy cloud match | en_US |
dc.subject | Gesture recognition | en_US |
dc.title | Bangla sign language recognition using leap motion sensor | en_US |
dc.type | Thesis | en_US |
dc.description.degree | B. Computer Science | |