
    An efficient interpretation model for people with hearing and speaking disabilities

    View/Open
    11201045,13201067,13301101_CSE.pdf (762.2Kb)
    Date
    2017-08-22
    Publisher
    BRAC University
    Author
    Sayeed, M. M. Mahmud
    Hossain, Anisha Anjum
    Priya, Samrin
    URI
    http://hdl.handle.net/10361/8698
    Abstract
    Sign language is a medium of communication for individuals with hearing and speaking disabilities, commonly referred to as deaf and mute. For better, more effective, and simpler communication between the vocal and non-vocal communities, it is important that each can understand the other's language without difficulty. Previous research has demonstrated several efficient methods of interaction between these two groups; however, such work tends to focus on one-sided conversation only. Our paper focuses on two key tasks: converting American Sign Language (ASL) to text word by word, and converting audio to gestures and text. We used the microphone and camera of a Microsoft Kinect sensor, together with a standard recognition algorithm, to detect and recognize sign language and interpret hand shapes as ASL text. Our aim was to build an efficient system that can serve as an interpreter between hearing-impaired and hearing people. Beyond interpretation, this research may also open doors to numerous other applications, such as sign language tutorials.
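    The sketch below is purely illustrative and is not taken from the thesis: it outlines, in Python, the two directions of interpretation the abstract describes (sign to text, and audio to text and gesture). The recognizer, speech-to-text, and gesture-lookup functions are hypothetical placeholders standing in for the Kinect-based components.

    # Illustrative sketch only (not the authors' implementation): the two
    # directions of interpretation described in the abstract, with the
    # Kinect-specific pieces replaced by hypothetical placeholder functions.
    from typing import Callable, List

    def recognize_sign(frame: bytes) -> str:
        """Hypothetical classifier: maps one camera frame to an ASL word."""
        raise NotImplementedError("replace with a trained gesture recognizer")

    def speech_to_text(audio_chunk: bytes) -> str:
        """Hypothetical speech recognizer: maps a microphone chunk to text."""
        raise NotImplementedError("replace with a speech-recognition backend")

    def interpret_signs(frames: List[bytes]) -> str:
        """Sign-to-text direction: convert signed frames to text, word by word."""
        return " ".join(recognize_sign(f) for f in frames)

    def interpret_speech(chunks: List[bytes],
                         text_to_gesture: Callable[[str], str]) -> List[str]:
        """Audio-to-gesture-and-text direction: transcribe speech, then map
        each word to a displayable gesture (e.g. an image or animation id)."""
        words = " ".join(speech_to_text(c) for c in chunks).split()
        return [text_to_gesture(w) for w in words]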
    Keywords
    American sign language; Sign language; Gesture and text conversion; Gesture and image conversion; Microsoft Kinect sensor
     
    Description
    This thesis report is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Computer Science and Engineering, 2017.
     
    Cataloged from PDF version of thesis report.
     
    Includes bibliographical references (pages 29-30).
    Department
    Department of Computer Science and Engineering, BRAC University
    Collections
    • Thesis & Report, BSc (Computer Science and Engineering)

    Copyright © 2008-2019 Ayesha Abed Library, Brac University