Show simple item record

dc.contributor.advisor: Alam, Dr. Md. Ashraful
dc.contributor.author: Aurini, Maliha Tasnim
dc.contributor.author: Islam, Shitab Mushfiq-ul
dc.date.accessioned: 2018-11-14T10:50:51Z
dc.date.available: 2018-11-14T10:50:51Z
dc.date.copyright: 2018
dc.date.issued: 2018
dc.identifier.other: ID 14101051
dc.identifier.other: ID 14101088
dc.identifier.uri: http://hdl.handle.net/10361/10850
dc.description: This thesis is submitted in partial fulfilment of the requirements for the degree of Bachelor of Science in Computer Science and Engineering, 2018.
dc.description: Cataloged from PDF version of thesis.
dc.description: Includes bibliographical references (pages 38-41).
dc.description.abstract: 360° and regular 2D images look appealing in virtual reality, yet they fail to represent depth or to show how depth can be used to give the user a richer experience of a two-dimensional image. We propose an approach that creates a stereogram from a computer-generated depth map using an approximation algorithm, and then uses these stereo pairs to deliver a complete VR experience with forward and backward navigation driven by mobile sensors. First, the image is segmented into two images, from which we generate a disparity map and, from that, a depth image. From the depth image, a stereo pair (the left and right images for the two eyes) is created. The image acquired from this process is then handled by the Cardboard SDK, which provides VR support on Android devices through the Google Cardboard headset. With the VR image shown in the stereoscopic device, we use the device's accelerometer to detect its movement while head-mounted. Unlike other VR navigation systems (HTC Vive, Oculus) that rely on external sensors, our approach uses the device's built-in sensors for motion processing. From the accelerometer readings produced by this movement, the user is able to move around virtually within the constructed image. The result of this experiment is that the image displayed in VR changes visually according to the viewer's physical movement.
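The stereo-pair step the abstract describes (shifting pixels horizontally in proportion to their depth to synthesize left- and right-eye views) can be sketched as below. This is a minimal illustration, not the thesis's actual algorithm: the function name, the linear depth-to-disparity mapping, and the `max_shift` parameter are all assumptions made for the example.

```python
import numpy as np

def depth_to_stereo_pair(image, depth, max_shift=8):
    """Synthesize a left/right stereo pair from a single grayscale
    image and a depth map.

    image: (H, W) array of pixel intensities
    depth: (H, W) array in [0, 1], where 1.0 is nearest to the viewer
    max_shift: disparity in pixels assigned to the nearest depth
               (illustrative choice, not from the thesis)
    """
    h, w = image.shape
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    # Per-pixel horizontal disparity: nearer pixels shift more.
    shift = (depth * max_shift).astype(int)
    for y in range(h):
        for x in range(w):
            s = shift[y, x]
            xl = min(w - 1, x + s)  # pixel moves right in the left-eye view
            xr = max(0, x - s)      # pixel moves left in the right-eye view
            left[y, xl] = image[y, x]
            right[y, xr] = image[y, x]
    return left, right
```

With a flat (all-zero) depth map the two views are identical; any depth variation produces a horizontal offset between them, which the stereoscopic headset fuses into a perception of depth. A real implementation would also fill the occlusion gaps this naive forward-mapping leaves behind.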
dc.description.statementofresponsibility: Maliha Tasnim Aurini
dc.description.statementofresponsibility: Shitab Mushfiq-ul Islam
dc.format.extent: 41 pages
dc.language.iso: en
dc.publisher: BRAC University
dc.rights: BRAC University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission.
dc.subject: 3D
dc.subject: Motion processing
dc.subject: Smart phone
dc.subject: Virtual reality
dc.subject: Stereogram
dc.subject: Depth map
dc.subject: Accelerometer sensor
dc.subject: Navigation
dc.subject.lcsh: Three-dimensional display systems.
dc.title: 3D Visualization of 2D/360° image and navigation in virtual reality through motion processing via smart phone sensors
dc.type: Thesis
dc.contributor.department: Department of Computer Science and Engineering, BRAC University
dc.description.degree: B. Computer Science and Engineering

