dc.contributor.advisor | Alam, Md. Ashraful | |
dc.contributor.author | Aurini, Maliha Tasnim | |
dc.contributor.author | Islam, Shitab Mushfiq-ul | |
dc.date.accessioned | 2018-11-14T10:50:51Z | |
dc.date.available | 2018-11-14T10:50:51Z | |
dc.date.copyright | 2018 | |
dc.date.issued | 2018 | |
dc.identifier.other | ID 14101051 | |
dc.identifier.other | ID 14101088 | |
dc.identifier.uri | http://hdl.handle.net/10361/10850 | |
dc.description | This thesis is submitted in partial fulfilment of the requirements for the degree of Bachelor of Science in Computer Science and Engineering, 2018. | en_US |
dc.description | Cataloged from PDF version of thesis. | |
dc.description | Includes bibliographical references (pages 38-41). | |
dc.description.abstract | 360° and regular 2D images look appealing in virtual reality, yet they fail to represent depth or to show how depth can be used to enrich the user's experience of a two-dimensional image. We propose an approach for creating a stereogram from a computer-generated depth map using an approximation algorithm, and then use the resulting stereo pairs to deliver a complete VR experience with forward and backward navigation driven by mobile sensors. First, the image is segmented into two images, from which we generate a disparity map and subsequently a depth image. From the depth image, a stereo pair (the left and right images for the two eyes) is created. The image acquired from this process is then handled by the Cardboard SDK, which provides VR support on Android devices using a Google Cardboard headset. With the VR image displayed in the stereoscopic device, we use the device's accelerometer to detect its movement while head-mounted. Unlike other VR navigation systems (HTC Vive, Oculus) that rely on external sensors, our approach uses the device's built-in sensors for motion processing. Using the accelerometer readings of this movement, the user can move around virtually within the constructed image. The result of this experiment is that the image displayed in VR changes visually according to the viewer's physical movement. | en_US |
dc.description.statementofresponsibility | Maliha Tasnim Aurini | |
dc.description.statementofresponsibility | Shitab Mushfiq-ul Islam | |
dc.format.extent | 41 pages | |
dc.language.iso | en | en_US |
dc.publisher | BRAC University | en_US |
dc.rights | BRAC University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. | |
dc.subject | 3D | en_US |
dc.subject | Motion processing | en_US |
dc.subject | Smart phone | en_US |
dc.subject | Virtual reality | en_US |
dc.subject | Stereogram | en_US |
dc.subject | Depth map | en_US |
dc.subject | Accelerometer sensor | en_US |
dc.subject | Navigation | en_US |
dc.subject.lcsh | Three-dimensional display systems. | |
dc.title | 3D Visualization of 2D/360° image and navigation in virtual reality through motion processing via smart phone sensors | en_US |
dc.type | Thesis | en_US |
dc.contributor.department | Department of Computer Science and Engineering, BRAC University | |
dc.description.degree | B.Sc. in Computer Science and Engineering | |