Show simple item record

dc.contributor.advisor  Uddin, Dr. Jia
dc.contributor.author  Tofa, Kamrun Nahar
dc.contributor.author  Ahmed, Farhana
dc.contributor.author  Shakil, Arif
dc.date.accessioned  2018-02-15T04:12:07Z
dc.date.available  2018-02-15T04:12:07Z
dc.date.copyright  2017
dc.date.issued  12/14/2017
dc.identifier.other  ID 14101063
dc.identifier.other  ID 14101069
dc.identifier.other  ID 14101031
dc.identifier.uri  http://hdl.handle.net/10361/9469
dc.description  Cataloged from PDF version of thesis report.
dc.description  Includes bibliographical references (pages 33-35).
dc.description  This thesis report is submitted in partial fulfilment of the requirements for the degree of Bachelor of Science in Computer Science and Engineering, 2017.  en_US
dc.description.abstract  In this paper, we propose a model to detect inappropriate scenes, such as nudity, weapons, danger, drugs, and gore, in any video stream. Detection proceeds in several steps. First, the video stream is converted into its individual frames. After this fragmentation is complete, image detection algorithms are applied to the extracted frames to find the targeted inappropriate scenes. We fuse three different algorithms for this task. Nudity detection uses two of them: the first locates human figures in the fragmented frames and, if any are found, crops them out into a separate image file containing only the cropped region; the second then uses this cropped image to determine whether nudity is present in the original, uncropped image. To detect dangerous objects such as knives, guns, and swords, and to detect gore and bloody scenes in the fragmented frames, an object detection algorithm is used. For object detection, we use a CNN-based object detection algorithm [1,2] to detect objects and scenes, and for nudity detection, we use the nudepy library for Python, which detects skin-colored pixels and identifies nudity based on their count and region [18]. After successful detection by these algorithms, the model outputs the percentage of violent scenes and nudity present in the video sequence. Furthermore, the model classifies the video as pornography if the detected percentage of nudity exceeds our base scale for what percentage of nudity in a video may be considered pornography.  en_US
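The frame-level pipeline the abstract describes (fragment the video, run per-frame detectors, aggregate into percentages, then apply a pornography threshold) can be sketched as follows. This is a minimal illustration, not the thesis code: frame decoding (e.g. via OpenCV's `cv2.VideoCapture`) and the actual detectors (the CNN object detector [1,2] and the nudepy skin-pixel classifier [18]) are stubbed out as callables, and the `porn_threshold` of 0.5 is a placeholder for the report's unstated base scale.

```python
# Sketch of the fragment-and-classify pipeline from the abstract.
# Per-frame detectors are passed in as callables so the aggregation
# logic can be shown without the CNN or nudepy dependencies.

def classify_video(frames, detect_nudity, detect_violence, porn_threshold=0.5):
    """Aggregate per-frame detections into video-level percentages.

    frames          : iterable of decoded video frames (any representation)
    detect_nudity   : callable(frame) -> bool  (stand-in for nudepy)
    detect_violence : callable(frame) -> bool  (stand-in for the CNN detector)
    porn_threshold  : fraction of nude frames above which the video is
                      labelled pornography (placeholder for the base scale)
    """
    frames = list(frames)
    if not frames:
        return {"nudity_pct": 0.0, "violence_pct": 0.0, "is_pornography": False}
    nude_count = sum(1 for f in frames if detect_nudity(f))
    violent_count = sum(1 for f in frames if detect_violence(f))
    nudity_pct = 100.0 * nude_count / len(frames)
    violence_pct = 100.0 * violent_count / len(frames)
    return {
        "nudity_pct": nudity_pct,
        "violence_pct": violence_pct,
        "is_pornography": nudity_pct / 100.0 > porn_threshold,
    }

# Toy usage with stubbed detectors: frames are ints, odd frames "contain
# nudity", frames 8-9 "contain violence".
result = classify_video(range(10), lambda f: f % 2 == 1, lambda f: f >= 8)
print(result)
```

The verdict depends only on the fraction of flagged frames, which matches the abstract's description of reporting "how much" nudity and violence the sequence contains before thresholding.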
dc.description.statementofresponsibility  Kamrun Nahar Tofa
dc.description.statementofresponsibility  Farhana Ahmed
dc.description.statementofresponsibility  Arif Shakil
dc.format.extent  35 pages
dc.language.iso  en  en_US
dc.publisher  BRAC University  en_US
dc.rights  BRAC University thesis reports are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission.
dc.subject  Video stream  en_US
dc.subject  Inappropriate scene  en_US
dc.subject  Scene detection  en_US
dc.subject  CNNs  en_US
dc.title  Inappropriate scene detection in a video stream  en_US
dc.type  Thesis  en_US
dc.contributor.department  Department of Computer Science and Engineering, BRAC University
dc.description.degree  B. Computer Science and Engineering

