Blurring of inappropriate scenes in a video using image processing
Abstract
Scenes depicting blood, nudity, gore, drugs, or weapons are considered inappropriate, especially in a developing Muslim country such as Bangladesh, where such content is strongly discouraged for children, the elderly, and heart patients. Keeping this in mind, we propose a model that detects these inappropriate scenes in any video stream and blurs them. The model also reports the percentage of explicit content in an input video file, so that viewers know beforehand whether they want to watch it and parents gain greater control over what their children watch. For nudity detection, fragments of human figures are extracted and compared against a database to decide whether nudity is present. If nudity is present and exceeds a predetermined threshold, the video is classified as pornography, which many people may prefer not to watch. Similarly,
the extracted objects and gore scenes are compared against a database to determine the percentage of inappropriate content in a video. Bringing nudity and gore under one umbrella term, "explicit", our model lets the user know beforehand whether a video is explicit and automatically blurs any such portion of it. Our model uses a convolutional neural network (CNN) and achieves an accuracy of 93%.
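
To make the pipeline concrete, the sketch below shows one plausible frame-level implementation in Python with OpenCV. It is an illustration under stated assumptions, not the paper's actual code: classify_frame is a hypothetical stand-in for the trained CNN, and the threshold and blur-kernel values are assumed. For simplicity the whole frame is blurred when flagged, whereas the model described above blurs the detected portion.

```python
# Minimal sketch of the detect-and-blur pipeline, assuming Python + OpenCV.
# classify_frame, EXPLICIT_THRESHOLD, and BLUR_KERNEL are illustrative
# names introduced here; they are not from the paper.
import cv2
import numpy as np

EXPLICIT_THRESHOLD = 0.5   # assumed per-frame decision threshold
BLUR_KERNEL = (51, 51)     # assumed Gaussian kernel (odd sizes required)

def classify_frame(frame: np.ndarray) -> float:
    """Placeholder for the paper's CNN: should return the probability
    that the frame contains explicit content (nudity or gore)."""
    return 0.0  # stub; replace with a call to the trained model

def process_video(in_path: str, out_path: str) -> float:
    """Blur frames classified as explicit and return the percentage
    of explicit frames, i.e. the video's explicitness score."""
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, (width, height))
    total = explicit = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        total += 1
        if classify_frame(frame) >= EXPLICIT_THRESHOLD:
            explicit += 1
            # Blur the flagged frame; the paper targets the offending
            # portion, but a full-frame blur keeps the sketch simple.
            frame = cv2.GaussianBlur(frame, BLUR_KERNEL, 0)
        writer.write(frame)
    cap.release()
    writer.release()
    return 100.0 * explicit / total if total else 0.0
```

A caller could then warn the user before playback, e.g. `score = process_video("in.mp4", "safe.mp4")`, and display the returned explicitness percentage so the viewer can decide whether to watch.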