
dc.contributor.advisor: Rasel, Annajiat Alim
dc.contributor.author: Islam, Md. Farhadul
dc.contributor.author: Zabeen, Sarah
dc.contributor.author: Bin Rahman, Fardin
dc.contributor.author: Islam, Md. Azharul
dc.contributor.author: Bin Kibria, Fahmid
dc.date.accessioned: 2024-01-17T08:21:11Z
dc.date.available: 2024-01-17T08:21:11Z
dc.date.copyright: 2023
dc.date.issued: 2023-01
dc.identifier.other: ID: 22341042
dc.identifier.other: ID: 19241004
dc.identifier.other: ID: 20101592
dc.identifier.other: ID: 19301257
dc.identifier.other: ID: 19201063
dc.identifier.uri: http://hdl.handle.net/10361/22179
dc.description: This thesis is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Computer Science, 2023. [en_US]
dc.description: Cataloged from PDF version of thesis.
dc.description: Includes bibliographical references (pages 82-96).
dc.description.abstract: Deep learning technologies have developed at an exponential rate over the years. From Convolutional Neural Networks (CNNs) to Involutional Neural Networks (INNs), there are many neural network (NN) architectures today, including Vision Transformers (ViT), Graph Neural Networks (GNNs), and Recurrent Neural Networks (RNNs). However, these architectures cannot represent uncertainty, which poses a significant difficulty for decision-making, since capturing the uncertainty of these state-of-the-art NN structures would aid in making specific judgments. Dropout is one technique that can be applied within Deep Learning (DL) networks to assess uncertainty: dropout is kept active at the inference phase and several stochastic forward passes are aggregated into an uncertainty estimate for the model. This approach, commonly known as Monte Carlo Dropout (MCD), works well as a low-complexity method to estimate uncertainty. MCD is widely used to measure uncertainty in DL models, but the majority of earlier works focus on a single application. Furthermore, many state-of-the-art (SOTA) NNs remain unexplored with regard to uncertainty evaluation. Therefore, an up-to-date roadmap and benchmark is needed in this field of study. Our study provides a comprehensive analysis of the MCD approach for assessing model uncertainty in neural network models across a variety of datasets. We also include SOTA NNs to explore models that have not yet been examined with respect to uncertainty. In addition, we demonstrate how a model may perform better with less uncertainty by modifying NN topologies, which also reveals the causes of a model's uncertainty. Using the results of our experiments and subsequent enhancements, we discuss the advantages and costs of using MCD in these NN designs. While working toward reliable and robust models, we propose two novel architectures that provide outstanding performance in medical image diagnosis. [en_US]
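
The abstract describes Monte Carlo Dropout as keeping dropout active at inference and aggregating several stochastic forward passes into an uncertainty estimate. Below is a minimal sketch of that idea, assuming PyTorch; the toy classifier, layer sizes, dropout rate, and the 50 forward passes are illustrative assumptions, not details taken from the thesis.

import torch
import torch.nn as nn

# Toy classifier with a dropout layer (hypothetical sizes, not from the thesis).
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),      # stays active at inference for MC Dropout
    nn.Linear(64, 10),
)

def mc_dropout_predict(model, x, passes=50):
    # Switch to train mode so dropout keeps sampling masks at inference,
    # then average softmax outputs over several stochastic forward passes.
    model.train()
    with torch.no_grad():
        probs = torch.stack([torch.softmax(model(x), dim=-1)
                             for _ in range(passes)])
    mean = probs.mean(dim=0)   # predictive mean over passes
    std = probs.std(dim=0)     # spread across passes as an uncertainty signal
    return mean, std

x = torch.randn(8, 20)         # dummy batch of 8 inputs
mean, std = mc_dropout_predict(model, x)

The per-class standard deviation over passes is one simple uncertainty signal; predictive entropy or mutual information computed from the same set of passes are common alternatives.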
dc.description.statementofresponsibility: Md. Farhadul Islam
dc.description.statementofresponsibility: Sarah Zabeen
dc.description.statementofresponsibility: Fardin Bin Rahman
dc.description.statementofresponsibility: Md. Azharul Islam
dc.description.statementofresponsibility: Fahmid Bin Kibria
dc.format.extent: 96 pages
dc.language.iso: en [en_US]
dc.publisher: Brac University [en_US]
dc.rights: Brac University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission.
dc.subject: Deep learning [en_US]
dc.subject: Neural network [en_US]
dc.subject: Monte Carlo dropout [en_US]
dc.subject: Uncertainty [en_US]
dc.subject.lcsh: Neural network.
dc.title: Analysis of uncertainty in different neural network structures using Monte Carlo dropout [en_US]
dc.type: Thesis [en_US]
dc.contributor.department: Department of Computer Science and Engineering, Brac University
dc.description.degree: B.Sc. in Computer Science

