Show simple item record

dc.contributor.advisor	Hossain, Dr. Muhammad Iqbal
dc.contributor.author	Rahman, Masroor
dc.contributor.author	Navid, Reshad Karim
dc.contributor.author	Hossain Bhuyain, Md Muballigh
dc.contributor.author	Hasan, Farnaz Fawad
dc.contributor.author	Nup, Naima Ahmed
dc.date.accessioned	2023-08-08T05:20:49Z
dc.date.available	2023-08-08T05:20:49Z
dc.date.copyright	2023
dc.date.issued	2023-01
dc.identifier.other	ID: 19101213
dc.identifier.other	ID: 19101225
dc.identifier.other	ID: 19101289
dc.identifier.other	ID: 19101579
dc.identifier.other	ID: 19101430
dc.identifier.uri	http://hdl.handle.net/10361/19351
dc.description	This thesis is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Computer Science and Engineering, 2023.	en_US
dc.description	Cataloged from PDF version of thesis.
dc.description	Includes bibliographical references (pages 41-43).
dc.description.abstract	The use of machine learning models has greatly enhanced the capability to recognize patterns and draw conclusions. However, due to their black-box nature, it can be difficult to comprehend the factors that affect their decisions. Explainable Artificial Intelligence (XAI) methods offer transparency into these models and aid in enhancing comprehension, examination, and trust in their outcomes. In this paper, we present a study on the use of machine learning (ML) models for intrusion detection on the Windows 10 operating system using the ToN-IoT dataset. We investigate the performance of different ML models, including the tree-based Decision Tree (DT) and Random Forest (RF) classifiers as well as Logistic Regression (LR) and K-Nearest Neighbors (KNN), in detecting these attacks. Furthermore, we use XAI techniques to understand how the attacks influence processes in Windows 10 systems and how they can be identified and prevented. Our study highlights the importance of using XAI techniques to make ML models more interpretable and trustworthy in high-stakes applications such as intrusion detection. We believe that this work can contribute to the development of more robust and secure operating systems.	en_US
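
As an illustration of the workflow the abstract describes, the sketch below trains the four model families it names (DT, RF, LR, KNN) on a ToN-IoT Windows 10 CSV and explains the random-forest predictions with SHAP. The file name, the "label"/"type" column names, and the choice of SHAP as the XAI technique are assumptions made for illustration; they are not details taken from the thesis itself.

```python
# Illustrative sketch only (not the thesis code). Assumptions: file name,
# "label"/"type" column names, and SHAP as the XAI method.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("ToN_IoT_Windows10.csv")          # hypothetical file name
y = df["label"]                                    # assumed binary attack label
X = pd.get_dummies(df.drop(columns=["label", "type"], errors="ignore")).fillna(0)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# The four model families named in the abstract.
models = {
    "DT": DecisionTreeClassifier(random_state=42),
    "RF": RandomForestClassifier(n_estimators=100, random_state=42),
    "LR": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name} test accuracy: {model.score(X_test, y_test):.3f}")

# Explain the random-forest model with SHAP (one common XAI technique).
explainer = shap.TreeExplainer(models["RF"])
shap_values = explainer.shap_values(X_test.iloc[:200])
shap.summary_plot(shap_values, X_test.iloc[:200])
```

The per-feature SHAP summary is one way to surface which Windows 10 process features drive an "attack" prediction, which is the kind of interpretability the abstract argues for.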
dc.description.statementofresponsibility	Masroor Rahman
dc.description.statementofresponsibility	Reshad Karim Navid
dc.description.statementofresponsibility	Md Muballigh Hossain Bhuyain
dc.description.statementofresponsibility	Farnaz Fawad Hasan
dc.description.statementofresponsibility	Naima Ahmed Nup
dc.format.extent	43 pages
dc.language.iso	en	en_US
dc.publisher	Brac University	en_US
dc.rights	Brac University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission.
dc.subject	Machine learning	en_US
dc.subject	Explainable Artificial Intelligence (XAI)	en_US
dc.subject	ToN-IoT	en_US
dc.subject	Windows OS	en_US
dc.subject	Data analysis	en_US
dc.subject	Intrusion detection	en_US
dc.subject.lcsh	Neural networks (Computer science)
dc.subject.lcsh	Artificial intelligence
dc.subject.lcsh	Machine learning
dc.title	Exploring the intersection of machine learning and explainable artificial intelligence: An analysis and validation of ML models through XAI for intrusion detection	en_US
dc.type	Thesis	en_US
dc.contributor.department	Department of Computer Science and Engineering, Brac University
dc.description.degree	B.Sc. in Computer Science and Engineering

