Show simple item record

dc.contributor.advisor: Alam, Golam Rabiul
dc.contributor.author: Islam, Md Rashidul
dc.date.accessioned: 2024-06-04T05:37:07Z
dc.date.available: 2024-06-04T05:37:07Z
dc.date.copyright: ©2023
dc.date.issued: 2023-09
dc.identifier.other: ID 20366008
dc.identifier.uri: http://hdl.handle.net/10361/23110
dc.description: This thesis is submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Science and Engineering, 2023.
dc.description: Cataloged from PDF version of thesis.
dc.description: Includes bibliographical references (pages 42-45).
dc.description.abstract: Predicting and understanding traffic patterns have become important objectives for maintaining the Quality of Service (QoS) standard in network management. This shift stems from analysis of data usage on cellular internet networks. Cellular network optimisers frequently employ a variety of data traffic prediction algorithms for this reason. Traditional traffic projections are often made at a high level, or across generously large regional clusters, and therefore lack forecasting precision. Furthermore, it is difficult to obtain eNodeB-level utilisation information for traffic prediction. As a result, the conventional approach leads to degraded user experience or unnecessary network expansion. One objective of this research was to develop a traffic forecasting model with the aid of multivariate feature inputs and deep learning techniques. It deals with an extensive real-network time series of 6.2 million LTE data traffic records and other associated characteristics, including eNodeB-wise PRB utilisation. A cutting-edge fusion model based on deep learning algorithms is proposed: Long Short-Term Memory (LSTM), Bidirectional LSTM (BiLSTM), and Gated Recurrent Unit (GRU) are combined to enable eNodeB-level traffic forecasting and eNodeB-wise prediction of PRB utilisation. The proposed fusion model achieves an R2 score of 0.8034, outperforming conventional state-of-the-art models. This study also proposes a unique method for the Smart Network Monitor that thoroughly examines individual nodes. This approach guides adjustments to soft-capacity parameters at the eNodeB level, aiming for immediate improvement or long-term network growth to meet a consistent QoS standard; the algorithm relies on expected PRB utilisation.
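The abstract describes fusing forecasts from three recurrent sub-models (LSTM, BiLSTM, GRU) and evaluating the result with an R2 score. As a minimal illustrative sketch only — the thesis's actual network architecture and fusion strategy are not specified here, so the averaging combination and the placeholder per-model predictions below are assumptions — the fusion idea and the R2 metric can be shown without any deep-learning framework:

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination, the metric the thesis reports (0.8034)."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def fuse(predictions):
    """Combine sub-model forecasts by simple averaging.
    This is one possible fusion strategy, assumed for illustration;
    the thesis may combine LSTM/BiLSTM/GRU outputs differently."""
    return np.mean(np.stack(predictions), axis=0)

rng = np.random.default_rng(0)
# Synthetic stand-in for per-eNodeB traffic volumes (not real network data).
y_true = rng.uniform(10.0, 100.0, size=200)
# Placeholders for the three sub-models' forecasts: truth plus
# model-specific noise, standing in for trained LSTM / BiLSTM / GRU outputs.
preds = [y_true + rng.normal(0.0, s, size=200) for s in (8.0, 6.0, 7.0)]

fused = fuse(preds)
print(round(r2_score(y_true, fused), 4))
```

Averaging independent forecast errors shrinks their variance, which is why a fused model can score a higher R2 than any single sub-model on the same data.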
dc.description.statementofresponsibility: Md Rashidul Islam
dc.format.extent: 55 pages
dc.language.iso: en
dc.publisher: Brac University
dc.rights: Brac University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission.
dc.subject: LTE networks
dc.subject: Machine learning
dc.subject: Deep learning
dc.subject: Mobile network capacity
dc.subject: Resource management
dc.subject.lcsh: Long-Term Evolution (Telecommunications)
dc.subject.lcsh: Machine learning
dc.subject.lcsh: Deep learning
dc.subject.lcsh: Resource management
dc.title: A deep dive into node-level analysis with fusion RNN model for smart LTE network monitoring
dc.type: Thesis
dc.contributor.department: Department of Computer Science and Engineering, Brac University
dc.description.degree: M.Sc. in Computer Science

