dc.contributor.advisor | Alam, Md. Golam Rabiul | |
dc.contributor.advisor | Hossain, Muhammad Iqbal | |
dc.contributor.author | Mohiuzzaman, Md. | |
dc.contributor.author | Abedin, Ahmad Abrar | |
dc.contributor.author | Rahman, Shadab Afnan | |
dc.contributor.author | Chowdhury, Shafaq Arefin | |
dc.contributor.author | Ahmed, Shahadat | |
dc.date.accessioned | 2025-02-04T03:58:43Z | |
dc.date.available | 2025-02-04T03:58:43Z | |
dc.date.copyright | ©2024 | |
dc.date.issued | 2024-10 | |
dc.identifier.other | ID 20301361 | |
dc.identifier.other | ID 20201080 | |
dc.identifier.other | ID 21101076 | |
dc.identifier.other | ID 21101064 | |
dc.identifier.other | ID 20301481 | |
dc.identifier.uri | http://hdl.handle.net/10361/25284 | |
dc.description | This thesis is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Computer Science, 2024. | en_US |
dc.description | Cataloged from PDF version of thesis. | |
dc.description | Includes bibliographical references (pages 32-33). | |
dc.description.abstract | Federated Learning (FL) is a decentralized machine learning paradigm that enables
training a global model across numerous edge devices while preserving data privacy.
However, FL faces significant challenges, particularly in environments with heterogeneous
hardware capabilities, heavy communication costs, and constrained resources.
In this paper, we introduce a novel framework, TRI-FED-RKD, which incorporates
forward and reverse knowledge distillation (RKD) alongside FedAvg, using
a hybrid architecture of convolutional neural networks (CNNs) and spiking neural
networks (SNNs). Our approach employs a tri-layer hierarchical aggregation
architecture consisting of client devices, intermediate (middle) servers, and a global
server. We compare two federated architectures: standard federated learning and
federated learning with forward and reverse distillation in a hierarchical setting
(TRI-FED-RKD). The same model is used across several datasets so that the
architectures, rather than model performance, are evaluated. Depending on the use case,
the network administrator can choose their own teacher and student models, and the
teacher model can differ across clients if needed; our architecture therefore
accommodates model heterogeneity with respect to teacher models. We evaluate
TRI-FED-RKD on neuromorphic datasets such as DVS Gesture and N-MNIST, as well as
non-neuromorphic datasets such as MNIST, EMNIST, and CIFAR-10.
Furthermore, we show that using forward and reverse knowledge distillation
in federated learning can lead to substantially better performance than federated learning
without knowledge distillation on non-neuromorphic datasets. | en_US |
dc.description.statementofresponsibility | Md. Mohiuzzaman | |
dc.description.statementofresponsibility | Ahmad Abrar Abedin | |
dc.description.statementofresponsibility | Shadab Afnan Rahman | |
dc.description.statementofresponsibility | Shafaq Arefin Chowdhury | |
dc.description.statementofresponsibility | Shahadat Ahmed | |
dc.format.extent | 42 pages | |
dc.language.iso | en | en_US |
dc.publisher | BRAC University | en_US |
dc.rights | BRAC University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. | |
dc.subject | Federated learning | en_US |
dc.subject | Tri-layer architecture | en_US |
dc.subject | SNN | en_US |
dc.subject | Spiking neural networks | en_US |
dc.subject | Dynamic vision sensor | en_US |
dc.subject | DVS | en_US |
dc.subject | Neuromorphic datasets | en_US |
dc.subject | Knowledge distillation | en_US |
dc.subject | CNN | |
dc.subject | Convolutional neural network | |
dc.subject.lcsh | Neural networks (Computer science). | |
dc.subject.lcsh | Computer network architectures. | |
dc.subject.lcsh | Deep learning (Machine learning). | |
dc.title | TRI-FED-RKD: integrating forward-reverse distillation with SNN and CNN within federated learning using tri layer hierarchical aggregation based architecture | en_US |
dc.type | Thesis | en_US |
dc.contributor.department | Department of Computer Science and Engineering, BRAC University | |
dc.description.degree | B.Sc. in Computer Science | |