Comparative analysis and implementation of AI algorithms and NN model in process scheduling algorithm
Date: 2022-09-28
Publisher: Brac University
Authors: Niloy, Maharshi; Moni, Md. Moynul Asik; Khan, Farah Jasmin; Chowdhury, Aquibul Haq; Juboraj, Md. Fahmid-Ul-Alam

Abstract
Process scheduling is an integral part of operating systems. The most widely used
scheduling algorithm in operating systems is round-robin (RR), but the average
waiting time in RR is often quite long. The purpose of this study is to propose
a new algorithm to minimize waiting time and process starvation by determining
the optimal time quantum by predicting CPU burst time. For burst time prediction, we use the machine learning algorithms decision tree (DT), k-nearest neighbors (KNN), and linear regression (LR), as well as the multi-layer perceptron (MLP) neural network model. The obtained burst time prediction accuracies are 98.64% for DT, 17.1% for KNN, 97.96% for LR, and 26.01% for MLP. Moreover,
for 10,000 processes with predicted burst times under the same configuration, the average turnaround time (avg TT), average wait time (avg WT), and number of context switches (CS) of the proposed algorithm are, respectively, 40331930.48,
40312117.96 and 20002, whereas Traditional Round Robin (RR) has 87194390.98
(avg TT), 87174578.46 (avg WT) and 28964 (CS). Self-Adjustment Round Robin
(SARR) has 72398064.70 (avg TT), 72378252.18 (avg WT) and 39956 (CS). Modified Round Robin Algorithm (MRRA) has 84924105.36 (avg TT), 84904292.84 (avg WT) and 5208 (CS), and Optimized Round Robin (ORR) has 78508779.73 (avg TT), 78488967.20 (avg WT) and 22470 (CS). Therefore, the proposed algorithm is almost two times faster than the other algorithms in terms of process scheduling under a heavy load of processes.
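The comparison above can be reproduced in outline with a small simulation. The sketch below is illustrative only, not the authors' implementation: it simulates plain round-robin for processes that all arrive at time 0, and `adaptive_quantum` is a hypothetical helper that derives a quantum from predicted burst times (here, their median, one plausible rule; the abstract does not specify the actual one). Counting every dispatch as a context switch is also a simplification.

```python
from collections import deque

def round_robin(bursts, quantum):
    """Simulate round-robin scheduling (all processes arrive at t = 0).
    Returns (avg turnaround time, avg waiting time, context switches)."""
    n = len(bursts)
    remaining = list(bursts)
    ready = deque(range(n))   # ready queue of process indices
    t = 0                     # simulated clock
    switches = 0              # every dispatch counted as a switch (simplification)
    finish = [0] * n          # completion time of each process
    while ready:
        i = ready.popleft()
        run = min(quantum, remaining[i])
        t += run
        remaining[i] -= run
        switches += 1
        if remaining[i] > 0:
            ready.append(i)   # not finished: back to the tail of the queue
        else:
            finish[i] = t
    avg_tt = sum(finish) / n                                 # arrival = 0, so TT = finish time
    avg_wt = sum(f - b for f, b in zip(finish, bursts)) / n  # WT = TT - burst
    return avg_tt, avg_wt, switches

def adaptive_quantum(predicted_bursts):
    # Hypothetical rule: use the median predicted burst as the quantum.
    s = sorted(predicted_bursts)
    return s[len(s) // 2]

predicted = [24, 3, 3]  # toy predicted burst times
print(round_robin(predicted, 4))                            # fixed quantum
print(round_robin(predicted, adaptive_quantum(predicted)))  # quantum from predictions
```

With accurate burst predictions the quantum can be sized so that short processes complete within a single slice, which is the intuition behind reducing waiting time and starvation relative to a fixed-quantum round-robin.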