Accurate analysis of mood detection using eye-images rather than facial expression recognition (FER)
Abstract
There are several works on mood detection using machine learning from physical and neuro-physical data, along with works on emotion recognition using eye tracking. We aim to show that a person’s mood can be detected from their eye images alone, since the mood is reflected in one’s eyes. The goal is to establish a connection between an individual’s mood and their eye images. The machine learning algorithm we use is a Convolutional Neural Network (CNN), because it requires no external feature extraction: the network learns to extract features by itself. In this paper, we developed two CNN models and used FER-2013 as our base dataset, from which we took only the eye images for each of the six emotions (happy, fear, sad, angry, neutral and surprise) to create our own dataset. We trained and tested our models on both the FER-2013 dataset and our own dataset and compared the results. On the FER-2013 dataset, model 1 achieved a final training accuracy of 83.78% with a validation accuracy of 65.35%. When trained and tested on our own dataset, model 1 achieved a final training accuracy of 69.19% with a validation accuracy of 72.08%, while model 2 achieved a final training accuracy of 66.55% with a validation accuracy of 72.36%. The lower accuracy on our dataset stems from the limited number of training and testing images; the accuracy can be improved with a larger, better dataset for training our models.