Real time application to identify the facial emotions using Convolutional Neural Network (CNN) and OpenCV

Authors

  • Sahabdeen Aysha Asra, Department of Information Technology, Faculty of Humanities and Social Sciences, University of Ruhuna, Sri Lanka.
  • K.H. Ramanayaka, Faculty of Humanities and Social Sciences, University of Ruhuna, Sri Lanka.

DOI:

https://doi.org/10.64383/irjss.202501127

Keywords:

Convolutional Neural Network (CNN), Deep Learning Model, Face Emotion Recognition, Python

Abstract

This study proposes real-time human emotion identification using Convolutional Neural Network (CNN)-based digital image processing. The work applies learning algorithms for face detection and accurate identification that can efficiently and reliably recognize emotions from a user's facial expressions. Large datasets are explored and analyzed to train a model that recognizes facial reactions. A small experiment is conducted on women and men of various ages, ethnicities, and skin tones to capture their expressions, and variation across different faces is examined. Existing studies have achieved reasonable accuracy but often lack real-time capability and struggle with variations in lighting and facial orientation. This study bridges that gap by implementing a real-time CNN–OpenCV-based model. A deep learning CNN is implemented with OpenCV, TensorFlow, Keras, Pandas, and NumPy. The work addresses three areas: face localization, face recognition, and emotion classification. Computer vision operations using a camera are carried out with programs written in Python. To demonstrate sustained real-time performance, an extended evaluation is conducted to capture internal reactions and physiological variations for each face. The test results demonstrate the effectiveness of the facial analysis framework: automated face detection and identification are performed in real time with high accuracy. The proposed model achieved an overall accuracy of 89% on the FER2013 dataset, demonstrating strong performance in real-time emotion classification. This technique can be applied in many fields, including schools, colleges, universities, security, and banking.
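The following is a minimal sketch of the real-time pipeline summarized above, assuming a pre-trained Keras CNN saved as fer2013_cnn.h5 and OpenCV's bundled Haar cascade for face localization; the file name, class ordering, and detection parameters are illustrative assumptions rather than the authors' released code.

# Minimal sketch: OpenCV face detection + a Keras CNN classifying 48x48 grayscale
# crops (the FER2013 input format). Model file and parameters are hypothetical.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Standard FER2013 label order (assumed to match the hypothetical model).
EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

model = load_model("fer2013_cnn.h5")  # hypothetical pre-trained CNN
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Face localization: detect faces in the grayscale frame.
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        # Emotion classification: the CNN expects a (1, 48, 48, 1) tensor.
        probs = model.predict(roi.reshape(1, 48, 48, 1), verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("Facial emotion recognition", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()

In this sketch the Haar cascade handles face placement, while the CNN classifies each 48x48 grayscale crop into one of the seven FER2013 emotion categories.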

Author Biographies

Sahabdeen Aysha Asra, Department of Information Technology, Faculty of Humanities and Social Sciences, University of Ruhuna, Sri Lanka.

Ms. Sahabdeen Aysha Asra is a lecturer in the Department of Information Technology, Faculty of Humanities and Social Sciences, University of Ruhuna. Her academic interests include Information Technology education, research, and student-centered learning.

K.H. Ramanayaka, Faculty of Humanities and Social Sciences, University of Ruhuna, Sri Lanka.

Dr. Kokila Ramanayaka is a senior academic at the Faculty of Humanities and Social Sciences, University of Ruhuna. He actively contributes to academic development, research, and student engagement within the faculty.

References

[1] P. E., S. Kamal, S. C. C., and S. M.H., “Facial Emotion Recognition Using Deep Convolutional Neural Network,” in Proceedings of 2021 2nd International Conference on Intelligent Engineering and Management, ICIEM 2021, 2021, pp. 486–490, doi: 10.1109/ICIEM51511.2021.9445346.

[2] A. Jaiswal, A. K. Raju, and S. Deb, “Facial Emotion Detection Using Deep Learning,” in 2020 International Conference for Emerging Technology (INCET), 2022, vol. 783, pp. 1417–1427, doi: 10.1007/978-981-16-3690-5_136.

[3] K. M. Pathak, S. Yadav, P. Jain, P. Tanwar, and B. Kumar, “A Facial Expression Recognition System to Predict Emotions,” in Proceedings of International Conference on Intelligent Engineering and Management, ICIEM 2020, 2020, pp. 414–419, doi: 10.1109/ICIEM48762.2020.9160229.

[4] R. Matin and D. Valles, “A Speech Emotion Recognition Solution-based on Support Vector Machine for Children with Autism Spectrum Disorder to Help Identify Human Emotions,” 2020, doi: 10.1109/IETC47856.2020.9249147.

[5] Z. Jinnuo, S. B. Goyal, M. Tesfayohanis, and Y. Omar, “Implementation of Artificial Intelligence Image Emotion Detection Mechanism Based on Python Architecture for Industry 4.0,” Journal of Nanomaterials, vol. 2022, 2022, doi: 10.1155/2022/5293248.

[6] A. Martínez, L. M. Belmonte, A. S. García, A. Fernández-Caballero, and R. Morales, “Facial emotion recognition from an unmanned flying social robot for home care of dependent people,” Electronics (Switzerland), vol. 10, no. 7, pp. 1–22, 2021, doi: 10.3390/electronics10070868.

[7] W. Graterol, J. Diaz-Amado, Y. Cardinale, I. Dongo, E. Lopes-Silva, and C. Santos-Libarino, “Emotion detection for social robots based on nlp transformers and an emotion ontology,” Sensors (Switzerland), vol. 21, no. 4, pp. 1–19, 2021, doi: 10.3390/s21041322.

[8] M. Krommyda, A. Rigos, K. Bouklas, and A. Amditis, “Emotion detection in Twitter posts: A rule-based algorithm for annotated data acquisition,” in Proceedings - 2020 International Conference on Computational Science and Computational Intelligence, CSCI 2020, 2020, pp. 257–262, doi: 10.1109/CSCI51800.2020.00050.

[9] J. Sini, A. C. Marceddu, and M. Violante, “Automatic emotion recognition for the calibration of autonomous driving functions,” Electronics (Switzerland), vol. 9, no. 3, 2020, doi: 10.3390/electronics9030518.

[10] L. Schoneveld, A. Othmani, and H. Abdelkawy, “Leveraging recent advances in deep learning for audio-Visual emotion recognition,” Pattern Recognition Letters, vol. 146, pp. 1–7, 2021, doi: 10.1016/j.patrec.2021.03.007.

[11] A. H. Mary, Z. B. Kadhim, and Z. S. Sharqi, “Face Recognition and Emotion Recognition from Facial Expression Using Deep Learning Neural Network,” in IOP Conference Series: Materials Science and Engineering, 2020, vol. 928, no. 3, doi: 10.1088/1757-899X/928/3/032061.

[12] A. Keshri, A. Singh, B. Kumar, D. Pratap, and A. Chauhan, “Automatic Detection and Classification of Human Emotion in Real-Time Scenario,” Journal of ISMAC, vol. 4, no. 1, pp. 41–53, 2022, doi: 10.36548/jismac.2022.1.005.

[13] R. Puri, A. Gupta, M. Sikri, M. Tiwari, N. Pathak, and S. Goel, “Emotion Detection using Image Processing in Python,” in 5th International Conference on “Computing for Sustainable Global Development”, 2020, pp. 1389–1394, [Online]. Available: https://arxiv.org/abs/2012.00659v1.

[14] C. Y. Park et al., “K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations,” Scientific Data, vol. 7, no. 1, pp. 1–16, 2020, doi: 10.1038/s41597-020-00630-y.

[15] P. Jiang, B. Wan, Q. Wang, and J. Wu, “Fast and efficient facial expression recognition using a gabor convolutional network,” IEEE Signal Processing Letters, vol. 27, pp. 1954–1958, 2020, doi: 10.1109/LSP.2020.3031504.

[16] A. John, M. C. Abhishek, A. S. Ajayan, S. Sanoop, and V. R. Kumar, “Real-time facial emotion recognition system with improved preprocessing and feature extraction,” in Proceedings of the 3rd International Conference on Smart Systems and Inventive Technology, ICSSIT 2020, 2020, pp. 1328–1333, doi: 10.1109/ICSSIT48917.2020.9214207.

[17] D. Indira, L. Sumalatha, and B. R. Markapudi, “Multi Facial Expression Recognition (MFER) for Identifying Customer Satisfaction on Products using Deep CNN and Haar Cascade Classifier,” in IOP Conference Series: Materials Science and Engineering, 2021, vol. 1074, no. 1, p. 012033, doi: 10.1088/1757-899x/1074/1/012033.

[18] A. Islam Chowdhury, M. Munem Shahriar, A. Islam, E. Ahmed, A. Karim, and M. Rezwanul Islam, “An automated system in ATM booth using face encoding and emotion recognition process,” in ACM International Conference Proceeding Series, 2020, pp. 57–62, doi: 10.1145/3421558.3421567.

[19] R. Zatarain Cabada, H. Rodriguez Rangel, M. L. Barron Estrada, and H. M. Cardenas Lopez, “Hyperparameter optimization in CNN for learning-centered emotion recognition for intelligent tutoring systems,” Soft Computing, vol. 24, no. 10, pp. 7593–7602, 2020, doi: 10.1007/s00500-019-04387-4.

[20] N. Mehendale, “Facial emotion recognition using convolutional neural networks (FERC),” SN Applied Sciences, vol. 2, no. 3, pp. 1–8, 2020, doi: 10.1007/s42452-020-2234-1.

[21] S. A. Hussain and A. Salim Abdallah Al Balushi, “A real time face emotion classification and recognition using deep learning model,” in Journal of Physics: Conference Series, 2020, vol. 1432, no. 1, doi: 10.1088/1742-6596/1432/1/012087.

[22] J. H. Cheong, T. Xie, S. Byrne, and L. J. Chang, “Py-Feat: Python Facial Expression Analysis Toolbox,” 2021, [Online]. Available: http://arxiv.org/abs/2104.03509.

[23] M. F. Ali, M. Khatun, and N. A. Turzo, “Facial Emotion Detection using Neural Network,” International Journal of Advanced Computer Science and Applications, vol. 13, no. 11, pp. 168–173, 2022, doi: 10.14569/IJACSA.2022.0131118.

[24] H. Feng and J. Shao, “Facial Expression Recognition Based on Local Features of Transfer Learning,” in Proceedings of 2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference, ITNEC 2020, 2020, pp. 71–76, doi: 10.1109/ITNEC48623.2020.9084794.

[25] V. Franzoni, G. Biondi, D. Perri, and O. Gervasi, “Enhancing mouth-based emotion recognition using transfer learning,” Sensors (Switzerland), vol. 20, no. 18, pp. 1–15, 2020, doi: 10.3390/s20185222.

[26] A. I. Siam, N. F. Soliman, A. D. Algarni, F. E. Abd El-Samie, and A. Sedik, “Deploying Machine Learning Techniques for Human Emotion Detection,” Computational Intelligence and Neuroscience, vol. 2022, 2022, doi: 10.1155/2022/8032673.

[27] V. Doma and M. Pirouz, “A comparative analysis of machine learning methods for emotion recognition using EEG and peripheral physiological signals,” Journal of Big Data, vol. 7, no. 1, 2020, doi: 10.1186/s40537-020-00289-7.

[28] A. Chaudhari, C. Bhatt, A. Krishna, and P. L. Mazzeo, “ViTFER: Facial Emotion Recognition with Vision Transformers,” Applied System Innovation, vol. 5, no. 4, 2022, doi: 10.3390/asi5040080.

[29] K. Mohan, A. Seal, O. Krejcar, and A. Yazidi, “FER-net: facial expression recognition using deep neural net,” Neural Computing and Applications, vol. 33, no. 15, pp. 9125–9136, 2021, doi: 10.1007/s00521-020-05676-y.

[30] J. H. Kim, R. Mutegeki, A. Poulose, and D. S. Han, “A Study of a Data Standardization and Cleaning Technique for a Facial Emotion Recognition System,” … of the Korean …, pp. 2–4, 2020, [Online]. Available: https://journal-home.s3.ap-northeast-2.amazonaws.com/site/2020kics/presentation/0109.pdf.

[31] S. Modi and M. H. Bohara, “Facial emotion recognition using convolution neural network,” in Proceedings - 5th International Conference on Intelligent Computing and Control Systems, ICICCS 2021, 2021, pp. 1339–1344, doi: 10.1109/ICICCS51141.2021.9432156.

[32] Y. Chen and J. He, “Deep Learning-Based Emotion Detection,” Journal of Computer and Communications, vol. 10, no. 02, pp. 57–71, 2022, doi: 10.4236/jcc.2022.102005.

[33] G. Rafael, H. Kusuma, and Tasripan, “The Utilization of Cloud Computing for Facial Expression Recognition using Amazon Web Services,” in CENIM 2020 - Proceeding: International Conference on Computer Engineering, Network, and Intelligent Multimedia 2020, 2020, pp. 366–370, doi: 10.1109/CENIM51130.2020.9297974.

[34] V. Sreenivas, V. Namdeo, and E. V. Kumar, “Group based emotion recognition from video sequence with hybrid optimization based recurrent fuzzy neural network,” Journal of Big Data, vol. 7, no. 1, 2020, doi: 10.1186/s40537-020-00326-5.

Published

2025-11-28

How to Cite

Asra, S. A., & Ramanayaka, K. H. (2025). Real time application to identify the facial emotions using Convolutional Neural Network (CNN) and OpenCV. International Research Journal of Scientific Studies, 2(9), 1–16. https://doi.org/10.64383/irjss.202501127