Facial Emotion-Based Stress Detection Using CNN and Haar Cascade Algorithms

Volume: 10 | Issue: 02 | Year: 2024
International Journal of Microwave Engineering and Technology
Received Date: 26 October 2024
Acceptance Date: 6 November 2024
Published On: 25 November 2024
First Page: 1
Last Page: 11

By: M. Aswini, Gude Tejaswani, Maddi Sahithi, Boddu Hema Harshitha, and Bellamkonda Mohana Vyshnavi

1. M. Aswini*, Student, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, Andhra Pradesh, India
2. Gude Tejaswani, Student, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, Andhra Pradesh, India
3. Maddi Sahithi, Student, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, Andhra Pradesh, India
4. Boddu Hema Harshitha, Student, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, Andhra Pradesh, India
5. Bellamkonda Mohana Vyshnavi, Assistant Professor, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, Andhra Pradesh, India

Abstract

Understanding human behavior requires the ability to recognize facial emotions, with applications ranging from human–computer interaction to psychological wellness monitoring. Facial expressions play an important role in conveying emotions, particularly stress, which is a common problem in today's fast-paced world. This research presents a novel approach to stress detection that analyzes facial expressions using convolutional neural networks (CNNs) and Haar Cascade classifiers: the Haar Cascade algorithm detects faces, and a CNN classifies the expressed emotion. The methodology begins with preprocessing the input images to improve their quality and normalize them for subsequent analysis. Haar Cascade classifiers then detect faces in the images, ensuring precise localization of facial regions even under varying lighting conditions and orientations. The detected faces are cropped and resized to produce uniform inputs, which are fed into the CNN model for emotion classification. By combining the strengths of both techniques, the proposed system improves the precision and efficiency of stress detection. The system has been trained and tested on publicly available datasets, with encouraging results in stress detection accuracy. Detecting stress through facial expressions has potential applications in stress management, mental health evaluation, and personalized therapies.
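The pipeline outlined in the abstract (preprocessing, Haar Cascade face detection, cropping and resizing, CNN-based emotion classification) can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes OpenCV's bundled frontal-face cascade, a hypothetical pre-trained Keras model file named emotion_cnn.h5 that accepts 48x48 grayscale crops (FER-2013-style), and an assumed mapping of certain emotion labels to a "stressed" flag.

import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Haar Cascade frontal-face detector shipped with OpenCV
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# Hypothetical pre-trained CNN and label set (assumptions, not from the paper)
emotion_model = load_model("emotion_cnn.h5")
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]
STRESS_LABELS = {"angry", "fear", "sad"}  # assumed emotion-to-stress mapping

def detect_stress(image_path):
    """Detect faces, classify each face's emotion, and flag stress-related emotions."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)  # simple preprocessing / contrast normalization

    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # uniform input size
        face = face.astype("float32") / 255.0                 # scale pixels to [0, 1]
        face = face.reshape(1, 48, 48, 1)                     # batch, height, width, channels
        probs = emotion_model.predict(face, verbose=0)[0]
        emotion = EMOTIONS[int(np.argmax(probs))]
        results.append((emotion, emotion in STRESS_LABELS))
    return results

A call such as detect_stress("subject.jpg") would return one (emotion, is_stressed) pair per detected face; the actual model architecture, training data, and stress criterion used in the study are described in the full text.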

Citation:

How to cite this article: M. Aswini, Gude Tejaswani, Maddi Sahithi, Boddu Hema Harshitha, and Bellamkonda Mohana Vyshnavi, Facial Emotion-Based Stress Detection Using CNN and Haar Cascade Algorithms. International Journal of Microwave Engineering and Technology. 2024; 10(02): 1-11p.

How to cite this URL: M. Aswini, Gude Tejaswani, Maddi Sahithi, Boddu Hema Harshitha, and Bellamkonda Mohana Vyshnavi, Facial Emotion-Based Stress Detection Using CNN and Haar Cascade Algorithms. International Journal of Microwave Engineering and Technology. 2024; 10(02): 1-11p. Available from: https://journalspub.com/publication/ijmet/article=13876
