By: M. Aswini, Gude Tejaswani, Maddi Sahithi, Boddu Hema Harshitha, and Bellamkonda Mohana Vyshnavi
1. M. Aswini, Student, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, India
2. Gude Tejaswani, Student, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, India
3. Maddi Sahithi, Student, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, India
4. Boddu Hema Harshitha, Student, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, India
5. Bellamkonda Mohana Vyshnavi, Assistant Professor, Department of Computer Science Engineering, Gayatri Vidya Parishad College of Engineering for Women, Visakhapatnam, India
Understanding human conduct requires the ability to recognize facial emotions, which has applications ranging from human-computer interaction to psychological wellness monitoring. This research presents a new approach to stress detection using Convolutional Neural Networks (CNNs) and Haar Cascade classifiers. The proposed method uses the Haar Cascade algorithm for face detection and a CNN to recognize facial expressions. The methodology begins with preprocessing the input images, followed by face detection and extraction of the facial regions. These regions are then fed into the CNN model, which classifies emotions. The system has been trained and tested on publicly available datasets, with encouraging results in stress detection accuracy. This method of detecting stress through facial expressions has potential uses in stress management, mental health evaluation, and personalized therapies.
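As an illustration of this pipeline, the following is a minimal sketch assuming OpenCV's bundled frontal-face Haar cascade and a small Keras CNN; the 48x48 grayscale input size, the layer configuration, and the emotion label set are illustrative assumptions rather than the authors' exact setup.

```python
# Minimal sketch: Haar Cascade face detection followed by CNN emotion classification.
# Model architecture, input size, and labels are illustrative assumptions.
import cv2
import numpy as np
from tensorflow.keras import layers, models

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]  # assumed label set

def build_cnn(num_classes=len(EMOTIONS)):
    """Small illustrative CNN for 48x48 grayscale face crops."""
    model = models.Sequential([
        layers.Input(shape=(48, 48, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
    return model

def detect_and_classify(image_path, model):
    """Detect faces with a Haar cascade, crop and normalize them, classify each with the CNN."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0  # preprocess face region
        probs = model.predict(crop.reshape(1, 48, 48, 1), verbose=0)[0]
        results.append(EMOTIONS[int(np.argmax(probs))])
    return results
```

In practice the CNN would first be trained on a labeled facial-expression dataset (for example, face crops produced by the same detection step) before `detect_and_classify` is applied to new images.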
Keywords: Convolutional Neural Networks, Haar Cascade classifier, Emotion transmission, Face recognition, Signal Processing, Stress detection, Real-time facial expression.