Eye-Tracking as an Indicator of Depressive Emotional States in Virtual Reality: An Explainable Artificial Intelligence Approach
DOI: https://doi.org/10.5281/zenodo.14188690

Keywords: Emotion Recognition, Eye-Tracking, Explainable Artificial Intelligence, SHAP, Interpretability

Abstract
This study explores the relationship between eye-tracking metrics and emotional states, with a particular focus on depressive emotions, including sadness, depression, and boredom. Leveraging the VR Eyes: Emotions Dataset (VREED), a publicly available dataset that captures eye-tracking data from participants immersed in 360-degree virtual environments, we examined key eye-movement features such as saccades, micro-saccades, fixations, and blinks. Applying the Circumplex Model of Affect (CMA), we categorized emotional states along the dimensions of arousal and valence, enabling a nuanced analysis of affective responses. An Extra Trees Classifier served as our primary machine learning model for predicting emotional states from eye-tracking metrics, and we applied Explainable Artificial Intelligence (XAI) techniques, chiefly SHAP, to interpret the model's decisions; a minimal sketch of this pipeline appears after the abstract. The SHAP values reveal the individual contribution of each feature and highlight micro-saccades and fixations as the strongest predictors of depressive states. Our findings suggest that eye-tracking metrics may serve as objective indicators of emotional experiences, underscoring the potential of integrating eye-tracking data and machine learning within virtual environments to advance emotional assessment in mental health contexts.
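Below is a minimal, illustrative Python sketch of the pipeline described in the abstract: mapping valence/arousal ratings to CMA quadrants, training an Extra Trees Classifier on eye-movement features, and attributing predictions with SHAP. It assumes scikit-learn and the shap package; the feature names, synthetic data, and quadrant thresholds are our own assumptions for illustration, not values from the paper.

```python
# Illustrative sketch only: synthetic data and assumed feature names.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400

# Stand-in for VREED-style per-trial features and self-reported affect.
df = pd.DataFrame({
    "fixation_count":    rng.poisson(60, n),
    "fixation_duration": rng.normal(250, 40, n),   # mean fixation length, ms
    "saccade_amplitude": rng.normal(4.0, 1.0, n),  # degrees of visual angle
    "microsaccade_rate": rng.normal(1.2, 0.3, n),  # per second
    "blink_rate":        rng.normal(0.3, 0.1, n),  # per second
    "valence":           rng.uniform(-1, 1, n),    # self-report
    "arousal":           rng.uniform(-1, 1, n),    # self-report
})

# Circumplex Model of Affect: the signs of (valence, arousal) pick a quadrant.
# Low-arousal / negative-valence (LA_NV) covers depressive states such as
# sadness and boredom.
def cma_quadrant(valence, arousal):
    v = "PV" if valence >= 0 else "NV"
    a = "HA" if arousal >= 0 else "LA"
    return f"{a}_{v}"

df["label"] = [cma_quadrant(v, a) for v, a in zip(df["valence"], df["arousal"])]

features = ["fixation_count", "fixation_duration", "saccade_amplitude",
            "microsaccade_rate", "blink_rate"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["label"], test_size=0.25, random_state=0,
    stratify=df["label"])

# Extra Trees Classifier as the predictive model.
model = ExtraTreesClassifier(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))

# SHAP attributes each prediction to the input features, exposing which
# eye-movement metrics drive the depressive (LA_NV) quadrant.
explainer = shap.TreeExplainer(model)
sv = explainer.shap_values(X_test)
cls = list(model.classes_).index("LA_NV")
# Older shap versions return a list of per-class arrays; newer versions
# return one (samples, features, classes) array. Handle both.
sv_cls = sv[cls] if isinstance(sv, list) else sv[:, :, cls]
for name, v in sorted(zip(features, np.abs(sv_cls).mean(axis=0)),
                      key=lambda t: -t[1]):
    print(f"{name:18s} mean |SHAP| = {v:.4f}")
```

On real data, the ranking of mean |SHAP| magnitudes is what supports claims such as micro-saccades and fixations being the dominant predictors of depressive states; here the ranking is meaningless because the features are random.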
References
Rodriguez, L.F.; Ramos, F. “Development of computational models of emotions for autonomous agents: A review”. Cogn. Comput. 6, 351–375, 2014.
Xu, H.; Plataniotis, K.N. “Affect recognition using EEG signal”. In Proceedings of the 2012 IEEE 14th International Workshop on Multimedia Signal Processing (MMSP), Banff, AB, Canada, 17–19 September 2012; IEEE: Piscataway, NJ, USA, pp. 299–304, 2012.
Hermanis, A.; Cacurs, R.; Nesenbergs, K.; Greitans, M.; Syundyukov, E.; Selavo, L. “Wearable Sensor System for Human Biomechanics Monitoring”. In Proceedings of the EWSN, pp. 247–248, 2016.
Chen, L.L.; Zhao, Y.; Ye, P.F.; Zhang, J.; Zou, J.Z. “Detecting driving stress in physiological signals based on multimodal feature analysis and kernel classifiers”. Expert Syst. Appl. 85, 279–291, 2017.
Krithika, L.; Venkatesh, K.; Rathore, S.; Kumar, M.H. “Facial recognition in education system”. In Proceedings of the IOP Conference Series: Materials Science and Engineering, IOP Publishing: Bristol, UK, Volume 263, p. 042021, 2017.
Yadava, M.; Kumar, P.; Saini, R.; Roy, P.P.; Prosad Dogra, D. “Analysis of EEG signals and its application to neuromarketing”. Multimed. Tools Appl., 76, 19087–19111, 2017.
Armstrong, T.; Olatunji, B.O. “Eye tracking of attention in the affective disorders: A meta-analytic review and synthesis”. Clinical Psychology Review, 32(8), 704–723, 2012.
Gao, M.; Xin, R.; Wang, Q.; Gao, D.; Wang, J.; Yu, Y. “Abnormal eye movement features in patients with depression: Preliminary findings based on eye tracking technology”. General Hospital Psychiatry, 84, 25–30, 2023.
Zhang, D.; Liu, X.; Xu, L.; Li, Y.; Xu, Y.; Xia, M.; ...; Wang, J. “Effective differentiation between depressed patients and controls using discriminative eye movement features”. Journal of Affective Disorders, 307, 237–243, 2022.
Li, Y.; Xu, Y.; Xia, M.; Zhang, T.; Wang, J.; Liu, X.; et al. “Eye movement indices in the study of depressive disorder”. Shanghai Archives of Psychiatry, 28, 326, 2016.
Ahmed, I.; Jeon, G.; Piccialli, F. “From artificial intelligence to explainable artificial intelligence in Industry 4.0: A survey on what, how, and where”. IEEE Trans. Ind. Inform. 18, 5031–5042, 2022.
Lundberg, S.M.; Lee, S.I. “A unified approach to interpreting model predictions”. Adv. Neural Inf. Process. Syst. 30, 1–10, 2017.
Ribeiro, M.T.; Singh, S.; Guestrin, C. “‘Why should I trust you?’: Explaining the predictions of any classifier”. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 1135–1144, 2016.
Arya, R.; Singh, J.; Kumar, A. “A survey of multidisciplinary domains contributing to affective computing”. Comput. Sci. Rev., 40, 100399, 2021.
Sacharin, V.; Schlegel, K.; Scherer, K.R. “Geneva Emotion Wheel Rating Study”; NCCR Affective Sciences; Center for Person Kommunikation, Aalborg University: Aalborg, Denmark, 2012.
Russell, J.A. “A circumplex model of affect”. J. Personal. Soc. Psychol., 39, 1161, 1980.
Alexandros, L.; Michalis, X. “The physiological measurements as a critical indicator in users’ experience evaluation”. In Proceedings of the 17th Panhellenic Conference on Informatics, pp. 258–263, 2013.
Somarathna, R.; Bednarz, T.; Mohammadi, G. “Virtual reality for emotion elicitation—A review”. IEEE Trans. Affect. Comput. 14, 2626–2645, 2022.
Tabbaa, L.; Searle, R.; Bafti, S.M.; Hossain, M.M.; Intarasisrisawat, J.; Glancy, M.; Ang, C.S. “VREED: Virtual reality emotion recognition dataset using eye tracking & physiological measures”. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 5, 1–20, 2021.
Chawla, N.V.; Bowyer, K.W.; Hall, L.O.; Kegelmeyer, W.P. “SMOTE: Synthetic minority over-sampling technique”. J. Artif. Intell. Res. 16, 321–357, 2002.