Towards valence detection from EMG for Virtual Reality applications
Authors: Mavridou, I., Seiss, E., Hamedi, M., Balaguer-Ballester, E. and Nduka, C.
Publisher: ICDVRAT, University of Reading
ISBN: 978-0-7049-1548-0
Abstract: The current practical constraints on facial expression recognition in Virtual Reality (VR) led to the development of a novel wearable interface called Faceteq. Our team designed a pilot feasibility study to explore the effect of spontaneous facial expressions on eight EMG sensors incorporated into the Faceteq interface. Thirty-four participants took part in the study, watching a sequence of video stimuli while self-rating their emotional state. After specifically designed signal pre-processing, we aimed to classify the responses into three classes (negative, neutral, positive). A C-SVM classifier was cross-validated for each participant, reaching an out-of-sample average accuracy of 82.5%. These preliminary results have encouraged us to enlarge our dataset and to incorporate data from different physiological signals to achieve automatic detection of combined arousal and valence states for VR applications.
https://eprints.bournemouth.ac.uk/31022/
Source: Manual; BURO EPrints
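
The abstract describes a C-SVM classifier cross-validated separately for each participant over three valence classes. Below is a minimal sketch of that evaluation scheme, assuming scikit-learn's SVC (a C-SVM implementation); the feature matrices, trial counts, kernel, and C value are illustrative stand-ins, since the paper's EMG pre-processing and feature extraction are not reproduced here.

```python
# Sketch: per-participant cross-validation of a C-SVM on three valence
# classes (negative, neutral, positive), as in the abstract. Features
# are random stand-ins, not the paper's EMG-derived features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_participants = 34            # as in the study
n_trials, n_features = 60, 8   # hypothetical trials per participant; 8 EMG channels

per_participant_acc = []
for _ in range(n_participants):
    X = rng.normal(size=(n_trials, n_features))   # stand-in EMG features
    y = rng.integers(0, 3, size=n_trials)         # 0=negative, 1=neutral, 2=positive
    # C-SVM with feature standardization; kernel and C are illustrative defaults
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)    # out-of-sample accuracy per fold
    per_participant_acc.append(scores.mean())

print(f"average cross-validated accuracy: {np.mean(per_participant_acc):.3f}")
```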