Introducing the EmteqVR Interface for Affect Detection in Virtual Reality

Authors: Mavridou, I., Seiss, E., Kostoulas, T., Hamedi, M., Balaguer-Ballester, E. and Nduka, C.

Journal: 2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW 2019)

Pages: 83-84

Publisher: IEEE

ISBN: 978-1-7281-3891-6

DOI: 10.1109/ACIIW.2019.8925297

https://ieeexplore.ieee.org/xpl/conhome/8911254/proceeding

© 2019 IEEE. This paper introduces the wearable technology EmteqVR™, which can detect facial expressions and physiological responses from the user's facial area whilst the user is wearing a commercial Virtual Reality head-mounted display (HMD). The EmteqVR interface, an evolution of the earlier prototype called 'Faceteq', comprises nine biometric sensors, including f-EMG, PPG and IMU, enabling it to detect the affective state of the user in real time. This newly developed technology can revolutionize the way we collect data, design experiences and interact within Virtual, Mixed and Augmented Realities. In addition, this novel approach could assist in healthcare interventions and future experimental studies. Our team developed a Virtual Reality experience specifically designed to induce a range of emotional responses in users. We will demonstrate how the current interface and a custom expression detection algorithm are used in real time to provide feedback on the user's affective state within Virtual Reality.
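The paper itself contains no code, but the kind of pipeline described above (windowed facial-EMG and PPG signals feeding a real-time expression and affect estimate) can be made concrete with a short sketch. Everything in the example below is an assumption for illustration only: the sampling rate, channel names, feature set and decision rule are placeholders, not the EmteqVR algorithm or any Emteq SDK.

import numpy as np

FS = 1000  # assumed sensor sampling rate in Hz (placeholder)

def emg_rms(window):
    # Root-mean-square amplitude per f-EMG channel; window shape is (samples, channels).
    return np.sqrt(np.mean(np.square(window), axis=0))

def ppg_heart_rate(window):
    # Crude heart-rate estimate (beats per minute) from the spacing of PPG peaks.
    threshold = np.percentile(window, 90)
    is_peak = ((window[1:-1] > threshold)
               & (window[1:-1] > window[:-2])
               & (window[1:-1] > window[2:]))
    peaks = np.flatnonzero(is_peak)
    if len(peaks) < 2:
        return float("nan")
    return 60.0 / (np.mean(np.diff(peaks)) / FS)

def classify_expression(emg_window):
    # Toy rule: label the window by whichever (hypothetical) muscle channel is most active.
    channels = ["zygomaticus", "corrugator", "frontalis"]
    return channels[int(np.argmax(emg_rms(emg_window)))]

# Synthetic stand-ins for one second of f-EMG and three seconds of PPG data.
emg = np.random.randn(FS, 3) * np.array([2.0, 0.5, 0.5])  # exaggerated "smile" channel
ppg = np.sin(2 * np.pi * 1.2 * np.arange(3 * FS) / FS)    # ~72 bpm pulse wave
print(classify_expression(emg), ppg_heart_rate(ppg))

In a live system such as the one demonstrated, these stages would run continuously on streamed sensor frames; here they are applied to synthetic arrays purely to show the data flow from raw signals to a per-window label and heart-rate estimate.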
