Using facial gestures to drive narrative in VR

Authors: Fatoorechi, M., Archer, J., Nduka, C., Seiss, E., Balaguer-Ballester, E., Mavridou, I., Hamedi, M. and Cleal, A.

Journal: SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction

Pages: 152

DOI: 10.1145/3131277.3134366

Abstract:

We developed an exploratory VR environment in which spatial features and narratives can be manipulated in real time by the user's facial and head gestures. We use the Faceteq prototype, exhibited in 2017, as the interactive interface. Faceteq is a wearable technology that can be fitted to commercial HMDs to measure facial expressions and biometric responses. The Faceteq project was founded with the aim of providing an additional human-centred tool for affective human-computer interaction. The proposed demo will exhibit the hardware and its functionality in real time.

https://eprints.bournemouth.ac.uk/29887/

Source: Scopus

Using Facial Gestures to Drive Narrative in VR

Authors: Mavridou, I., Hamedi, M., Fatoorechi, M., Archer, J., Cleal, A., Balaguer-Ballester, E., Seiss, E. and Nduka, C.

Journal: SUI'17: PROCEEDINGS OF THE 2017 SYMPOSIUM ON SPATIAL USER INTERACTION

Pages: 152

DOI: 10.1145/3131277.3134366

https://eprints.bournemouth.ac.uk/29887/

Source: Web of Science (Lite)

Using Facial Gestures to Drive Narrative in VR

Authors: Mavridou, I., Hamedi, M., Fatoorechi, M., Archer, J., Cleal, A., Balaguer-Ballester, E., Seiss, E. and Nduka, C.

Conference: ACM Spatial User Interfaces (SUI'17)

Dates: 16-17 October 2017

Journal: SUI ’17, Brighton, United Kingdom, October 16–17, 2017

Pages: 152

Publisher: ACM

Place of Publication: ACM Digital Library

DOI: 10.1145/3131277.3134366

Abstract:

We developed an exploratory VR environment in which spatial features and narratives can be manipulated in real time by the user's facial and head gestures. We use the Faceteq prototype, exhibited in 2017, as the interactive interface. Faceteq is a wearable technology that can be fitted to commercial HMDs to measure facial expressions and biometric responses. The Faceteq project was founded with the aim of providing an additional human-centred tool for affective human-computer interaction. The proposed demo will exhibit the hardware and its functionality in real time.

https://eprints.bournemouth.ac.uk/29887/

https://dl.acm.org/citation.cfm?id=3131277

Source: Manual

Using Facial Gestures to Drive Narrative in VR

Authors: Mavridou, I., Hamedi, M., Fatoorechi, M., Archer, J., Cleal, A., Balaguer-Ballester, E., Seiss, E. and Nduka, C.

Conference: ACM Spatial User Interfaces (SUI'17)

Pages: 152

Publisher: ACM

Abstract:

We developed an exploratory VR environment in which spatial features and narratives can be manipulated in real time by the user's facial and head gestures. We use the Faceteq prototype, exhibited in 2017, as the interactive interface. Faceteq is a wearable technology that can be fitted to commercial HMDs to measure facial expressions and biometric responses. The Faceteq project was founded with the aim of providing an additional human-centred tool for affective human-computer interaction. The proposed demo will exhibit the hardware and its functionality in real time.

https://eprints.bournemouth.ac.uk/29887/

https://dl.acm.org/citation.cfm?id=3131277

Source: BURO EPrints