The PlayMancer database: A multimodal affect database in support of research and development activities in serious game environment
This data was imported from DBLP:
Authors: Kostoulas, T., Kocsis, O., Ganchev, T., Fernández-Aranda, F., Santamaría, J.J., Jiménez-Murcia, S., Moussa, M.B., Magnenat-Thalmann, N. and Fakotakis, N.
Editors: Calzolari, N., Choukri, K., Maegaard, B., Mariani, J., Odijk, J., Piperidis, S., Rosner, M. and Tapias, D.
Publisher: European Language Resources Association
This data was imported from Scopus:
Authors: Kostoulas, T., Kocsis, O., Ganchev, T., Fernández-Aranda, F., Santamaría, J.J., Jiménez-Murcia, S., Ben Moussa, M., Magnenat-Thalmann, N. and Fakotakis, N.
Journal: Proceedings of the 7th International Conference on Language Resources and Evaluation, LREC 2010
The present paper reports on a recent effort that resulted in the establishment of a unique multimodal affect database, referred to as the PlayMancer database. This database was created in support of the research and development activities taking place within the PlayMancer project, which aims at the development of a serious game environment for the treatment of patients with behavioural and addictive disorders, such as eating disorders and gambling addictions. Specifically, for the purpose of data collection, we designed and implemented a pilot trial with healthy test subjects. Speech, video and bio-signals (pulse rate, SpO2) were captured synchronously during the interaction of healthy people with a number of video games. The collected data were annotated by the test subjects themselves (self-annotation), targeting proper interpretation of the underlying affective states. The versatile design of the PlayMancer database allows its use for the needs of research on multimodal affect and emotion recognition and multimodal human-computer interaction in serious game environments.