Recognizing induced emotions of movie audiences: Are induced and perceived emotions the same?

Authors: Tian, L., Muszynski, M., Lai, C., Moore, J., Kostoulas, T., Lombardo, P., Pun, T. and Chanel, G.

http://eprints.bournemouth.ac.uk/29836/

Start date: 23 October 2017

This data was imported from DBLP:

Authors: Tian, L., Muszynski, M., Lai, C., Moore, J.D., Kostoulas, T., Lombardo, P., Pun, T. and Chanel, G.

http://eprints.bournemouth.ac.uk/29836/

http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=8263545

Journal: ACII

Pages: 28-35

Publisher: IEEE Computer Society

ISBN: 978-1-5386-0563-9

DOI: 10.1109/ACII.2017.8273575

This data was imported from Scopus:

Authors: Tian, L., Muszynski, M., Lai, C., Moore, J.D., Kostoulas, T., Lombardo, P., Pun, T. and Chanel, G.

http://eprints.bournemouth.ac.uk/29836/

Journal: 2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017

Volume: 2018-January

Pages: 28-35

ISBN: 9781538605639

DOI: 10.1109/ACII.2017.8273575

© 2017 IEEE. Predicting the emotional response of movie audiences to affective movie content is a challenging task in affective computing. Previous work has focused on using audiovisual movie content to predict movie induced emotions. However, the relationship between the audience's perceptions of the affective movie content (perceived emotions) and the emotions evoked in the audience (induced emotions) remains unexplored. In this work, we address the relationship between perceived and induced emotions in movies, and identify features and modelling approaches effective for predicting movie induced emotions. First, we extend the LIRIS-ACCEDE database by annotating perceived emotions in a crowd-sourced manner, and find that perceived and induced emotions are not always consistent. Second, we show that dialogue events and aesthetic highlights are effective predictors of movie induced emotions. In addition to movie based features, we also study physiological and behavioural measurements of audiences. Our experiments show that induced emotion recognition can benefit from including temporal context and from including multimodal information. Our study bridges the gap between affective content analysis and induced emotion prediction.
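The abstract does not describe the implementation, but as an illustration of the idea that induced emotion recognition can benefit from temporal context and multimodal information, the sketch below shows one plausible modelling setup: a small PyTorch LSTM over per-segment movie-content features fused with audience physiological features. All names, feature dimensions, and the early-fusion design are assumptions for illustration, not the authors' actual method.

# Hypothetical sketch: multimodal, temporal induced-emotion regression.
# Feature names and dimensions are illustrative, not the paper's setup.
import torch
import torch.nn as nn

class InducedEmotionLSTM(nn.Module):
    """Predicts per-segment emotion (e.g. valence/arousal) from fused features."""
    def __init__(self, movie_dim=64, physio_dim=32, hidden_dim=128, out_dim=2):
        super().__init__()
        # Early fusion: concatenate movie-content and audience-physiology features.
        self.lstm = nn.LSTM(movie_dim + physio_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, out_dim)  # e.g. valence and arousal

    def forward(self, movie_feats, physio_feats):
        # movie_feats: (batch, time, movie_dim); physio_feats: (batch, time, physio_dim)
        fused = torch.cat([movie_feats, physio_feats], dim=-1)
        seq_out, _ = self.lstm(fused)   # temporal context across movie segments
        return self.head(seq_out)       # per-segment emotion estimates

# Toy usage with random tensors standing in for extracted features.
model = InducedEmotionLSTM()
movie = torch.randn(4, 20, 64)    # 4 movies, 20 segments, 64 content features each
physio = torch.randn(4, 20, 32)   # matching audience physiology features
pred = model(movie, physio)       # shape: (4, 20, 2)

The choice of early fusion and an LSTM here is only one of several options; late fusion or segment-independent classifiers would also fit the general description in the abstract.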

This data was imported from Web of Science (Lite):

Authors: Tian, L., Muszynski, M., Lai, C., Moore, J.D., Kostoulas, T., Lombardo, P., Pun, T. and Chanel, G.

http://eprints.bournemouth.ac.uk/29836/

Journal: 2017 SEVENTH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII)

Pages: 28-35

ISSN: 2156-8103
