Sad people are more accurate at expression identification with a smaller own-ethnicity bias than happy people

This source preferred by Peter Arabaci Hills

Authors: Hills, P.J. and Hill, D.M.

http://eprints.bournemouth.ac.uk/29493/

Journal: Q J Exp Psychol (Hove)

Pages: 1-30

eISSN: 1747-0226

DOI: 10.1080/17470218.2017.1350869

Sad individuals perform more accurately at face identity recognition (Hills, Werno, & Lewis, 2011), possibly because they scan more of the face during encoding. During expression identification tasks, sad individuals do not fixate on the eyes as much as happier individuals do (Wu, Pu, Allen, & Pauli, 2012). Fixating on features other than the eyes leads to a reduced own-ethnicity bias (Hills & Lewis, 2006). This background indicates that sad individuals would not view the eyes as much as happy individuals, and that this would result in improved expression recognition and a reduced own-ethnicity bias. This prediction was tested using an expression identification task with eye tracking. We demonstrate that sad-induced participants show enhanced expression recognition and a reduced own-ethnicity bias compared with happy-induced participants, owing to their scanning more facial features. We conclude that mood affects eye movements and face encoding by causing a wider sampling strategy and deeper encoding of facial features diagnostic for expression identification.

This data was imported from Scopus:

Authors: Hills, P.J. and Hill, D.M.

http://eprints.bournemouth.ac.uk/29493/

Journal: Quarterly Journal of Experimental Psychology

Volume: 71

Issue: 8

Pages: 1797-1806

eISSN: 1747-0226

ISSN: 1747-0218

DOI: 10.1080/17470218.2017.1350869

© Experimental Psychology Society 2017.

