The multisensory perception of co-speech gestures - A review and meta-analysis of neuroimaging studies

Authors: Marstaller, L. and Burianová, H.

Journal: Journal of Neurolinguistics

Volume: 30

Issue: 1

Pages: 69-77

eISSN: 1873-8052

ISSN: 0911-6044

DOI: 10.1016/j.jneuroling.2014.04.003

Abstract:

Co-speech gestures constitute a unique form of multimodal communication because the hand movements are temporally synchronized and semantically integrated with speech. Recent neuroimaging studies indicate that the perception of co-speech gestures might engage a core set of frontal, temporal, and parietal areas. However, no study has compared the neural processes during the perception of different types of co-speech gestures, such as beat, deictic, iconic, and metaphoric co-speech gestures. The purpose of this study was to review the existing literature on the neural correlates of co-speech gesture perception and to test whether different types of co-speech gestures elicit a common pattern of brain activity in the listener. To this end, we conducted a meta-analysis of neuroimaging studies that used different types of co-speech gestures to investigate the perception of multimodal stimuli (co-speech gestures) in contrast to unimodal stimuli (speech or gestures alone). The results show that co-speech gesture perception consistently engages temporal regions related to auditory and movement perception as well as fronto-parietal regions associated with action understanding. These findings suggest that brain regions involved in multisensory processing and action understanding constitute the general core of co-speech gesture perception.

Source: Scopus

Source: Web of Science (Lite)