March 30th: Yifei He

The neural basis of gesture-speech integration and interaction during online comprehension

Yifei He, Translational Neuroimaging Lab, Department of Psychiatry and Psychotherapy, Philipps University Marburg

Tuesday, March 30, 2021, 11:00-12:00 BST
Zoom Details: [Please Request]

Human daily communication is inherently multimodal: besides auditory speech, visual input such as hand gestures plays an important role. Despite advances in the neuroscientific investigation of language processing, we know relatively little about the neural basis of how gesture integrates and interacts with speech during online comprehension. In this talk, I will present evidence from EEG, fMRI, and simultaneous EEG-fMRI, showing the brain dynamics by which the two input channels are integrated into coherent semantic representations. I will also present studies on how gesture affects the semantic processing of language (auditory speech and visual sentences): (i) EEG evidence from a controlled experiment shows that social aspects of gesture (body orientation) may directly modulate the N400 amplitude during sentence processing; (ii) fMRI studies employing naturalistic paradigms suggest that gesture facilitates the neural processing of natural speech; more specifically, this facilitation may manifest as reduced brain activation related to semantic prediction, but not prediction error, at the single-word level.