Beat gestures and syntactic parsing: an ERP study

Emmanuel Biau*, Lauren A. Fromont, Salvador Soto-Faraco

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

6 Citations (Web of Science)

Abstract

We tested the prosodic hypothesis that the temporal alignment of a speaker's beat gestures in a sentence influences syntactic parsing by driving the listener's attention. Participants chose between two possible interpretations of relative-clause (RC) ambiguous sentences, while their electroencephalogram (EEG) was recorded. We manipulated the alignment of the beat within sentences where auditory prosody was removed. Behavioral performance showed no effect of beat placement on the sentences' interpretation, while event-related potentials (ERPs) revealed a positive shift of the signal in the windows corresponding to N100 and P200 components. Additionally, post hoc analyses of the ERPs time locked to the RC revealed a modulation of the P600 component as a function of gesture. These results suggest that beats modulate early processing of affiliate words in continuous speech and potentially have a global impact at the level of sentence-parsing components. We speculate that beats must be synergistic with auditory prosody to be fully consequential in behavior.

Original language: English
Pages (from-to): 102-126
Number of pages: 25
Journal: Language Learning
Volume: 68
Issue number: S1
DOIs
Publication status: Published - Jun 2018

Keywords

  • audiovisual speech
  • gestures
  • prosody
  • syntactic parsing
  • ERPs
  • P600
  • event-related potentials
  • speech perception
  • selective attention
  • human brain
  • audiovisual integration
  • language comprehension
  • relative clauses
  • word recognition
  • auditory speech
  • visual prosody
