Lower beta: a central coordinator of temporal prediction in multimodal speech

Emmanuel Biau*, Sonja A. Kotz

*Corresponding author for this work

Research output: Contribution to journal › (Systematic) Review article › peer-review

Abstract

How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics. However, how speech takes advantage of redundancy between different sensory modalities, and how this translates into specific oscillatory patterns, remains unclear. We address the role of lower beta activity (~20 Hz), generally associated with motor functions, as an amodal central coordinator that receives bottom-up delta-theta copies from specific sensory areas and generates top-down temporal predictions for auditory entrainment. Dissociating temporal prediction from entrainment may explain how and why visual input benefits speech processing rather than adding cognitive load in multimodal speech perception. On the one hand, body movements convey prosodic and syllabic features at delta and theta rates (i.e., 1-3 Hz and 4-7 Hz). On the other hand, the natural precedence of visual input before auditory onsets may prepare the brain to anticipate and facilitate the integration of auditory delta-theta copies of the prosodic-syllabic structure. Here, we identify three fundamental criteria, based on recent evidence and hypotheses, which support the notion that lower motor beta frequency may play a central and generic role in temporal prediction during speech perception. First, beta activity must respond to rhythmic stimulation across modalities. Second, beta power must respond to biological motion and speech-related movements conveying temporal information in multimodal speech processing. Third, temporal prediction may recruit a communication loop between motor and primary auditory cortices (PACs) via delta-to-beta cross-frequency coupling. We discuss evidence related to each criterion and extend these concepts to a beta-motivated framework of multimodal speech processing.

Original language: English
Article number: 434
Journal: Frontiers in Human Neuroscience
Volume: 12
DOIs
Publication status: Published - 24 Oct 2018

Keywords

  • temporal predictions
  • beta oscillations
  • multimodal speech perception
  • prosody
  • biological motion
  • human auditory cortex
  • audiovisual speech
  • neuronal oscillations
  • band oscillations
  • visual speech
  • cortical oscillations
  • neural oscillations
  • premotor cortex
  • hand gestures
