Neural Correlates of Phonetic Adaptation as Induced by Lexical and Audiovisual Context

Shruti Ullas*, Lars Hausfeld, Anne Cutler, Frank Eisner, Elia Formisano

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

When speech perception is difficult, one way listeners adjust is by reconfiguring phoneme category boundaries, drawing on contextual information. Both lexical knowledge and lipreading cues are used in this way, but it remains unknown whether these two differing forms of perceptual learning are similar at a neural level. This study compared phoneme boundary adjustments driven by lexical or audiovisual cues, using ultra-high-field 7-T fMRI. During imaging, participants heard exposure stimuli and test stimuli. Exposure stimuli for lexical retuning were audio recordings of words, and those for audiovisual recalibration were audio-video recordings of lip movements during utterances of pseudowords. Test stimuli were ambiguous phonetic strings presented without context, and listeners reported what phoneme they heard. Reports reflected phoneme biases in preceding exposure blocks (e.g., more reported /p/ after /p/-biased exposure). Analysis of corresponding brain responses indicated that both forms of cue use were associated with a network of activity across the temporal cortex, plus parietal, insula, and motor areas. Audiovisual recalibration also elicited significant occipital cortex activity despite the lack of visual stimuli. Activity levels in several ROIs also covaried with strength of audiovisual recalibration, with greater activity accompanying larger recalibration shifts. Similar activation patterns appeared for lexical retuning, but here, no significant ROIs were identified. Audiovisual and lexical forms of perceptual learning thus induce largely similar brain response patterns. However, audiovisual recalibration involves additional visual cortex contributions, suggesting that previously acquired visual information (on lip movements) is retrieved and deployed to disambiguate auditory perception.

Original language: English
Pages (from-to): 2145-2158
Number of pages: 14
Journal: Journal of Cognitive Neuroscience
Volume: 32
Issue number: 11
Early online date: 14 Jul 2020
Publication status: Published - Nov 2020

Keywords

  • Activation
  • Auditory cortex
  • Brain
  • Hearing lips
  • Information
  • Organization
  • Perceptual adaptation
  • Recalibration
  • Representation
  • Speech perception
