The Brain Dynamics of Rapid Perceptual Adaptation to Adverse Listening Conditions

Julia Erb, Molly J. Henry, Frank Eisner, Jonas Obleser*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Listeners show a remarkable ability to quickly adjust to degraded speech input. Here, we aimed to identify the neural mechanisms of such short-term perceptual adaptation. In a sparse-sampling, cardiac-gated functional magnetic resonance imaging (fMRI) acquisition, human listeners heard and repeated back 4-band-vocoded sentences (in which the temporal envelope of the acoustic signal is preserved, while spectral information is highly degraded). Clear-speech trials were included as baseline. An additional fMRI experiment on amplitude modulation rate discrimination quantified the convergence of neural mechanisms that subserve coping with challenging listening conditions for speech and non-speech. First, the degraded speech task revealed an "executive" network (comprising the anterior insula and anterior cingulate cortex), parts of which were also activated in the non-speech discrimination task. Second, trial-by-trial fluctuations in successful comprehension of degraded speech drove hemodynamic signal change in classic "language" areas (bilateral temporal cortices). Third, as listeners perceptually adapted to degraded speech, downregulation in a cortico-striato-thalamo-cortical circuit was observable. The present data highlight differential upregulation and downregulation in auditory-language and executive networks, respectively, with important subcortical contributions when successfully adapting to a challenging listening situation.
Original language: English
Pages (from-to): 10688-10697
Journal: Journal of Neuroscience
Volume: 33
Issue number: 26
Publication status: Published - 26 Jun 2013
Externally published: Yes
