Modulating Cortical Instrument Representations During Auditory Stream Segregation and Integration With Polyphonic Music

Lars Hausfeld*, Niels R. Disbergen, Giancarlo Valente, Robert J. Zatorre, Elia Formisano

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › Peer-reviewed


Numerous neuroimaging studies have demonstrated that the auditory cortex tracks ongoing speech and that, in multi-speaker environments, tracking of the attended speaker is enhanced compared with that of irrelevant speakers. In contrast to speech, multi-instrument music can be appreciated by attending not only to its individual entities (i.e., segregation) but also to multiple instruments simultaneously (i.e., integration). We investigated the neural correlates of these two modes of music listening using electroencephalography (EEG) and sound envelope tracking. To this end, we presented uniquely composed music pieces played by two instruments, a bassoon and a cello, in combination with a previously validated music auditory scene analysis behavioral paradigm (Disbergen et al., 2018). Similar to results obtained with selective listening tasks for speech, relevant instruments could be reconstructed better than irrelevant ones during the segregation task. A delay-specific analysis showed higher reconstruction accuracy for the relevant instrument during a middle-latency window for both the bassoon and cello, and during a late window for the bassoon. During the integration task, we did not observe significant attentional modulation when reconstructing the overall music envelope. Subsequent analyses indicated that this null result might be due to the heterogeneous strategies listeners employ during the integration task. Overall, our results suggest that, subsequent to a common processing stage, top-down modulations consistently enhance the relevant instrument's representation during an instrument segregation task, whereas no such enhancement is observed during an instrument integration task. These findings extend previous results from speech tracking to the tracking of multi-instrument music and, furthermore, inform current theories on polyphonic music perception.
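The envelope-tracking analysis described above is commonly implemented as a backward (decoding) model: a ridge regression maps time-lagged EEG channels onto the sound envelope, and reconstruction accuracy is the correlation between the reconstructed and true envelopes. The following is a minimal NumPy sketch of that general approach on toy data, not the authors' exact pipeline; the function names, the lag range, and the simulated signals are illustrative assumptions.

```python
import numpy as np


def lagged_matrix(eeg, lags):
    """Stack time-lagged copies of each EEG channel into a design matrix.

    eeg: array of shape (n_samples, n_channels); negative lags look ahead.
    """
    n, c = eeg.shape
    X = np.zeros((n, c * len(lags)))
    for i, lag in enumerate(lags):
        shifted = np.roll(eeg, lag, axis=0)
        if lag > 0:
            shifted[:lag] = 0.0   # zero samples wrapped from the end
        elif lag < 0:
            shifted[lag:] = 0.0   # zero samples wrapped from the start
        X[:, i * c:(i + 1) * c] = shifted
    return X


def train_decoder(eeg, envelope, lags, ridge=1.0):
    """Ridge regression from lagged EEG to the sound envelope."""
    X = lagged_matrix(eeg, lags)
    XtX = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ envelope)


def reconstruct(eeg, weights, lags):
    """Apply a trained decoder to (new) EEG data."""
    return lagged_matrix(eeg, lags) @ weights


# Toy demo: "EEG" simulated as delayed copies of an envelope plus noise.
rng = np.random.default_rng(0)
env = np.abs(rng.standard_normal(2000)).cumsum() % 5.0  # stand-in envelope
eeg = np.stack(
    [np.roll(env, d) + 0.5 * rng.standard_normal(2000) for d in (2, 5, 8)],
    axis=1,
)
lags = list(range(-11, 1))  # decoder looks up to 11 samples ahead
w = train_decoder(eeg, env, lags)
r = np.corrcoef(reconstruct(eeg, w, lags), env)[0, 1]
```

In the study's segregation condition, the comparison of interest would be between `r` computed against the attended instrument's envelope and `r` computed against the unattended one; in practice, decoders are trained and evaluated with cross-validation rather than on the same data as here.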

Original language: English
Article number: 635937
Number of pages: 15
Journal: Frontiers in Neuroscience
Publication status: Published - 24 Sept 2021


Keywords:
  • EEG
  • TIME
  • attention
  • auditory scene analysis
  • auditory stream segregation
  • envelope tracking
  • polyphonic music
