Abstract
Which processes in the human brain lead to the categorical perception of speech sounds? Investigation of this question is hampered by the fact that categorical speech perception is normally confounded by acoustic differences in the stimulus. By using ambiguous sounds, however, it is possible to dissociate acoustic from perceptual stimulus representations. Twenty-seven normally hearing individuals took part in an fMRI study in which they were presented with an ambiguous syllable (intermediate between /da/ and /ga/) in one ear and with a disambiguating acoustic feature (third formant, F3) in the other ear. Multi-voxel pattern searchlight analysis was used to identify brain areas that consistently differentiated between response patterns associated with different syllable reports. By comparing responses to different stimuli with identical syllable reports and identical stimuli with different syllable reports, we disambiguated whether these regions primarily differentiated the acoustics of the stimuli or the syllable report. We found that BOLD activity patterns in left perisylvian regions (STG, SMG), left inferior frontal regions (vMC, IFG, AI), left supplementary motor cortex (SMA/pre-SMA), and right motor and somatosensory regions (M1/S1) represent listeners' syllable report irrespective of stimulus acoustics. Most of these regions lie outside what are traditionally regarded as auditory or phonological processing areas. Our results indicate that the process of speech sound categorization implicates decision-making mechanisms and auditory-motor transformations.
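The searchlight decoding step described in the abstract could be sketched roughly as follows using nilearn's `SearchLight` with a leave-one-run-out cross-validation. This is a minimal illustrative sketch, not the authors' pipeline: the file names, label arrays, sphere radius, and classifier are assumptions for demonstration only.

```python
# Illustrative searchlight MVPA sketch (hypothetical inputs; not the study's actual code).
import numpy as np
from nilearn.decoding import SearchLight
from nilearn.image import load_img
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import LinearSVC

# Hypothetical single-subject inputs: trial-wise beta images plus per-trial labels.
betas = load_img("sub-01_trialwise_betas.nii.gz")    # 4D image, one volume per trial
reports = np.load("sub-01_syllable_reports.npy")     # per-trial syllable report, e.g. "da" vs "ga"
runs = np.load("sub-01_run_labels.npy")              # run index per trial, for leave-one-run-out CV

searchlight = SearchLight(
    mask_img="sub-01_brain_mask.nii.gz",  # restrict spheres to brain voxels (hypothetical path)
    radius=6.0,                           # searchlight sphere radius in mm (assumed value)
    estimator=LinearSVC(),                # linear classifier fit within each sphere
    cv=LeaveOneGroupOut(),                # cross-validate across scanning runs
    n_jobs=-1,
)
searchlight.fit(betas, reports, groups=runs)

# scores_ is a 3D array of mean cross-validated accuracies, one value per sphere centre.
# Group-level inference over such maps (with the stimulus/report comparisons described
# above) would then identify regions whose patterns track the syllable report.
accuracy_map = searchlight.scores_
```

Note that distinguishing report coding from acoustic coding, as in the study, additionally requires contrasting decoding of identical stimuli with different reports against different stimuli with identical reports; the sketch shows only the generic searchlight decoding machinery.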
| Original language | English |
| --- | --- |
| Article number | 119375 |
| Number of pages | 11 |
| Journal | Neuroimage |
| Volume | 258 |
| Early online date | 11 Jun 2022 |
| DOIs | |
| Publication status | Published - Sept 2022 |
Keywords
- AUDIOVISUAL SPEECH
- Auditory
- CONNECTIONS
- CORTEX
- DISCRIMINATION
- Dichotic listening
- FEATURES
- HEARING LIPS
- MVPA
- NEURAL REPRESENTATIONS
- PERCEPTION
- PRE-SMA
- SUPPLEMENTARY MOTOR AREA
- Speech perception
- fMRI