Decoding sounds depicting hand-object interactions in primary somatosensory cortex

Kerri M. Bailey, Bruno L. Giordano, Amanda L. Kaas, Fraser W. Smith*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Neurons, even in the earliest sensory regions of cortex, are subject to a great deal of contextual influence from both within- and across-modality connections. Recent work has shown that primary sensory areas can respond to, and in some cases discriminate, stimuli that are not of their target modality: for example, primary somatosensory cortex (SI) discriminates visual images of graspable objects. In the present work, we investigated whether SI would discriminate sounds depicting hand-object interactions (e.g. bouncing a ball). In a rapid event-related functional magnetic resonance imaging experiment, participants listened attentively to sounds from three categories: hand-object interactions, and control categories of pure tones and animal vocalizations, while performing a one-back repetition detection task. Multivoxel pattern analysis revealed significant decoding of hand-object interaction sounds within SI, but not for either control category. Crucially, in the hand-sensitive voxels defined from an independent tactile localizer, decoding accuracies were significantly higher for hand-object interactions compared to pure tones in left SI. Our findings indicate that simply hearing sounds depicting familiar hand-object interactions elicits different patterns of activity in SI, despite the complete absence of tactile stimulation. These results highlight the rich contextual information that can be transmitted across sensory modalities even to primary sensory areas.
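For readers unfamiliar with multivoxel pattern analysis (MVPA), the sketch below illustrates how region-of-interest decoding of this kind is commonly set up: single-trial response patterns from an SI mask are classified with a linear support vector machine under leave-one-run-out cross-validation. This is a minimal illustration only; all variable names, array shapes, and parameters are hypothetical assumptions and not taken from the paper's actual analysis pipeline.

    # Minimal MVPA sketch: decode sound category from SI voxel patterns.
    # Assumes trial-wise beta estimates have already been extracted for an SI mask.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(0)

    # Hypothetical data: 6 runs x 24 trials x 500 SI voxels, 3 balanced sound categories.
    n_runs, trials_per_run, n_voxels = 6, 24, 500
    X = rng.standard_normal((n_runs * trials_per_run, n_voxels))   # trial-wise activity patterns
    y = np.tile(np.repeat([0, 1, 2], trials_per_run // 3), n_runs) # 0 = hand-object, 1 = tone, 2 = vocalization
    runs = np.repeat(np.arange(n_runs), trials_per_run)            # run labels for cross-validation grouping

    # Leave-one-run-out cross-validated linear SVM, a common choice in fMRI decoding.
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
    print(f"Mean decoding accuracy: {scores.mean():.3f} (chance = 1/3)")

In practice, above-chance accuracy in such a scheme is then tested against a permutation or group-level null distribution before being interpreted as evidence that the region carries category information.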

Original language: English
Pages (from-to): 3621–3635
Number of pages: 15
Journal: Cerebral Cortex
Volume: 33
Issue number: 7
Early online date: 31 Aug 2022
DOIs
Publication status: Published - Apr 2023