Investigating human audio-visual object perception with a combination of hypothesis-generating and hypothesis-testing fMRI analysis tools
Research output: Contribution to journal › Article › Academic › peer-review
Primate multisensory object perception involves distributed brain regions. To investigate the network character of these regions of the human brain, we applied data-driven group spatial independent component analysis (ICA) to a functional magnetic resonance imaging (fMRI) data set acquired during a passive audio-visual (AV) experiment with common object stimuli. We labeled three group-level independent component (IC) maps as auditory (A), visual (V), and AV, based on their spatial layouts and activation time courses. The overlap between these IC maps served as the definition of a distributed network of multisensory candidate regions, including superior temporal, ventral occipito-temporal, posterior parietal, and prefrontal regions. In an independent second fMRI experiment, we explicitly tested their involvement in AV integration. Activations in nine of these twelve regions met the max-criterion (A < AV > V) for multisensory integration. Comparison of this approach with a general linear model-based region-of-interest definition revealed its complementary value for multisensory neuroimaging. In conclusion, we estimated uni- and multisensory functional connectivity networks from one dataset and validated their functional roles in an independent dataset. These findings demonstrate the particular value of ICA for multisensory neuroimaging research and of using independent datasets to test hypotheses generated from a data-driven analysis.
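The two analysis steps described above, spatial ICA to obtain component maps and the max-criterion test for multisensory integration, can be sketched in Python. This is a minimal illustration on synthetic data, not the study's pipeline: the matrix sizes, source count, and response values are invented for demonstration, and scikit-learn's FastICA stands in for the group ICA tool actually used.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Toy "fMRI" data: 200 time points x 500 voxels, built from 3 hypothetical
# spatial sources (all dimensions are illustrative, not from the study).
rng = np.random.default_rng(0)
n_time, n_vox, n_src = 200, 500, 3
sources = rng.laplace(size=(n_src, n_vox))   # sparse spatial maps
mixing = rng.normal(size=(n_time, n_src))    # associated time courses
data = mixing @ sources + 0.1 * rng.normal(size=(n_time, n_vox))

# Spatial ICA: treat voxels as samples, so the recovered components
# are spatial maps and the mixing matrix holds their time courses.
ica = FastICA(n_components=n_src, random_state=0)
spatial_maps = ica.fit_transform(data.T).T   # shape (n_src, n_vox)
time_courses = ica.mixing_                   # shape (n_time, n_src)

# Max-criterion for a candidate region: the AV response must exceed
# both unisensory responses (A < AV > V). Betas here are made up.
beta_A, beta_AV, beta_V = 0.8, 1.5, 1.1
is_multisensory = beta_AV > beta_A and beta_AV > beta_V
print(is_multisensory)
```

In a real analysis the betas would come from a general linear model fit to the second, independent experiment, so that region definition and hypothesis testing never use the same data.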
- Crossmodal, Functional connectivity, Functional magnetic resonance imaging, Independent component analysis, Multisensory, Object perception, Superior temporal sulcus, Multisensory integration, Sensory cortices, Human brain, Visual information, Blind separation, Speech, Cortex