Crossmodal adaptation in right posterior superior temporal sulcus during face-voice emotional integration

R. Watson, M. Latinus, T. Noguchi, O. Garrod, F. Crabbe, P. Belin

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

The integration of emotional information from the face and voice of other persons is known to be mediated by a number of "multisensory" cerebral regions, such as the right posterior superior temporal sulcus (pSTS). However, whether multimodal integration in these regions is attributable to interleaved populations of unisensory neurons responding to face or voice, or rather to multimodal neurons receiving input from both modalities, is not fully clear. Here, we examine this question using functional magnetic resonance adaptation and dynamic audiovisual stimuli in which emotional information was manipulated parametrically and independently in the face and voice via morphing between angry and happy expressions. Healthy human adult subjects were scanned while performing a happy/angry emotion categorization task on a series of such stimuli included in a fast event-related, continuous carryover design. Subjects integrated both face and voice information when categorizing emotion, although there was a greater weighting of face information, and showed behavioral adaptation effects both within and across modality. Adaptation also occurred at the neural level: in addition to modality-specific adaptation in visual and auditory cortices, we observed for the first time a crossmodal adaptation effect. Specifically, fMRI signal in the right pSTS was reduced in response to a stimulus in which facial emotion was similar to the vocal emotion of the preceding stimulus. These results suggest that the integration of emotional information from face and voice in the pSTS involves a detectable proportion of bimodal neurons that combine inputs from visual and auditory cortices.
Original language: English
Pages (from-to): 6813-6821
Number of pages: 9
Journal: Journal of Neuroscience
Volume: 34
Issue number: 20
Publication status: Published - 14 May 2014

Keywords

  • emotion perception
  • functional magnetic resonance adaptation
  • multisensory integration
  • audiovisual integration
  • facial expressions
  • brain responses
  • time-course
  • modal face
  • perception
  • activation
  • prosody
  • signals
  • cortex
