Dissociating task difficulty from incongruence in face-voice emotion integration

R. Watson*, M. Latinus, T. Noguchi, O. Garrod, F. Crabbe, P. Belin

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

In the everyday environment, affective information is conveyed by both the face and the voice. Studies have demonstrated that a concurrently presented voice can alter the way an emotional facial expression is perceived, and vice versa, leading to emotional conflict when the information in the two modalities is mismatched. Additionally, evidence suggests that incongruence of emotional valence activates cerebral networks involved in conflict monitoring and resolution. However, it is currently unclear whether this is due to task difficulty (incongruent stimuli being harder to categorize) or simply to the detection of mismatching information in the two modalities. The aim of the present fMRI study was to examine the neurophysiological correlates of processing incongruent emotional information, independent of task difficulty. Subjects were scanned while judging the emotion of face-voice affective stimuli. Both the face and the voice were parametrically morphed between anger and happiness and then paired in all audiovisual combinations, resulting in stimuli each defined by two separate values: the degree of incongruence between the face and voice, and the degree of clarity of the combined face-voice information. Because of the specific morphing procedure used, we hypothesized that the clarity value, rather than the incongruence value, would better reflect task difficulty. Behavioral data revealed that participants integrated face and voice affective information, and that the clarity value, as opposed to the incongruence value, correlated with categorization difficulty. Cerebrally, incongruence was most associated with activity in the superior temporal region, which emerged after task difficulty had been accounted for. Overall, our results suggest that activation in the superior temporal region in response to incongruent information cannot be explained simply by task difficulty, and may instead reflect the detection of mismatching information between the two modalities.
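To make the stimulus parameterization concrete, the sketch below illustrates one plausible way the two values could be derived from the face and voice morph levels. This is a minimal illustration, not the paper's actual procedure: it assumes morph levels run from 0 (full anger) to 1 (full happiness), that incongruence is the absolute face-voice difference, and that clarity is the distance of the combined (mean) signal from the ambiguous midpoint; the morph sampling and function names are hypothetical.

```python
# Hypothetical sketch of the two stimulus parameters described in the abstract.
# Assumptions (not taken from the paper): morph levels run from 0.0 (full anger)
# to 1.0 (full happiness); "incongruence" is the absolute face-voice difference;
# "clarity" is how far the combined (mean) signal lies from the ambiguous midpoint.
from itertools import product


def incongruence(face_morph: float, voice_morph: float) -> float:
    """Degree of mismatch between the face and voice morph levels."""
    return abs(face_morph - voice_morph)


def clarity(face_morph: float, voice_morph: float) -> float:
    """Distance of the combined face-voice signal from maximal ambiguity (0.5)."""
    return abs((face_morph + voice_morph) / 2.0 - 0.5)


# Pair every face morph level with every voice morph level, as in the design.
morph_levels = [0.0, 0.25, 0.5, 0.75, 1.0]  # hypothetical sampling of the continuum
for face, voice in product(morph_levels, morph_levels):
    print(f"face={face:.2f} voice={voice:.2f} "
          f"incongruence={incongruence(face, voice):.2f} "
          f"clarity={clarity(face, voice):.2f}")
```

Under these assumptions, a fully angry face paired with a fully happy voice yields maximal incongruence but minimal clarity, whereas two matching unambiguous signals yield zero incongruence and maximal clarity, which is why the two values can be dissociated in the analysis.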
Original language: English
Article number: 744
Number of pages: 13
Journal: Frontiers in Human Neuroscience
Volume: 7
DOIs
Publication status: Published - 13 Nov 2013

Keywords

  • ANTERIOR CINGULATE CORTEX
  • AUDIOVISUAL INTEGRATION
  • BRAIN ACTIVITY
  • CARRY-OVER
  • CONFLICT
  • CROSSMODAL BINDING
  • EVENT-RELATED FMRI
  • MULTISENSORY INTEGRATION
  • SELECTIVE ATTENTION
  • SPEECH-PERCEPTION
  • affective conflict
  • emotion perception
  • functional magnetic resonance adaptation
  • incongruence
  • multisensory integration
