Modality-specific brain representations during automatic processing of face, voice and body expressions

Maarten Vaessen, Kiki van der Heijden, Beatrice de Gelder*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

A central question in affective science, and one that is relevant for its clinical applications, is how emotions conveyed by different stimuli are experienced and represented in the brain. Following the traditional view, emotional signals are recognized with the help of emotion concepts that are typically used in descriptions of mental states and emotional experiences, irrespective of the sensory modality. This perspective motivated the search for abstract representations of emotions in the brain, shared across variations in stimulus type (face, body, voice) and sensory origin (visual, auditory). On the other hand, emotion signals, such as an aggressive gesture, trigger rapid automatic behavioral responses, and this may take place before, or independently of, a full abstract representation of the emotion. This pleads in favor of specific emotion signals that may trigger rapid adaptive behavior solely by mobilizing modality- and stimulus-specific brain representations, without relying on higher-order abstract emotion categories. To test this hypothesis, we presented participants with naturalistic dynamic emotion expressions of the face, the whole body, or the voice in a functional magnetic resonance imaging (fMRI) study. To focus on automatic emotion processing and sidestep explicit concept-based emotion recognition, participants performed an unrelated target detection task presented in a different sensory modality than the stimulus. By using multivariate analyses to assess neural activity patterns in response to the different stimulus types, we reveal a stimulus-category- and modality-specific brain organization of affective signals. Our findings are consistent with the notion that under ecological conditions, emotion expressions of the face, body, and voice may have different functional roles in triggering rapid adaptive behavior, even if, when viewed from an abstract conceptual vantage point, they may all exemplify the same emotion.
This has implications for a neuroethologically grounded emotion research program that should start from detailed behavioral observations of how face, body, and voice expressions function in naturalistic contexts.
Original language: English
Article number: 1132088
Number of pages: 13
Journal: Frontiers in Neuroscience
Volume: 17
DOIs
Publication status: Published - 6 Oct 2023

Keywords

  • multisensory affect
  • faces
  • voices
  • bodies
  • emotion perception
  • facial expressions
  • emotional expressions
  • pattern analysis
  • perception
  • integration
  • responses
  • cortex
  • scenes
