Natural interaction multimodal analysis: Expressivity analysis towards adaptive and personalized interfaces

Stylianos Asteriadis*, George Caridakis, Lori Malatesta, Kostas Karpouzis

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic

Abstract

Intelligent personalized systems often ignore the affective aspect of human behavior and focus mainly on tactile cues of user activity. Complete user modelling, however, should also incorporate cues such as facial expressions, speech prosody, and gesture or body posture expressivity features in order to dynamically profile the user, fusing all available modalities, since these qualitative affective cues carry significant information about the user's non-verbal behavior and communication. In this direction, this work focuses on the automatic extraction of gestural and head expressivity features and the related statistical processing. The prospect of adopting a common formalization of expressivity features across multiple visual, emotional modalities is explored and grounded through an overview of experiments on appropriate corpora and the corresponding analysis. © 2012 IEEE.
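The record does not specify how the expressivity features are computed; as a rough illustration only, the sketch below computes two features that commonly appear in gesture expressivity work (overall activation as accumulated frame-to-frame displacement, spatial extent as the bounding-box diagonal) from a tracked hand trajectory. The trajectory format and feature definitions here are assumptions, not the authors' exact formulation.

```python
# Minimal sketch (not the authors' implementation) of two common
# gesture expressivity features, computed from a tracked hand
# trajectory given as a list of (x, y) positions, one per frame.
import math

def overall_activation(trajectory):
    """Total amount of motion: sum of frame-to-frame displacements."""
    return sum(math.dist(p, q) for p, q in zip(trajectory, trajectory[1:]))

def spatial_extent(trajectory):
    """Expansiveness of the gesture: diagonal of its bounding box."""
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    return math.hypot(max(xs) - min(xs), max(ys) - min(ys))

# Example: a short upward-right hand sweep.
points = [(0.0, 0.0), (0.1, 0.2), (0.3, 0.5), (0.6, 0.9)]
print(overall_activation(points))  # total path length travelled
print(spatial_extent(points))      # bounding-box diagonal
```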
Original language: English
Title of host publication: Proceedings - 2012 7th International Workshop on Semantic and Social Media Adaptation and Personalization, SMAP 2012
Pages: 131-136
Number of pages: 6
Publication status: Published - 2012
Externally published: Yes

Publication series

Series: Proceedings - 2012 7th International Workshop on Semantic and Social Media Adaptation and Personalization, SMAP 2012

Keywords

  • Activity recognition
  • Emotion estimation
  • Expressivity features
  • Gesture recognition
