Current work focuses on detecting emotional cues in human behavior and incorporating them into affect-aware Natural Interaction. Techniques for extracting emotional cues from visual non-verbal human behavior are presented: qualitative gesture expressivity features are derived from hand movement, while head pose and eye gaze are estimated from facial movement. The extracted emotional cues are employed in expressive synthesis on virtual agents, driven by the analysis of actions performed by human users, both in a Human-Virtual Agent Interaction setting and in Assistive Technologies aiming to infer in real time the degree of attention or frustration of children with reading difficulties. © 2011 IEEE.
|Title of host publication||17th DSP 2011 International Conference on Digital Signal Processing, Proceedings|
|Publication status||Published - 2011|
- Affective computing
- Eye gaze
- Gesture expressivity
- Natural Interaction