TY - JOUR
T1 - Exploiting sensing devices availability in AR/VR deployments to foster engagement
AU - Vretos, Nicholas
AU - Daras, Petros
AU - Asteriadis, Stylianos
AU - Hortal, Enrique
AU - Ghaleb, Esam
AU - Spyrou, Evaggelos
AU - Leligou, Helen C.
AU - Karkazis, Panagiotis
AU - Trakadas, Panagiotis
AU - Assimakopoulos, Kostantinos
N1 - Funding Information:
The work presented in this document has been partially funded through H2020-MaTHiSiS Project. This project has received funding from the European Union’s Horizon 2020 Programme (H2020-ICT-2015) under Grant Agreement No. 687772.
Publisher Copyright:
© 2018, Springer-Verlag London Ltd., part of Springer Nature.
PY - 2019/12
Y1 - 2019/12
N2 - Currently, in all augmented reality (AR) and virtual reality (VR) educational experiences, the evolution of the experience (game, exercise or other) and the assessment of the user's performance are based on his/her (re)actions, which are continuously sensed. In this paper, we propose exploiting the sensors available in AR/VR systems to enhance current AR/VR experiences by taking into account the user's affect state, which changes in real time. Adapting the difficulty level of the experience to the user's affect state fosters engagement, which is a crucial issue in educational environments, and prevents boredom and anxiety. The user's cues are processed, enabling dynamic user profiling. Since diverse sensing devices exist in different AR/VR systems, the detection of the affect state based on different sensing inputs is investigated, and techniques that have undergone validation using state-of-the-art sensors are presented.
AB - Currently, in all augmented reality (AR) and virtual reality (VR) educational experiences, the evolution of the experience (game, exercise or other) and the assessment of the user's performance are based on his/her (re)actions, which are continuously sensed. In this paper, we propose exploiting the sensors available in AR/VR systems to enhance current AR/VR experiences by taking into account the user's affect state, which changes in real time. Adapting the difficulty level of the experience to the user's affect state fosters engagement, which is a crucial issue in educational environments, and prevents boredom and anxiety. The user's cues are processed, enabling dynamic user profiling. Since diverse sensing devices exist in different AR/VR systems, the detection of the affect state based on different sensing inputs is investigated, and techniques that have undergone validation using state-of-the-art sensors are presented.
KW - Affect state detection
KW - Engagement
KW - Interpretation of interaction
KW - Multimodal affect state detection
KW - EMOTION RECOGNITION
KW - FEATURES
KW - SPEECH
U2 - 10.1007/s10055-018-0357-0
DO - 10.1007/s10055-018-0357-0
M3 - Article
SN - 1359-4338
VL - 23
SP - 399
EP - 410
JO - Virtual Reality
JF - Virtual Reality
IS - 4
ER -