A virtual reality implementation of the Attention Network Test-Revised

David Tekampe*, Anhela Sulaj*, Lars Hausfeld*, Michael Schwartze*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic

Abstract

The widespread availability of high-quality virtual reality (VR) headsets allows their integration into neuropsychological settings to develop and optimize cognitive assessment methods. This study introduces an adapted VR version of the Attention Network Test-Revised (ANT-R; Fan et al., 2009), a well-established computerized neuropsychological assessment tool that differentiates between operationally defined alerting, orienting, and executive aspects of attention, and tests its construct validity. Reaction time and accuracy performance of participants (N = 40) was compared between the computerized and VR versions (ANT-VR). Testing took place in a confined laboratory space, with the ANT-VR virtual environment resembling an everyday home setting delivered through a standalone headset (Meta Quest 2). Participants additionally completed the Quality of Experience Test (Brunnström et al., 2020). Results indicated good usability of the headset during testing. The main analyses revealed some notable differences, such as overall longer reaction times and a smaller orienting effect for the ANT-VR. However, a comparable result pattern and correlated network scores confirmed that the alerting, orienting, and executive control attention networks were adequately assessed and differentiated by the ANT-VR. The ANT-VR thus represents a viable mobile and ecologically more valid alternative that does not compromise experimental control in terms of accurate and reliable data collection.
Original language: English
Title of host publication: Proceedings of the 1st AUDICTIVE Conference
Subtitle of host publication: June 19-22, 2023, RWTH Aachen University, Aachen, Germany
Editors: Jamilla Balint, Janina Fels
Place of publication: Aachen
Publisher: RWTH Aachen University
Pages: 142-145
Number of pages: 4
Volume: 1
DOIs
Publication status: Published - 19 Jun 2023