An Augmented-Reality fNIRS-Based Brain-Computer Interface: A Proof-of-Concept Study

Amaia Benitez-Andonegui*, Rodion Burden, Richard Benning, Rico Möckel, Michael Lührs, Bettina Sorger

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

6 Citations (Web of Science)

Abstract

Augmented reality (AR) enhances the user's environment by projecting virtual objects into the real world in real time. Brain-computer interfaces (BCIs) are systems that enable users to control external devices with their brain signals. BCIs can exploit AR technology to interact with the physical and virtual world and to explore new ways of displaying feedback. This is important for users to perceive and regulate their brain activity or shape their communication intentions while operating in the physical world. In this study, twelve healthy participants were introduced to and asked to choose between two motor-imagery tasks: mental drawing and interacting with a virtual cube. Participants first performed a functional localizer run, which was used to select a single fNIRS channel for decoding their intentions in eight subsequent choice-encoding runs. In each run, participants were asked to select one choice from a six-item list. A rotating AR cube was displayed on a computer screen as the main stimulus, where each face of the cube was presented for 6 s and represented one choice of the six-item list. For five consecutive trials, participants were instructed to perform the motor-imagery task when the face of the cube that represented their choice was facing them (thereby temporally encoding the selected choice). At the end of each run, participants were provided with the decoded choice based on a joint analysis of all five trials. If the decoded choice was incorrect, participants applied an active error-correction procedure. The choice list provided in each run was based on the decoded choice of the previous run. The experimental design allowed participants to navigate twice through a virtual menu consisting of four levels if all choices were correctly decoded. Here we demonstrate for the first time that by using AR feedback and flexible choice encoding in the form of search trees, we can increase the degrees of freedom of a BCI system. We also show that participants can successfully navigate through a nested menu and achieve a mean accuracy of 74% using a single motor-imagery task and a single fNIRS channel.
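The temporal choice-encoding scheme described in the abstract can be illustrated with a minimal sketch. This is a hypothetical simulation, not the authors' actual analysis pipeline: it assumes the signal from the single selected fNIRS channel is summarized as one activation value per cube face per trial, and that the decoder simply averages those values across the five trials and picks the face with the largest mean response.

```python
import numpy as np

# Hypothetical illustration of temporal choice decoding (assumed setup, not
# the published pipeline): one fNIRS channel, 5 trials, 6 cube faces.
# activation[t, f] = summary measure of the hemodynamic response while
# face f was shown during trial t.
rng = np.random.default_rng(0)
n_trials, n_faces = 5, 6
intended = 3  # the face the participant chose (hypothetical)

# Baseline noise plus an assumed motor-imagery-related signal increase
# on the intended face.
activation = rng.normal(0.0, 1.0, size=(n_trials, n_faces))
activation[:, intended] += 3.0

# Joint analysis of all five trials: average the response per face and
# decode the face with the maximal mean activation.
decoded = int(np.mean(activation, axis=0).argmax())
print("decoded face:", decoded)
```

With this averaging scheme, a single imagery task and a single channel suffice to select one of six options, which is what allows the nested four-level menu to be traversed run by run.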

Original language: English
Article number: 346
Number of pages: 21
Journal: Frontiers in Neuroscience
Volume: 14
DOIs
Publication status: Published - 28 Apr 2020

Keywords

  • hemodynamic brain-computer interface
  • augmented reality
  • motor imagery
  • real-time analysis
  • temporal information encoding
  • user-centered approach
  • infrared spectroscopy signals
  • motor imagery
  • actuated wheelchair
  • spelling device
  • classification
  • interference
  • performance
  • neurofeedback
  • activation
  • questions
