See, Hear, or Feel - to Speak: A Versatile Multiple-Choice Functional Near-Infrared Spectroscopy-Brain-Computer Interface Feasible With Visual, Auditory, or Tactile Instructions

Laurien Nagels-Coune*, Lars Riecke, Amaia Benitez-Andonegui, Simona Klinkhammer, Rainer Goebel, Peter De Weerd, Michael Lührs, Bettina Sorger

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Severely motor-disabled patients, such as those suffering from the so-called "locked-in" syndrome, cannot communicate naturally. They may benefit from brain-computer interfaces (BCIs) that exploit brain signals for communication and thereby circumvent the muscular system. One BCI technique that has gained attention recently is functional near-infrared spectroscopy (fNIRS). Typically, fNIRS-based BCIs allow for brain-based communication via voluntary modulation of brain activity through mental-task performance guided by visual or auditory instructions. While the development of fNIRS-BCIs has made great progress, the reliability of fNIRS-BCIs across time and environments has rarely been assessed. In the present fNIRS-BCI study, we tested six healthy participants across three consecutive days using a straightforward four-choice fNIRS-BCI communication paradigm that allows answer encoding based on instructions in various sensory modalities. To encode an answer, participants performed a motor-imagery task (mental drawing) in one of four time periods. Answer encoding was guided by either the visual, auditory, or tactile sensory modality. Two participants were tested outside the laboratory, in a cafeteria. Answers were decoded from the time course of the most informative fNIRS channel-by-chromophore combination. Across the three testing days, we obtained mean single- and multi-trial (joint analysis of four consecutive trials) accuracies of 62.5% and 85.19%, respectively. Multi-trial accuracies were 86.11% for visual, 80.56% for auditory, and 88.89% for tactile sensory encoding. The two participants who used the fNIRS-BCI in a cafeteria obtained the best single-trial (72.22% and 77.78%) and multi-trial accuracies (100% and 94.44%). Communication was reliable over the three recording sessions, with multi-trial accuracies of 86.11% on day 1, 86.11% on day 2, and 83.33% on day 3.
To gauge the trade-off between the number of optodes and decoding accuracy, averaging across two or three promising fNIRS channels was compared to the one-channel approach. Multi-trial accuracy increased from 85.19% (one-channel approach) to 91.67% (two-/three-channel approach). In sum, the presented fNIRS-BCI yielded robust decoding results using three alternative sensory encoding modalities. Further, fNIRS-BCI communication was stable over the course of three consecutive days, even in a natural (social) environment. The developed fNIRS-BCI thereby demonstrated high flexibility, reliability, and robustness, which are crucial requirements for future clinical applicability.

Original language: English
Article number: 784522
Number of pages: 19
Journal: Frontiers in Human Neuroscience
Volume: 15
DOIs
Publication status: Published - 25 Nov 2021

Keywords

  • ACTIVATION
  • CLASSIFICATION
  • COMMUNICATION
  • CORTEX
  • FNIRS
  • IMAGING INSTRUMENTATION
  • MOTOR IMAGERY
  • STATE
  • SYSTEM
  • TECHNOLOGY
  • brain-computer interface (BCI)
  • four-choice communication
  • functional near-infrared spectroscopy (fNIRS)
  • mental drawing
  • motor imagery (MI)
  • reliability over time
  • sensory encoding modality
  • temporal encoding