Decoding Lip Movements During Continuous Speech using Electrocorticography

S. Lesaja*, C. Herff, G.D. Johnson, J.J. Shih, T. Schultz, D.J. Krusienski

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review


Abstract

Recent work has shown that it is possible to decode aspects of continuously spoken speech from electrocorticographic (ECoG) signals recorded on the cortical surface. The ultimate objective is to develop a speech neuroprosthetic that can provide seamless, real-time synthesis of continuous speech directly from brain activity. Instead of decoding acoustic properties or classes of speech, such a neuroprosthetic might be realized by decoding the articulator movements associated with speech production, as recent work highlights a representation of articulator movement in ECoG signals. The aim of this work is to investigate the neural correlates of speech-related lip movements extracted from video recordings. We show how characteristics of lip movement can be decoded and how lip-landmark positions can be predicted from ECoG.
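The abstract describes predicting lip-landmark positions from ECoG activity. The paper's exact model is not given here; the following is a minimal sketch of one common linear decoding approach (ridge regression), using synthetic stand-ins for the ECoG feature matrix and the video-derived landmark trajectories. All names and dimensions below are illustrative assumptions, not the authors' pipeline.

```python
# Hypothetical sketch: linear decoding of lip-landmark positions from ECoG
# features. Both modalities are simulated here; in the real study the
# features would come from ECoG recordings and the landmarks from video.
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: 64 ECoG feature channels, x/y of one lip landmark.
n_samples, n_features, n_landmarks = 500, 64, 2
X = rng.standard_normal((n_samples, n_features))        # stand-in ECoG features
W_true = rng.standard_normal((n_features, n_landmarks))
Y = X @ W_true + 0.1 * rng.standard_normal((n_samples, n_landmarks))  # stand-in landmarks

def ridge_fit(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X^T X + alpha*I)^-1 X^T Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

# Train on the first 400 samples, evaluate per-landmark correlation on the rest.
split = 400
W = ridge_fit(X[:split], Y[:split])
Y_pred = X[split:] @ W
corrs = [np.corrcoef(Y[split:, k], Y_pred[:, k])[0, 1] for k in range(n_landmarks)]
print(corrs)
```

With a low-noise linear ground truth as above, the held-out correlations are high; on real ECoG data, decoding accuracy is typically assessed with the same kind of held-out correlation between predicted and observed landmark trajectories.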
Original language: English
Title of host publication: 2019 9th International IEEE/EMBS Conference on Neural Engineering (NER)
Publisher: IEEE Xplore
Pages: 522-525
Number of pages: 4
DOIs
Publication status: Published - 1 Jan 2019
