Decoding Lip Movements During Continuous Speech Using Electrocorticography

Srdjan Lesaja1, Christian Herff2, Garett Johnson1, Jerry Shih3, Tanja Schultz, Dean Krusienski4

  • 1Old Dominion University
  • 2University of Bremen
  • 3Mayo Clinic
  • 4Virginia Commonwealth University

Details

16:30 - 18:30 | Thu 21 Mar | Grand Ballroom A | ThPO.128

Session: IGNITE Session I

16:30 - 18:30 | Thu 21 Mar | Grand Ballroom B | ThPO.128

Session: Poster Session I

Abstract

Recent work has shown that it is possible to decode aspects of continuously-spoken speech from electrocorticographic (ECoG) signals recorded on the cortical surface. The ultimate objective is to develop a speech neuroprosthetic that provides seamless, real-time synthesis of continuous speech directly from brain activity. Rather than decoding acoustic properties or discrete classes of speech, such a neuroprosthetic might be realized by decoding the articulator movements associated with speech production, as recent work highlights a representation of articulator movement in ECoG signals. The aim of this work is to investigate the neural correlates of speech-related lip movements captured in video recordings. We show that characteristics of lip movement can be decoded from ECoG and that lip-landmark positions can be predicted.
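The abstract does not specify the decoding model, but landmark prediction of this kind is often framed as a regression from neural features onto landmark coordinates. The sketch below is a minimal, hypothetical illustration of that framing: it fits an ordinary least-squares decoder mapping synthetic "ECoG feature" vectors (e.g., per-channel high-gamma power per video frame) to 2-D lip-landmark positions, then scores held-out frames by correlation. All dimensions and data are invented for illustration and do not reflect the authors' actual method.

```python
import numpy as np

# Hypothetical sketch (not the paper's method): decode 2-D lip-landmark
# positions from ECoG features with a linear least-squares model.
rng = np.random.default_rng(0)

n_frames, n_channels, n_landmarks = 500, 32, 4   # assumed sizes, for illustration
X = rng.standard_normal((n_frames, n_channels))  # stand-in ECoG features per video frame

# Synthetic ground truth: landmark (x, y) coordinates linearly related to X plus noise.
W_true = rng.standard_normal((n_channels, n_landmarks * 2))
Y = X @ W_true + 0.1 * rng.standard_normal((n_frames, n_landmarks * 2))

# Fit the linear decoder on the first 400 frames; evaluate on the remaining 100.
X_tr, Y_tr, X_te, Y_te = X[:400], Y[:400], X[400:], Y[400:]
W, *_ = np.linalg.lstsq(X_tr, Y_tr, rcond=None)
Y_hat = X_te @ W

# Score each landmark coordinate by Pearson correlation on held-out frames.
corrs = [np.corrcoef(Y_te[:, j], Y_hat[:, j])[0, 1] for j in range(Y.shape[1])]
mean_corr = float(np.mean(corrs))
print(f"mean held-out correlation: {mean_corr:.3f}")
```

In practice the features would be band-power estimates aligned to video frames, and the landmarks would come from a face-tracking pipeline; a correlation between true and predicted landmark trajectories is a common evaluation for such decoders.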