Decoder for Switching State-Space Models with Spike-Field Observations

Christian Song¹, Han-Lin Hsieh, Maryam Shanechi¹

  • ¹University of Southern California

Details

16:30 - 18:30 | Thu 21 Mar | Grand Ballroom B | ThPO.50

Session: Poster Session I

Abstract

The dynamics of brain activity are inherently non-stationary and can change with context, for example with the task performed, the stimuli received, or the focus of attention maintained. Thus, to decode brain states in naturalistic scenarios, it is necessary to track such changes. Further, it is becoming increasingly common to measure the brain at multiple spatiotemporal scales by recording spike and field activities simultaneously. Tracking non-stationarity therefore requires efficient decoders that can detect changes in spike-field neural dynamics and accurately estimate the underlying neural and behavioral states. Here, we develop a new decoding framework to address this challenge. We build a multiscale switching dynamical model that assumes the underlying hidden neural state can evolve with dynamics chosen from an arbitrary finite set. The choice of dynamics could, for example, be dictated by a higher-level cognitive state such as attention level, which we call a switch state. This switch state would also dictate how the neural state is represented in the recorded binary spike events and continuous field signals. We derive a new multiscale decoder that simultaneously estimates the underlying neural and switch states from these spike-field observations. We show with closed-loop simulations that our new decoder accurately estimates the switch state while also accurately decoding the neural and behavioral states.
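For concreteness, the sketch below simulates one possible form of the multiscale switching model the abstract describes: a discrete Markov switch state selects which linear dynamics, field observation model, and spike log-rate model are active in each time bin. All dimensions, parameter values, and variable names (e.g. `P_switch`, `A`, `beta`) are illustrative assumptions, not the authors' implementation; spikes are drawn as Bernoulli events from a discretized point process and fields as Gaussian observations.

```python
# Minimal sketch of a multiscale switching state-space model with spike-field
# observations, using NumPy. All parameters and dimensions are assumed for
# illustration only.
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01           # time bin (s), assumed
T = 500             # number of time bins
dim_x = 2           # latent neural state dimension (assumed)
n_fields = 3        # number of continuous field channels (assumed)
n_units = 5         # number of spiking units (assumed)
M = 2               # number of switch states (assumed)

# Markov chain over the switch (e.g., attention) state.
P_switch = np.array([[0.98, 0.02],
                     [0.02, 0.98]])

# Per-switch-state linear dynamics: x_t = A[s] x_{t-1} + w_t, w_t ~ N(0, Q[s]).
A = [0.95 * np.eye(dim_x), 0.80 * np.eye(dim_x)]
Q = [0.01 * np.eye(dim_x), 0.05 * np.eye(dim_x)]

# Per-switch-state field observations: y_t = C[s] x_t + v_t, v_t ~ N(0, R[s]).
C = [rng.standard_normal((n_fields, dim_x)) for _ in range(M)]
R = [0.1 * np.eye(n_fields) for _ in range(M)]

# Per-switch-state spiking: binary events with log-rate alpha[s] + beta[s] @ x_t.
alpha = [np.log(20.0) * np.ones(n_units), np.log(5.0) * np.ones(n_units)]
beta = [rng.standard_normal((n_units, dim_x)) for _ in range(M)]

s = np.zeros(T, dtype=int)
x = np.zeros((T, dim_x))
y = np.zeros((T, n_fields))
spikes = np.zeros((T, n_units), dtype=int)

for t in range(1, T):
    # Switch state evolves as a discrete Markov chain.
    s[t] = rng.choice(M, p=P_switch[s[t - 1]])
    # Neural state follows the dynamics selected by the switch state.
    x[t] = A[s[t]] @ x[t - 1] + rng.multivariate_normal(np.zeros(dim_x), Q[s[t]])
    # Continuous field observation.
    y[t] = C[s[t]] @ x[t] + rng.multivariate_normal(np.zeros(n_fields), R[s[t]])
    # Binary spike events from a point process discretized at dt.
    rate = np.exp(alpha[s[t]] + beta[s[t]] @ x[t])
    spikes[t] = rng.random(n_units) < np.clip(rate * dt, 0.0, 1.0)
```

One common way to decode such a model (stated here as a general approach, not the authors' specific derivation) is to run a bank of multiscale filters, one per candidate set of dynamics, and weight their neural-state estimates by the posterior probability of each switch state given the combined spike-field observations; the abstract's contribution is a decoder of this kind derived for simultaneous binary spike and continuous field signals.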