Cross-Subject Transfer Learning Improves the Practicality of Real-World Applications of Brain-Computer Interfaces

Kuan-jung Chiang1, Chun-Shu Wei, Masaki Nakanishi2, Tzyy-Ping Jung1

  • 1University of California, San Diego
  • 2University of California, San Diego

Details

16:30 - 18:30 | Thu 21 Mar | Grand Ballroom A | ThPO.105

Session: IGNITE Session I

16:30 - 18:30 | Thu 21 Mar | Grand Ballroom B | ThPO.105

Session: Poster Session I

Abstract

Steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) have shown their robustness in facilitating high-efficiency communication. State-of-the-art training-based SSVEP decoding methods, such as extended Canonical Correlation Analysis (CCA) and Task-Related Component Analysis (TRCA), substantially improve the efficiency of SSVEP-based BCIs, but they rely on a calibration process. However, because of pronounced variability across individuals and within individuals over time, collecting calibration (training) data is laborious and time-consuming, which degrades the practicality of SSVEP BCIs in real-world settings. This study develops a cross-subject transfer approach, based on a newly proposed least-squares transformation (LST) method, to reduce the amount of training data that must be collected from a new user. Results show that the LST method reduces the number of training templates required for a 40-class SSVEP BCI. The LST method may enable numerous real-world applications of near-zero-training, plug-and-play, high-speed SSVEP BCIs.
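
The abstract omits implementation details, but the core idea of least-squares cross-subject transfer, fitting a linear mapping from an existing subject's SSVEP trials onto a new user's averaged response and reusing the transformed trials as additional training data, can be sketched as below. This is a minimal illustration under assumed data shapes; the function names (`lst_transfer`, `transfer_training_set`) are hypothetical and this is not the authors' reference implementation.

```python
import numpy as np

def lst_transfer(source_trial, target_template):
    """Map one source-subject SSVEP trial onto a target subject's averaged
    template via a least-squares transformation, and return the transformed
    trial. Both inputs have shape (channels, samples)."""
    X = source_trial
    Y = target_template
    # Solve min_P ||Y - P @ X||_F^2; np.linalg.lstsq works column-wise,
    # so solve X.T @ Z = Y.T and take P = Z.T.
    Z, *_ = np.linalg.lstsq(X.T, Y.T, rcond=None)
    P = Z.T
    return P @ X  # transferred trial, shape (channels, samples)

def transfer_training_set(source_trials, target_trials):
    """Build an augmented training set for one stimulus class: average the
    few available target-subject trials into a template, then project every
    source-subject trial onto that template."""
    target_template = np.mean(target_trials, axis=0)        # (channels, samples)
    return np.stack([lst_transfer(trial, target_template)
                     for trial in source_trials])           # (n_source, channels, samples)
```

In use, the transferred trials, together with the few genuine new-user trials, would serve as the training data from which extended-CCA or TRCA templates are computed, so that the new user needs only a small number of calibration trials per stimulus.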