Wearable Activity Recognition for Robust Human-Robot Teaming in Safety-Critical Environments Via Hybrid Neural Networks

Andrea Frank, Alyssa Kubota, Laurel Riek

  • University of California, San Diego

Details

11:30 - 11:45 | Tue 5 Nov | LG-R12 | TuAT12.3

Session: Human Detection and Tracking

Abstract

In this work, we present a novel non-visual human activity recognition (HAR) system that achieves state-of-the-art performance on realistic safety-critical environment (SCE) tasks with a single wearable sensor. We leverage surface electromyography (sEMG) and inertial data from a low-profile wearable sensor to attain performant robot perception while remaining unobtrusive and user-friendly. By capturing both convolutional and temporal features with a hybrid CNN-LSTM classifier, our system robustly classifies complex, full-body human activities from this single sensor alone. We rigorously evaluate our method on two datasets representative of SCE tasks and compare its performance with several prominent HAR algorithms. Results show our system substantially outperforms rival algorithms in identifying complex human tasks from minimal sensing hardware, achieving F1-scores up to 84% over 31 strenuous activity classes. To our knowledge, we are the first to robustly identify complex full-body tasks using a single, unobtrusive sensor feasible for real-world use in SCEs. With our approach, robots can more reliably understand human activity, enabling them to safely navigate sensitive, crowded spaces.
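To make the hybrid architecture concrete, the sketch below shows one common way to combine convolutional feature extraction with an LSTM for windowed sensor data, as the abstract describes. It is a minimal illustration, not the authors' implementation: the channel count, window length, layer sizes, and the 31-class output are assumptions chosen for the example (the class count is borrowed from the reported evaluation).

```python
# Minimal sketch of a hybrid CNN-LSTM activity classifier of the kind the
# abstract describes. All hyperparameters (sensor channel count, window
# length, layer widths) are illustrative assumptions, not paper values.
import torch
import torch.nn as nn

class CNNLSTMClassifier(nn.Module):
    def __init__(self, in_channels=10, num_classes=31,
                 conv_channels=64, lstm_hidden=128):
        super().__init__()
        # 1-D convolutions extract local ("convolutional") features from
        # the raw sEMG + inertial channels within each sliding window.
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(conv_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # An LSTM then models temporal structure across the convolved
        # sequence, capturing the "temporal features" of the activity.
        self.lstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, num_classes)

    def forward(self, x):
        # x: (batch, channels, time) -- one windowed sensor segment
        feats = self.conv(x)            # (batch, conv_channels, time)
        feats = feats.permute(0, 2, 1)  # (batch, time, conv_channels)
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])       # per-window class logits

# Example: a batch of 8 two-second windows at 100 Hz, 10 sensor channels.
logits = CNNLSTMClassifier()(torch.randn(8, 10, 200))
print(logits.shape)  # torch.Size([8, 31])
```

The design intuition is that convolutions summarize short-range signal shape (e.g., muscle-activation bursts in sEMG) while the recurrent layer integrates those summaries over the window, which is why a hybrid can classify full-body activities from a single sensor more robustly than either component alone.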