Eating Gestures Detection by Tracking Finger Motion

Dawei Fan1, Jiaqi Gong2, John Lach1

  • 1University of Virginia
  • 2University of Maryland, Baltimore County

Details

09:45 - 10:00 | Wed 26 Oct | Main Auditorium | WeAT1.2

Session: Technical Session 1: Situational and Context Aware Behavioral Monitoring

Abstract

Research interest in individuals' eating habits has grown recently, since unhealthy eating habits are strongly linked to diseases such as obesity, diabetes, and cardiovascular disease. Monitoring people's eating behavior provides opportunities to give feedback and suggestions toward healthier eating habits. Body-worn inertial sensors are gaining popularity as a solution for eating behavior recognition, since they are convenient for subjects to wear in daily life. In this paper, a novel approach for detecting eating gestures by tracking finger motion is proposed. A dataset of 375 gestures spanning 7 activities is created for training and testing classifiers. A Teager-energy-based algorithm is designed to segment the gesture series. Seven state-of-the-art learning methods are tested for binary (eating/non-eating) and multiclass (seven-class) classification. Accelerometer and gyroscope datasets are tested separately and compared. The results show that, on the finger motion data, K-Nearest Neighbor (KNN) performs best, achieving 97.1% accuracy in binary classification, higher than the accuracy achieved on a wrist motion dataset. The results indicate that finger motion is an effective indicator for distinguishing eating from non-eating behaviors.
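The abstract describes two processing steps: Teager-energy-based segmentation of the inertial signal and KNN classification of the segmented gestures. The sketch below is not the authors' implementation; it is a minimal illustration of those two steps, with assumed sampling rate, threshold, smoothing window, features, and k, and with placeholder data standing in for the finger motion dataset.

```python
# Minimal sketch (assumptions throughout): Teager-energy segmentation of an
# inertial signal followed by KNN binary (eating / non-eating) classification.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

def teager_energy(x):
    """Discrete Teager energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    psi = np.zeros_like(x, dtype=float)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def segment_gestures(signal, fs=50.0, threshold=0.05, min_len=0.5):
    """Return (start, end) sample indices where smoothed Teager energy stays
    above `threshold` for at least `min_len` seconds (all values assumed)."""
    energy = teager_energy(np.asarray(signal, dtype=float))
    win = max(1, int(0.2 * fs))                      # 200 ms smoothing window
    smooth = np.convolve(energy, np.ones(win) / win, mode="same")
    active = smooth > threshold
    segments, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len * fs:
                segments.append((start, i))
            start = None
    if start is not None and len(active) - start >= min_len * fs:
        segments.append((start, len(active)))
    return segments

def extract_features(segment):
    """Hypothetical time-domain features for one segmented gesture."""
    seg = np.asarray(segment, dtype=float)
    return [seg.mean(), seg.std(), seg.min(), seg.max(),
            np.abs(np.diff(seg)).mean()]

# Placeholder feature matrix and labels (1 = eating, 0 = non-eating); in the
# paper these would come from the 375 segmented finger-motion gestures.
rng = np.random.default_rng(0)
X = rng.normal(size=(375, 5))
y = rng.integers(0, 2, size=375)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)
knn = KNeighborsClassifier(n_neighbors=5)            # k = 5 is an assumed value
knn.fit(X_train, y_train)
print(f"binary accuracy: {knn.score(X_test, y_test):.3f}")
```

With real accelerometer or gyroscope streams, `segment_gestures` would be run per axis (or on the signal magnitude) to isolate candidate gestures before feature extraction; the threshold and window choices above would need tuning to the actual sensor and sampling rate.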