Eating Activities Detection using Wrist-based Wearable Sensors

Md Abu Sayeed Mondol1, Ridwan Alam1, Nutta Homdee1, Matthew Ridder1, Donna Spruijt-Metz2, Kayla de la Haye2, John Lach1, John Stankovic1

  • 1University of Virginia
  • 2University of Southern California

Details

08:30 - 19:30 | Wed 26 Oct | Auditorium Foyer | WePOS.21

Session: Poster Session

08:30 - 19:30 | Wed 26 Oct | Main Auditorium | WePOS.21

Session: Ignite Session 2

Abstract

Background: Monitoring eating behavior plays a vital role in addressing healthcare challenges such as obesity, diabetes, and cardiovascular disease [1][2][3]. The robustness and accuracy of state-of-the-art techniques ([4][5][6]) need to be improved by addressing the challenges associated with eating detection.

Purpose: The purpose of this study is to design and develop an eating detection technique using wrist-worn sensors that is more robust and accurate than current state-of-the-art techniques.

Methods: Time-series data from wrist-worn sensors are segmented for eating gesture detection. In contrast to the fixed-length segments used by most existing methods [1], our approach extracts potential segments of variable length that better represent the sporadic occurrence and variable duration of eating gestures. We use gravitational acceleration instead of total acceleration because the former better captures device orientation during hand movement. We develop a novel two-step method for detecting eating gestures from the stream of sensor data. In the first step, most non-eating data are discarded using a computationally efficient threshold-based technique; this early pruning reduces the computation required and improves overall accuracy. In the second step, eating gestures are detected from the remaining segments through feature extraction followed by classification. In addition to features computed over the whole segment, our technique separates each segment into two parts, the hand movement toward the mouth and the hand movement away from it, and extracts features from both parts. Inertial data were collected during ten meals from five subjects wearing Android-powered Sony smartwatches on the dominant wrist. All subjects in this study are right-handed. The dominant hand was used because people mostly eat with that hand, and wearing devices on both wrists was deemed too high a participant burden. Video recorded while the subjects ate naturally provides the ground truth. The total duration of all meals was about 2 hours and included a total of 508 bites. Various food items (chicken, cake, yogurt, ice cream, banana, chips, milk, and water) were consumed using bare hands, utensils (forks, spoons, knives), mugs, and bottles. About 11 hours of non-eating activity data were also logged from the subjects in natural contexts. A total of 36 features were extracted for each segment from all axes of gravitational acceleration: minimum and maximum values were computed over the whole segment, while mean, standard deviation, covariance, skewness, and kurtosis were extracted from each of the two parts of the segment. The random forest method was used for classification.

Results: 10-fold cross-validation is applied to evaluate the performance of the classifier. Our technique achieves an F1-score of 80.63%, about 10% higher than the person-dependent accuracy reported in [4].

Conclusions: This paper presents a novel method for eating gesture detection using wrist-worn sensors. Results from this preliminary study demonstrate the potential of the proposed technique to detect eating gestures in realistic settings with higher accuracy.
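The first-step pruning can be sketched as follows. The abstract does not specify which quantity is thresholded or what value is used; this sketch assumes, purely for illustration, that a candidate segment is kept when the peak of one gravitational-acceleration axis (a rough proxy for wrist orientation during a hand-to-mouth movement) exceeds a hypothetical threshold.

```python
import numpy as np

def prune_segments(segments, threshold=4.0):
    """First-step pruning: cheaply discard segments that are unlikely to be
    eating gestures, before any feature extraction or classification.

    `segments` is a list of (N_i, 3) arrays of gravitational acceleration
    (x, y, z) from the wrist-worn watch; segment lengths vary. The
    thresholded quantity and its value are illustrative assumptions, not
    the ones used in the paper.
    """
    kept = []
    for seg in segments:
        # Assumed proxy for wrist elevation toward the mouth: peak of the
        # y-axis gravitational component within the segment.
        if np.max(seg[:, 1]) > threshold:
            kept.append(seg)
    return kept
```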
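The 36-feature layout described in the Methods (minimum and maximum per axis over the whole segment, plus five statistics per axis from each of the two segment parts) can be sketched as below. The abstract does not state how a segment is split into its "toward the mouth" and "away from the mouth" parts, nor how the "covariance" feature is defined; the midpoint split and the use of per-axis variance in its place are illustrative assumptions.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def extract_features(seg):
    """Build the 36-dimensional feature vector for one variable-length
    segment of gravitational acceleration, shape (N, 3).

    6 features  : min and max of each axis over the whole segment
    30 features : mean, std, variance (stand-in for the paper's
                  'covariance'), skewness, and kurtosis of each axis,
                  computed separately for the assumed 'toward the mouth'
                  and 'away from the mouth' halves (midpoint split).
    """
    feats = [seg.min(axis=0), seg.max(axis=0)]
    mid = len(seg) // 2                    # assumed split point
    for part in (seg[:mid], seg[mid:]):
        feats.extend([
            part.mean(axis=0),
            part.std(axis=0),
            part.var(axis=0),              # stand-in for 'covariance'
            skew(part, axis=0),
            kurtosis(part, axis=0),
        ])
    return np.concatenate(feats)           # length 36
```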
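The second step, random forest classification evaluated with 10-fold cross-validation on the F1-score, can be reproduced in outline with scikit-learn, reusing the `extract_features` sketch above. The forest's hyperparameters are not reported in the abstract and are left at library defaults here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def evaluate(segments, labels):
    """Second step: classify candidate segments as eating vs. non-eating
    gestures and report the mean 10-fold cross-validated F1-score.

    `labels` holds the video-derived ground truth (1 = eating gesture,
    0 = non-eating) for each segment that survived pruning.
    """
    X = np.vstack([extract_features(s) for s in segments])
    y = np.asarray(labels)
    clf = RandomForestClassifier(random_state=0)  # defaults; paper's settings not given
    scores = cross_val_score(clf, X, y, cv=10, scoring="f1")
    return scores.mean()
```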