Abstract
Obesity, caused primarily by overeating, is a preventable chronic disease with staggering healthcare costs. To detect overeating passively, a machine learning framework was designed to detect feeding gestures and characterize each eating episode by its feeding gesture count. Given the ubiquity of wrist-worn sensors, existing literature has focused on detecting eating-related gestures and eating episodes that are at least five minutes long. In this paper, our objective is to show the potential of commercial smartwatches to detect short eating episodes confounded by other activities of daily living, so that all eating episodes in the field are truly captured. The effect of time-series segmentation and sensing configurations on the accuracy of detecting and characterizing feeding gestures is then analyzed. Finally, the effects of personalized and generalized machine learning models in predicting feeding gestures are compared. Results demonstrate the large within-subject variability of eating, where a generalized user-independent model yields a 75.7% average F-measure, whereas a personalized user-dependent model yields an 85.7% average F-measure. We also show the effect of personalized clustering on the feeding gesture count, which results in a root mean square error of 8.4.
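The abstract contrasts a generalized (user-independent) model with a personalized (user-dependent) one trained on windowed wrist-sensor data. The sketch below is only an illustration of that comparison on synthetic data; the window length, overlap, features, and classifier are assumptions not specified in this record, and the evaluation simply contrasts leave-one-subject-out training with within-subject training using F-measure.

```python
# Minimal sketch (not the authors' code): sliding-window segmentation of
# wrist inertial data, then generalized (leave-one-subject-out) vs.
# personalized (within-subject) feeding-gesture classifiers.
# All data is synthetic; window size, overlap, features, and the
# RandomForest classifier are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
FS = 50          # assumed sampling rate (Hz)
WIN = 2 * FS     # assumed 2 s window
STEP = WIN // 2  # assumed 50% overlap

def segment(signal, labels):
    """Slide a fixed-length window over a 3-axis signal, extract simple
    statistical features, and label a window positive if most of its
    samples belong to a feeding gesture."""
    X, y = [], []
    for start in range(0, len(signal) - WIN + 1, STEP):
        w = signal[start:start + WIN]
        feats = np.concatenate([w.mean(axis=0), w.std(axis=0),
                                w.min(axis=0), w.max(axis=0)])
        X.append(feats)
        y.append(int(labels[start:start + WIN].mean() > 0.5))
    return np.array(X), np.array(y)

# Synthetic stand-in for per-subject wrist accelerometer streams.
subjects = {}
for s in range(5):
    n = 6000
    sig = rng.normal(size=(n, 3))
    lab = np.zeros(n)
    for g in rng.integers(0, n - FS, size=30):   # 30 fake feeding gestures
        sig[g:g + FS] += rng.normal(1.5, 0.2, size=(FS, 3))
        lab[g:g + FS] = 1
    subjects[s] = segment(sig, lab)

# Generalized (user-independent): leave-one-subject-out evaluation.
gen_scores = []
for held_out in subjects:
    X_tr = np.vstack([subjects[s][0] for s in subjects if s != held_out])
    y_tr = np.hstack([subjects[s][1] for s in subjects if s != held_out])
    X_te, y_te = subjects[held_out]
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    gen_scores.append(f1_score(y_te, clf.predict(X_te)))

# Personalized (user-dependent): train and test within each subject.
per_scores = []
for s, (X, y) in subjects.items():
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              stratify=y, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    per_scores.append(f1_score(y_te, clf.predict(X_te)))

print(f"generalized mean F-measure:  {np.mean(gen_scores):.3f}")
print(f"personalized mean F-measure: {np.mean(per_scores):.3f}")
```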
| Original language | English (US) |
| --- | --- |
| Journal | BodyNets International Conference on Body Area Networks |
| DOIs | |
| State | Published - Jan 1 2017 |
| Event | 11th International Conference on Body Area Networks, BODYNETS 2016 - Turin, Italy. Duration: Dec 15 2016 → Dec 16 2016 |
Keywords
- Hand-to-mouth gestures
- Inertial sensors
- Overeating
- Wearables
- Wrist-worn sensors
ASJC Scopus subject areas
- Artificial Intelligence
- Computer Networks and Communications
- Computer Science Applications