TY - GEN
T1 - When generalized eating detection machine learning models fail in the field
AU - Zhang, Shibo
AU - Alharbi, Rawan
AU - Nicholson, Matthew
AU - Alshurafa, Nabil
N1 - Publisher Copyright:
Copyright © 2017 ACM.
PY - 2017/9/11
Y1 - 2017/9/11
N2 - Problematic eating behaviors are a major cause of obesity. To improve our understanding of these eating behaviors, we need to be able to first reliably detect them. In this paper we use a wrist-worn sensor to test a generalized machine learning model's reliability in detecting eating episodes. We process data from a 6-axis inertial sensor. Since most eating episodes do not occur while moving, we filter out periods of physical activity, and then use an advanced motif-based time-point fusion technique to detect feeding gestures. We also cluster the false alarms into four categories in an effort to identify the main behaviors that confound feeding gesture detection. We tested our system on eight participants performing various activities in the wild while wearing a sensing suite: a neck- and a wrist-worn sensor, along with a wearable video camera continuously recording to capture ground truth. Trained annotators further validated the algorithms by identifying feeding gestures and categorizing the false alarms. All eating episodes were detected; however, many false alarms were also detected, yielding a 61% average F-measure in detecting feeding gestures. This result shows clear challenges in characterizing eating episodes using a single inertial-based wrist-worn sensor.
AB - Problematic eating behaviors are a major cause of obesity. To improve our understanding of these eating behaviors, we need to be able to first reliably detect them. In this paper we use a wrist-worn sensor to test a generalized machine learning model's reliability in detecting eating episodes. We process data from a 6-axis inertial sensor. Since most eating episodes do not occur while moving, we filter out periods of physical activity, and then use an advanced motif-based time-point fusion technique to detect feeding gestures. We also cluster the false alarms into four categories in an effort to identify the main behaviors that confound feeding gesture detection. We tested our system on eight participants performing various activities in the wild while wearing a sensing suite: a neck- and a wrist-worn sensor, along with a wearable video camera continuously recording to capture ground truth. Trained annotators further validated the algorithms by identifying feeding gestures and categorizing the false alarms. All eating episodes were detected; however, many false alarms were also detected, yielding a 61% average F-measure in detecting feeding gestures. This result shows clear challenges in characterizing eating episodes using a single inertial-based wrist-worn sensor.
KW - Hand-to-mouth gestures
KW - Wearables
KW - Wrist-worn sensors
UR - http://www.scopus.com/inward/record.url?scp=85030847057&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85030847057&partnerID=8YFLogxK
U2 - 10.1145/3123024.3124409
DO - 10.1145/3123024.3124409
M3 - Conference contribution
AN - SCOPUS:85030847057
T3 - UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers
SP - 613
EP - 622
BT - UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers
PB - Association for Computing Machinery, Inc
T2 - 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and ACM International Symposium on Wearable Computers, UbiComp/ISWC 2017
Y2 - 11 September 2017 through 15 September 2017
ER -