TY - GEN
T1 - ActiSight
T2 - 20th IEEE International Conference on Pervasive Computing and Communications, PerCom 2022
AU - Alharbi, Rawan
AU - Sen, Sougata
AU - Ng, Ada
AU - Alshurafa, Nabil
AU - Hester, Josiah
N1 - Funding Information:
This material is based upon work supported by the National Science Foundation (NSF) under award number CNS-1915847, the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) under award numbers K25DK113242 and R03DK127128, and the National Institute of Biomedical Imaging and Bioengineering (NIBIB) under award number R21EB030305. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or the National Institutes of Health.
Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Wearable cameras provide an informative view of wearer activities, context, and interactions. Video obtained from wearable cameras is useful for life-logging, human activity recognition, visual confirmation, and other tasks widely used in mobile computing today. Extracting foreground information related to the wearer and separating irrelevant background pixels is the fundamental operation underlying these tasks. However, current wearer foreground extraction methods that depend on image data alone are slow, energy-inefficient, and in some cases inaccurate, making many tasks, such as activity recognition, challenging to implement without significant computational resources. To fill this gap, we built ActiSight, a wearable RGB-Thermal video camera that uses thermal information to make wearer segmentation practical for body-worn video. Using ActiSight, we collected a total of 59 hours of video from 6 participants, capturing a wide variety of activities in a natural setting. We show that wearer foreground extracted with ActiSight achieves a high Dice similarity score while significantly lowering execution time and energy cost compared with an RGB-only approach.
AB - Wearable cameras provide an informative view of wearer activities, context, and interactions. Video obtained from wearable cameras is useful for life-logging, human activity recognition, visual confirmation, and other tasks widely used in mobile computing today. Extracting foreground information related to the wearer and separating irrelevant background pixels is the fundamental operation underlying these tasks. However, current wearer foreground extraction methods that depend on image data alone are slow, energy-inefficient, and in some cases inaccurate, making many tasks, such as activity recognition, challenging to implement without significant computational resources. To fill this gap, we built ActiSight, a wearable RGB-Thermal video camera that uses thermal information to make wearer segmentation practical for body-worn video. Using ActiSight, we collected a total of 59 hours of video from 6 participants, capturing a wide variety of activities in a natural setting. We show that wearer foreground extracted with ActiSight achieves a high Dice similarity score while significantly lowering execution time and energy cost compared with an RGB-only approach.
KW - Wearable cameras
KW - in the wild
KW - thermal
UR - http://www.scopus.com/inward/record.url?scp=85129934205&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85129934205&partnerID=8YFLogxK
U2 - 10.1109/PerCom53586.2022.9762385
DO - 10.1109/PerCom53586.2022.9762385
M3 - Conference contribution
AN - SCOPUS:85129934205
T3 - 2022 IEEE International Conference on Pervasive Computing and Communications, PerCom 2022
SP - 237
EP - 246
BT - 2022 IEEE International Conference on Pervasive Computing and Communications, PerCom 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 21 March 2022 through 25 March 2022
ER -