EatingTrak: Detecting Fine-grained Eating Moments in the Wild Using a Wrist-mounted IMU
Published in The ACM International Conference on Mobile Human-Computer Interaction, MobileHCI 2022
Ruidong Zhang, Jihai Zhang, Nitish Gade, Peng Cao, Seyun Kim, Junchi Yan, Cheng Zhang
In this paper, we present EatingTrak, an AI-powered sensing system that uses a wrist-mounted inertial measurement unit (IMU) to recognize eating moments in a near-free-living, semi-wild setup. It significantly improves on the state of the art in time resolution for identifying eating moments with similar hardware, from over five minutes to three seconds. Unlike prior work, which learns directly from raw IMU data, EatingTrak first estimates the arm posture in 3D in the wild and then learns detailed eating moments from the series of estimated arm postures. To evaluate the system, we collected eating activity data from 9 participants in semi-wild scenarios for over 113 hours. Results showed that EatingTrak was able to recognize eating moments at two time resolutions, 3 seconds and 15 minutes, with F-1 scores of 73.7% and 83.8%, respectively. EatingTrak could introduce new opportunities in sensing detailed eating behavior information requiring high time resolution, such as eating frequency, snack-taking, and on-site behavior intervention. We also discuss the opportunities and challenges in deploying EatingTrak on commodity devices at scale.
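The two-stage idea described in the abstract, estimating arm posture from IMU data and then detecting eating moments from the posture series, can be sketched roughly as follows. This is a minimal illustration, not the paper's actual algorithm: the posture estimator here is a naive gyroscope integration, and `detect_eating_moments`, the 50 Hz sampling rate, and the pitch threshold are all hypothetical placeholders.

```python
import numpy as np

def estimate_arm_posture(gyro, dt=0.02):
    # Stage 1 (hypothetical): integrate gyroscope angular velocity
    # (rad/s, shape [T, 3]) into a rough 3D orientation trace.
    # The paper's estimator is more sophisticated; this is a sketch.
    return np.cumsum(gyro * dt, axis=0)

def detect_eating_moments(postures, window=150, pitch_thresh=1.0):
    # Stage 2 (hypothetical): label each non-overlapping 3 s window
    # (150 samples at an assumed 50 Hz) as an eating moment when the
    # mean pitch suggests a hand-to-mouth posture. The paper instead
    # learns this mapping from the posture series.
    labels = []
    for start in range(0, len(postures) - window + 1, window):
        mean_pitch = postures[start:start + window, 0].mean()
        labels.append(mean_pitch > pitch_thresh)
    return labels

# Example: flat wrist for 3 s, then a sustained raising motion for 3 s.
gyro = np.zeros((300, 3))
gyro[150:, 0] = 1.0  # pitch rotation in the second window
labels = detect_eating_moments(estimate_arm_posture(gyro))
print(labels)  # [False, True]
```

The point of the sketch is the decomposition itself: a 3-second decision window over an intermediate posture representation, rather than classifying raw IMU samples directly.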