
FingerTrak: Continuous 3D Hand Pose Tracking by Deep Learning Hand Silhouettes Captured by Miniature Thermal Cameras on Wrist 

Published in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) / UbiComp '20; ECCV'20 Best Demo Honorable Mention (Top 3 Demos)

Fang Hu, Peng He, Songlin Xu, Yin Li, Cheng Zhang

Selected Media Coverage: BBC, Forbes, Cornell Chronicle, Engadget, Gizmodo, VentureBeat, Yahoo, VisionSystemDesign.


In this paper, we present FingerTrak, an intelligent, minimally obtrusive wristband that enables continuous 3D finger tracking and hand pose estimation using four miniature thermal cameras mounted on a form-fitting wristband. FingerTrak explores the feasibility of continuously reconstructing the full hand posture (20 hand joint positions) without directly observing the fingers. We demonstrate that our system can estimate the full hand posture by observing only the outline, or contour, of the hand (the hand silhouette) from the wrist using low-resolution thermal cameras. A customized deep neural network learns to "stitch" these multi-view images and estimate the 20 joint positions in 3D space. Our user study with 11 participants shows that the system achieves an average angular error of 6.46 degrees when tested with the same background and 8.06 degrees when tested with a different background. FingerTrak also shows encouraging results after the device is remounted and has the potential to reconstruct more complicated poses. We conclude the paper with a discussion of the opportunities and challenges of deploying this technology in the real world.
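The full network architecture is described in the paper; purely as an illustrative sketch of the general pipeline shape (not the authors' actual model), the PyTorch snippet below shows a shared convolutional encoder applied to each of the four low-resolution thermal views, concatenation of the per-view features, and a regression head that outputs 20 joints x 3 coordinates. The input resolution (32x32), layer sizes, and all names here are assumptions made for the example.

```python
# Illustrative sketch only -- not the architecture from the paper.
# Assumes four single-channel thermal views per frame (1x32x32 each, a made-up size)
# and regresses 20 hand joints x 3 coordinates.
import torch
import torch.nn as nn

class MultiViewHandPoseNet(nn.Module):
    def __init__(self, num_views: int = 4, num_joints: int = 20):
        super().__init__()
        self.num_views = num_views
        self.num_joints = num_joints
        # Shared per-view encoder: each thermal silhouette image -> feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 16x16 -> 8x8
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 128), nn.ReLU(),
        )
        # Fusion head: concatenated per-view features -> joint coordinates.
        self.head = nn.Sequential(
            nn.Linear(128 * num_views, 256), nn.ReLU(),
            nn.Linear(256, num_joints * 3),
        )

    def forward(self, views: torch.Tensor) -> torch.Tensor:
        # views: (batch, num_views, 1, H, W)
        feats = [self.encoder(views[:, v]) for v in range(self.num_views)]
        fused = torch.cat(feats, dim=1)
        return self.head(fused).view(-1, self.num_joints, 3)  # (batch, joints, xyz)

# Example forward pass with random tensors standing in for thermal frames.
model = MultiViewHandPoseNet()
dummy = torch.randn(2, 4, 1, 32, 32)
joints = model(dummy)
print(joints.shape)  # torch.Size([2, 20, 3])
```

Such a model would typically be trained with a joint-position regression loss (e.g., mean squared error against motion-capture ground truth); the angular errors reported above are computed from the predicted joint positions at evaluation time.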
