SoundTrak: Continuous 3D Finger Tracking Using Active Acoustics
Published in Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 2, Article 30 (June 2017) / UbiComp 2017.
Cheng Zhang, Qiuyue Xue, Anandghan Waghmare, Sumeet Jain, Yiming Pu, Sinan Hersek, Kent Lyons, Kenneth A. Cunefare, Omer T. Inan, Gregory D. Abowd
The small size of wearable devices limits the efficiency and scope of possible user interactions, as inputs are typically constrained to two dimensions: the touchscreen surface. We present SoundTrak, an active acoustic sensing technique that enables a user to interact with wearable devices in the surrounding 3D space by continuously tracking the finger position with high resolution. The user wears a ring with an embedded miniature speaker emitting an acoustic signal at a specific frequency (e.g., 11 kHz), which is captured by an array of miniature, inexpensive microphones on the target wearable device. A novel algorithm localizes the finger in 3D space by extracting phase information from the received acoustic signals. We evaluated SoundTrak in a volume of space (20 cm × 16 cm × 11 cm) around a smartwatch and show an average accuracy of 1.3 cm. We report results from a Fitts' Law experiment with 10 participants that evaluates the real-time prototype. We also present a set of applications supported by this 3D input technique, and discuss the practical challenges that need to be addressed before widespread use.
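The abstract's core idea, recovering distance from the phase of a received tone, can be illustrated with a small simulation. This is not the paper's algorithm, just a hedged sketch under assumed parameters (192 kHz sampling, a single mic, ideal propagation): I/Q demodulation against a local 11 kHz reference yields the carrier phase, which maps to path length modulo one wavelength (~3.1 cm at 11 kHz).

```python
import numpy as np

FS = 192_000          # mic sample rate (assumed for illustration)
F0 = 11_000           # carrier frequency mentioned in the abstract
C = 343.0             # speed of sound in air, m/s
WAVELENGTH = C / F0   # ~3.1 cm: phase repeats every wavelength
N = 19_200            # exactly 1100 carrier cycles at this FS (no spectral leakage)

def simulate_mic(distance_m):
    """Tone received at a mic after traveling `distance_m` from the ring."""
    t = np.arange(N) / FS
    return np.sin(2 * np.pi * F0 * (t - distance_m / C))

def phase_to_distance(signal):
    """I/Q-demodulate against a local 11 kHz reference; the recovered
    carrier phase gives the path length modulo one wavelength."""
    t = np.arange(N) / FS
    i = np.dot(signal, np.sin(2 * np.pi * F0 * t))   # proportional to cos(phase)
    q = -np.dot(signal, np.cos(2 * np.pi * F0 * t))  # proportional to sin(phase)
    phase = np.arctan2(q, i)                          # in (-pi, pi]
    return (phase / (2 * np.pi)) % 1.0 * WAVELENGTH

d = phase_to_distance(simulate_mic(0.012))  # recovers ~1.2 cm
```

Distances that differ by a whole wavelength alias to the same phase, which is one reason continuous tracking (rather than one-shot absolute ranging) matters; with several microphones, per-mic distances could then be combined by multilateration to get a 3D position.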