Self-Calibrated Multi-Sensor Wearable for Hand Tracking and Modeling

IEEE Transactions on Visualization and Computer Graphics
Nikhil Gosala, Fangjinhua Wang, Zhaopeng Cui, Hanxue Liang, Oliver Glauser, Shihao Wu, Olga Sorkine-Hornung

Our multi-sensor wearable performs hand tracking and modeling with high accuracy. Our approach automatically assesses whether the hand is visible to the camera or occluded, and accordingly combines the advantages of both optical and wearable sensors.


We present a multi-sensor system for consistent 3D hand pose tracking and modeling that leverages the advantages of both wearable and optical sensors. Specifically, we employ a stretch-sensing soft glove and three IMUs in combination with an RGB-D camera. The different sensor modalities are fused based on their availability and estimated confidence, enabling seamless hand tracking in challenging environments with partial or even complete occlusion. To maximize accuracy while maintaining ease of use, we propose an automated user calibration that uses the RGB-D camera data to refine both the glove mapping model and the multi-IMU system parameters. Extensive experiments show that our setup outperforms wearable-only approaches when the hand is in the camera's field of view and surpasses camera-only methods when the hand is occluded.
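The confidence-driven fusion described above can be illustrated with a minimal sketch. Note that this is an assumption-laden simplification, not the paper's actual method: the real system fuses glove, IMU, and RGB-D estimates with a learned mapping and per-modality confidences, whereas here a single scalar visibility confidence in [0, 1] linearly blends two per-joint pose vectors.

```python
import numpy as np

def fuse_hand_pose(pose_camera, pose_wearable, camera_confidence):
    """Blend two pose estimates by the camera's visibility confidence.

    Illustrative only (hypothetical helper, not from the paper):
    camera_confidence = 1 means the hand is fully visible and the
    optical estimate is trusted; 0 means full occlusion and the
    wearable (glove + IMU) estimate is used instead.
    """
    w = float(np.clip(camera_confidence, 0.0, 1.0))
    return w * np.asarray(pose_camera) + (1.0 - w) * np.asarray(pose_wearable)

# Example: three joint-angle values from each modality.
cam = np.array([0.1, 0.2, 0.3])      # camera-based estimate
glove = np.array([0.3, 0.2, 0.1])    # wearable-based estimate

fully_visible = fuse_hand_pose(cam, glove, 1.0)   # equals cam
fully_occluded = fuse_hand_pose(cam, glove, 0.0)  # equals glove
partially_seen = fuse_hand_pose(cam, glove, 0.5)  # midpoint blend
```

In practice such a scheme degrades gracefully: as occlusion grows, the output transitions continuously from the optical estimate toward the wearable one rather than switching abruptly.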
