We contribute a novel user- and activity-independent kinematics-based regressive model for continuously predicting ballistic hand movements in virtual reality (VR).
Compared to prior work on end-point prediction, continuous hand trajectory prediction in VR enables early estimation of future events, such as collisions between the user's hand and virtual objects like UI widgets.
We developed and validated our prediction model through a user study with 20 participants.
The study collected hand motion data using a 3D pointing task and a gaming task featuring three popular VR games.
Results show that our model achieves low Root Mean Square Errors (RMSE) of 0.80 cm, 0.85 cm, and 3.15 cm from future hand positions 100 ms, 200 ms, and 300 ms ahead, respectively, across all users and activities.
In pointing tasks, our predictive model achieves average angular errors of 4.0° and 1.5° from the true landing position at 50% and 70% of the way through the movement, respectively.
A follow-up study showed that the model can be applied to new users and new activities without further training.
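The abstract reports two evaluation metrics: RMSE between predicted and true 3D hand positions, and the angular error from the true landing position. A minimal sketch of how such metrics are typically computed (the function names and inputs here are ours, not from the paper):

```python
import math

def rmse(pred, true):
    """Root Mean Square Error over paired sequences of 3D points (e.g., in cm)."""
    assert len(pred) == len(true) and len(pred) > 0
    sq_dists = [
        sum((p - t) ** 2 for p, t in zip(pp, tt))  # squared Euclidean distance
        for pp, tt in zip(pred, true)
    ]
    return math.sqrt(sum(sq_dists) / len(sq_dists))

def angular_error_deg(pred_dir, true_dir):
    """Angle (degrees) between predicted and true landing directions from a common origin."""
    dot = sum(a * b for a, b in zip(pred_dir, true_dir))
    norm_p = math.sqrt(sum(a * a for a in pred_dir))
    norm_t = math.sqrt(sum(b * b for b in true_dir))
    cos_angle = max(-1.0, min(1.0, dot / (norm_p * norm_t)))  # clamp for safety
    return math.degrees(math.acos(cos_angle))
```

For example, a single predicted point at (3, 4, 0) cm against a true point at the origin yields an RMSE of 5 cm, and two perpendicular direction vectors yield an angular error of 90°.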
Nisal Menuka Gamage, Deepana Ishtaweera, Martin Weigel, and Anusha Withana
So Predictable! Continuous 3D Hand Trajectory Prediction in Virtual Reality
In Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology (UIST '21).
Project Page –