Kinect-based gesture tracking
With the advent of Kinect (and related technologies), Minority Report-style scenarios in which users interact with computing devices through freeform hand gestures (or other body movements) have become a reality. However, while hand gestures provide a natural way to interact with the environment, they can be difficult for a computer to interpret, because the movements are seldom exactly (or even easily) repeatable [1]. The aim of this thesis is to develop and evaluate a robust gesture tracking system on top of Kinect, and to quantify the complexity and repeatability of the gestures that the system can support [2].
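The topic does not prescribe a particular method for quantifying repeatability. Purely as an illustrative sketch, the Python snippet below measures how consistently a gesture is reproduced by computing the mean pairwise dynamic time warping (DTW) distance between repeated joint trajectories, such as hand positions taken from the Kinect skeleton stream. The function names, the synthetic trajectories, and the choice of DTW itself are assumptions made for illustration, not part of the thesis proposal.

import numpy as np

def dtw_distance(a, b):
    # Dynamic time warping distance between two trajectories,
    # each an array of shape (frames, dimensions), e.g. hand (x, y, z) per frame.
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def repeatability(repetitions):
    # Mean pairwise DTW distance across repetitions of one gesture;
    # lower values indicate a more repeatable gesture.
    dists = [dtw_distance(a, b)
             for i, a in enumerate(repetitions)
             for b in repetitions[i + 1:]]
    return float(np.mean(dists))

if __name__ == "__main__":
    # Synthetic stand-in for recorded Kinect hand trajectories:
    # three noisy repetitions of a circular hand movement.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 2.0 * np.pi, 60)
    reps = [np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
            + rng.normal(scale=0.02, size=(60, 3))
            for _ in range(3)]
    print(f"mean pairwise DTW distance: {repeatability(reps):.3f}")

In an actual evaluation the trajectories would come from recorded skeleton frames rather than synthetic data, and the distances could be normalized (e.g. by trajectory length) before comparing gestures of different durations.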
References:
[1] Oulasvirta, A.; Roos, T.; Modig, A. & Leppänen, L. Information capacity of full-body movements. Proceedings of the 2013 ACM Annual Conference on Human Factors in Computing Systems (CHI), ACM, 2013, pp. 1289-1298.
[2] Ren, Z.; Meng, J.; Yuan, J. & Zhang, Z. Robust hand gesture recognition with Kinect sensor. Proceedings of the 19th ACM International Conference on Multimedia (MM), ACM, 2011, pp. 759-760.