Egocentric Gesture Recognition for Head-Mounted AR devices
Item Type: Conference Paper
Citation: Chalasani, T., Ondrej, J. & Smolic, A., Egocentric Gesture Recognition for Head-Mounted AR devices, Adjunct Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2018, pp. 109-114
Abstract: Natural interaction with virtual objects in AR/VR environments makes for a smooth user experience. Gestures are a natural extension from the real world into augmented space to achieve these interactions. Finding discriminating spatio-temporal features relevant to gestures and hands in ego-view is the primary challenge for recognising egocentric gestures. In this work we propose a data-driven, end-to-end deep learning approach to the problem of egocentric gesture recognition, which combines an ego-hand encoder network that extracts ego-hand features with a recurrent neural network that discerns temporally discriminating features. Since deep learning networks are data-intensive, we propose a novel data augmentation technique using green-screen capture to alleviate the problem of ground-truth annotation. In addition, we publish a dataset of 10 gestures performed in a natural fashion in front of a green screen for training, and the same 10 gestures performed in different natural scenes without a green screen for validation. We also present the results of our network's performance in comparison to the state of the art on the AirGest dataset.
Sponsor: Science Foundation Ireland (SFI)
Other Titles: Adjunct Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Forthcoming.
Type of material: Conference Paper
Availability: Full text available
Keywords: Egocentric gesture recognition, Deep learning, LSTMs, Human-computer interfaces, Natural gestures, Augmented reality
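The abstract above describes two reproducible ideas: compositing green-screen training frames onto new backgrounds to augment data, and a per-frame ego-hand encoder feeding a recurrent network (LSTM) that classifies the gesture sequence. The sketch below is a minimal illustration of both ideas, not the authors' released code; layer sizes, the chroma-key rule, the 10-class output, and all function and class names are illustrative assumptions.

# Minimal sketch (assumed names and sizes, not the paper's implementation).
import numpy as np
import torch
import torch.nn as nn

def composite_green_screen(frame_bgr, background_bgr, g_margin=40):
    """Replace green-screen pixels in frame_bgr with background_bgr.
    A simple chroma-key rule (green channel dominates) is assumed here;
    the paper's capture and compositing pipeline may differ."""
    b = frame_bgr[..., 0].astype(int)
    g = frame_bgr[..., 1].astype(int)
    r = frame_bgr[..., 2].astype(int)
    is_screen = (g - np.maximum(b, r)) > g_margin   # boolean mask of screen pixels
    out = frame_bgr.copy()
    out[is_screen] = background_bgr[is_screen]      # paste in the new background
    return out

class EgoHandEncoder(nn.Module):
    """Small CNN mapping one RGB frame to a feature vector (hypothetical layout)."""
    def __init__(self, feat_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(128, feat_dim)

    def forward(self, x):                  # x: (B, 3, H, W)
        return self.fc(self.conv(x).flatten(1))

class GestureClassifier(nn.Module):
    """Per-frame encoder + LSTM over the sequence + linear head (10 gesture classes assumed)."""
    def __init__(self, num_classes=10, feat_dim=256, hidden=256):
        super().__init__()
        self.encoder = EgoHandEncoder(feat_dim)
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, clips):              # clips: (B, T, 3, H, W)
        B, T = clips.shape[:2]
        feats = self.encoder(clips.flatten(0, 1)).view(B, T, -1)  # per-frame features
        _, (h_n, _) = self.lstm(feats)                             # last hidden state
        return self.head(h_n[-1])                                  # gesture logits

# Usage sketch: classify 2 clips of 16 frames at 112x112 resolution.
# model = GestureClassifier()
# logits = model(torch.randn(2, 16, 3, 112, 112))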