HandyTrack: HPC for Hand Gesture Dataset Generation and Deep Learning Training for Detection and Tracking
Presentation of the problem and objective of the experiment
In augmented reality (AR), users need to manipulate virtual 3D objects in space. In real life, we do this with our hands, which offer six degrees of freedom (6 DoF), and with our fingers for fine movements. The need for touchless interaction creates an opportunity to take advantage of hand movements and gestures.
The challenge is to deliver hand tracking on existing computational devices with low power consumption and near-zero latency. The goal of this HPC experiment is to design and develop a software library, based on Machine Learning techniques, that performs dynamic detection and recognition of hand gestures without interruption.
Short description of the experiment
HandyTrack - HPC for Hand Gesture Dataset Generation and Deep Learning Training for Detection & Tracking is dedicated to Artificial Intelligence, and specifically to Computer Vision and Machine Learning. It applies HPC to training neural networks and to generating data from 3D models, in order to reduce the time required for training algorithms and to achieve more accurate performance.
The experiment aims to generate a large video dataset of hand gestures using realistic 3D hand models, together with behavior trees that produce gesture variations. It then trains machine learning algorithms for hand gesture recognition on this dataset. HPC allows this process to take far less time than classical methods.
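The idea of producing gesture variations from a base 3D pose can be sketched as follows. This is a minimal, hypothetical illustration only: the experiment's actual behavior-tree pipeline and 3D hand model are not described in detail here, so joint names, the jitter parameter, and the sampling scheme below are assumptions.

```python
import random

def generate_gesture_variants(base_pose, n_variants, jitter_deg=5.0, seed=0):
    """Create perturbed copies of a base hand pose.

    base_pose: dict mapping a joint name to its angle in degrees
               (joint names here are illustrative, not from the experiment).
    Each variant perturbs every joint angle by a uniform random offset
    in [-jitter_deg, +jitter_deg], mimicking natural gesture variation.
    """
    rng = random.Random(seed)  # fixed seed for reproducible datasets
    variants = []
    for _ in range(n_variants):
        variants.append({
            joint: angle + rng.uniform(-jitter_deg, jitter_deg)
            for joint, angle in base_pose.items()
        })
    return variants

# Example: 100 variants of a (hypothetical) pinch pose
base = {"thumb_cmc": 35.0, "index_mcp": 10.0, "index_pip": 20.0}
variants = generate_gesture_variants(base, n_variants=100)
```

In a real pipeline, each sampled pose would drive the 3D hand model in a renderer to produce video frames; generating and rendering many such variants in parallel is exactly the kind of embarrassingly parallel workload that benefits from HPC.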
Partners CINECA and BI-REX are part of the NCC Italy.