Public Summary Month 6/2013

In May and June, the work was focused on the design and implementation of a movement gesture concept and corresponding task planning developments. Moreover, an object recognition scheme was implemented for seamless gesture-based object selection (illustrated in the sketch below).
Also, the ideas for the two-stage gesture classification approach and the whole Human-Robot Collaboration (HRC) system were presented at a workshop at the Robotics: Science and Systems conference in Berlin.
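
As an illustration of how such a gesture-based object selection can be realized, the following Python sketch picks the recognized object whose centroid lies closest to the pointing ray spanned by the tracked elbow and hand. The object list, joint positions, and distance threshold are hypothetical placeholders, not values from the project system.

    import numpy as np

    def point_to_ray_distance(point, ray_origin, ray_direction):
        """Perpendicular distance from a 3D point to a pointing ray."""
        d = ray_direction / np.linalg.norm(ray_direction)
        v = point - ray_origin
        t = max(np.dot(v, d), 0.0)   # clamp: objects behind the hand are measured to the hand itself
        return np.linalg.norm(point - (ray_origin + t * d))

    def select_object(objects, elbow, hand, max_distance=0.15):
        """Pick the recognized object whose centroid lies closest to the
        pointing ray (elbow -> hand); return None if nothing is close enough."""
        ray_origin, ray_direction = hand, hand - elbow
        best, best_dist = None, max_distance
        for name, centroid in objects.items():
            dist = point_to_ray_distance(centroid, ray_origin, ray_direction)
            if dist < best_dist:
                best, best_dist = name, dist
        return best

    # Toy inputs: object centroids from the recognition module, joint positions
    # from the body tracker (all in metres, hypothetical values).
    objects = {"workpiece": np.array([0.9, 0.1, 0.8]),
               "toolbox":   np.array([0.4, -0.5, 0.8])}
    elbow, hand = np.array([0.2, 0.0, 1.1]), np.array([0.45, 0.03, 1.0])
    print(select_object(objects, elbow, hand))   # -> "workpiece"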


Public Summary Month 4/2013

In March and April, the work was focused on the further development of the command gesture approach and the task planning module. Also, the ideas for the robust command gesture classification were submitted as a paper to a workshop at the Robotics: Science and Systems conference in Berlin.
The previous command gesture approach was enhanced with a second processing stage in order to deal with the high false positive rate of the gesture recognition.
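
The following sketch illustrates the general idea of such a two-stage scheme under simplified assumptions: a first stage produces per-frame gesture hypotheses with confidences, and a second stage confirms a command only once the same hypothesis has been observed consistently over a short time window, which suppresses spurious single-frame detections. The class, thresholds, and window sizes are illustrative and not taken from the project implementation.

    from collections import deque

    class TwoStageGestureClassifier:
        """Illustrative two-stage scheme: stage 1 emits per-frame gesture
        hypotheses with confidences; stage 2 confirms a command only when the
        same hypothesis is seen with sufficient confidence in enough frames of
        a short window, suppressing spurious single-frame false positives."""

        def __init__(self, stage1, min_confidence=0.8, window=10, min_hits=7):
            self.stage1 = stage1                  # callable: frame -> (label, confidence)
            self.min_confidence = min_confidence
            self.window = deque(maxlen=window)
            self.min_hits = min_hits

        def update(self, frame):
            label, confidence = self.stage1(frame)
            self.window.append(label if confidence >= self.min_confidence else None)
            for candidate in set(self.window) - {None}:
                if self.window.count(candidate) >= self.min_hits:
                    self.window.clear()           # do not trigger the same command twice
                    return candidate              # confirmed command gesture
            return None                           # nothing confirmed yet

    # Usage (with some hypothetical per-frame detector):
    #   classifier = TwoStageGestureClassifier(per_frame_detector)
    #   for frame in camera_stream:
    #       command = classifier.update(frame)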


Public Summary Month 2/2013

In January and February, the work was focused on the development of probabilistic methods for gesture-based human-robot interaction (task 4), and on task planning and task-dependent adaptation of robot motion planning (task 5). Also, deliverables 2.1 and 2.2 were completed.
For the hand gesture analysis, a probabilistic hidden Markov model (HMM) based system was designed, developed, and integrated into the HRC framework.
Also, a simple task planning module was designed and implemented, which processes the joint scene analysis provided by HMM-based action recognition and DL-based situation awareness.
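
A minimal sketch of this processing chain is given below, under the assumption of discrete (vector-quantized) observation symbols: one small HMM per action is scored with the forward algorithm, and a simple rule table maps the recognized action together with a DL-derived situation label to a planned robot task. All models, symbols, and rules are toy examples, not the project's actual parameters.

    import numpy as np

    def forward_log_likelihood(obs, start, trans, emit):
        """Log-likelihood of a discrete observation sequence under an HMM
        (forward algorithm with per-step scaling to avoid underflow)."""
        alpha = start * emit[:, obs[0]]
        log_lik = np.log(alpha.sum())
        alpha /= alpha.sum()
        for symbol in obs[1:]:
            alpha = (alpha @ trans) * emit[:, symbol]
            log_lik += np.log(alpha.sum())
            alpha /= alpha.sum()
        return log_lik

    def recognize_action(obs, models):
        """Return the action whose HMM explains the observation sequence best."""
        return max(models, key=lambda name: forward_log_likelihood(obs, *models[name]))

    # Toy per-action HMMs over a small codebook of quantized hand-motion symbols
    models = {
        "wave":  (np.array([0.9, 0.1]),                           # initial state distribution
                  np.array([[0.7, 0.3], [0.3, 0.7]]),             # state transitions
                  np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1]])),  # symbol emissions
        "grasp": (np.array([0.5, 0.5]),
                  np.array([[0.9, 0.1], [0.2, 0.8]]),
                  np.array([[0.1, 0.1, 0.8], [0.2, 0.2, 0.6]])),
    }

    # Toy rule table: (recognized action, DL situation label) -> planned robot task
    task_rules = {
        ("wave", "robot_idle"): "approach_worker",
        ("grasp", "part_available"): "hold_part",
    }

    observation = [0, 1, 1, 0, 1]                        # quantized feature symbols
    action = recognize_action(observation, models)       # -> "wave" for these numbers
    situation = "robot_idle"                             # would come from the DL reasoner
    print(task_rules.get((action, situation), "wait"))   # planned task or safe default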


Public Summary Month 12/2012

In November and December, the focus lay on tasks 4 and 5 for the IPR, in which gesture and action recognition as well as path planning were to be incorporated into the framework. For task 4, the conceptual design of the gesture-based interaction was determined and two gesture classes were identified. In task 5, in particular the path planning module was developed and incorporated into the experiment framework.
Project partner Reis Robotics checked the four defined experiments with respect to safety issues and carried out a risk analysis involving a Reis safety expert. KUKA worked out an alternative scenario for experiment 5, involving gesture-based interaction with their mobile platform.


Public Summary Month 10/2012

In September and October 2012, the work was focused on gesture and action recognition. So far, it is based on Description Logics (DLs), which use a taxonomy of actions, activities, and gestures as a knowledge base. Recognition results are inferred directly by providing assertional knowledge that states information about human kinematics and the robot state. Moreover, preliminary linking concepts have been included in the knowledge base in order to allow for the incorporation of stochastic methods for action and gesture recognition.
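
The sketch below mimics this inference scheme in plain Python: a small class taxonomy plays the role of the terminological knowledge (TBox), asserted facts about human kinematics and the robot state play the role of the assertional knowledge (ABox), and a gesture or action concept is recognized when its defining conditions are entailed by the assertions. The concept names and conditions are purely illustrative and do not reproduce the project's ontology.

    taxonomy = {                      # subclass -> superclass
        "PointingGesture": "Gesture",
        "HandOverAction": "Action",
        "Gesture": "Activity",
        "Action": "Activity",
    }

    definitions = {                   # concept -> required asserted facts
        "PointingGesture": {"arm_extended", "hand_open", "robot_waiting"},
        "HandOverAction": {"hand_open", "object_in_hand", "robot_waiting"},
    }

    def ancestors(concept):
        """All superclasses of a concept along the taxonomy."""
        while concept in taxonomy:
            concept = taxonomy[concept]
            yield concept

    def recognize(asserted_facts):
        """Return every concept (plus its superclasses) entailed by the facts."""
        inferred = set()
        for concept, required in definitions.items():
            if required <= asserted_facts:
                inferred.add(concept)
                inferred.update(ancestors(concept))
        return inferred

    facts = {"arm_extended", "hand_open", "robot_waiting"}   # from tracking + robot state
    print(recognize(facts))   # -> {'PointingGesture', 'Gesture', 'Activity'} in some order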