Nov 25, 2011
Public Summary Month 11/2011
An ad-hoc pointing-detection and parameter-extraction method has been implemented in ROS, running on-line and in real time. We implemented this ad-hoc version so that we could focus on implementation issues first. The ad-hoc pointing recognition will be replaced by a parametric HMM within the next reporting period.
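An ad-hoc detector of this kind can be as simple as a geometric heuristic on the tracked skeleton joints. The sketch below is an illustrative assumption, not the project's actual implementation: it flags a frame as "pointing" when the hand is extended well away from the torso and raised, with made-up threshold values.

```python
import math

def is_pointing(head, hand, torso, extend_thresh=0.45, height_thresh=0.25):
    """Heuristic pointing check on 3-D skeleton joints (metres).

    A frame counts as pointing when the hand is extended away from
    the torso beyond extend_thresh and is not hanging below roughly
    hip level. Thresholds and the y-up axis convention are
    illustrative assumptions, not the project's tuned values.
    """
    extended = math.dist(hand, torso) > extend_thresh
    raised = hand[1] > torso[1] - height_thresh
    return extended and raised
```

In practice such a per-frame test would be smoothed over several consecutive frames before a gesture is reported, to suppress momentary false positives.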
Steps during the next reporting period:
The next milestone is the deliverable in M10, where we will demonstrate the first integrated version of our system, able to track and recognize human pointing gestures. For this, it is necessary to 1) use the Kinect camera to track the human (the head and hand in particular), 2) identify the pointing gesture and where the human is pointing, and 3) calibrate the robot arm with the Kinect camera so that the robot can move its end effector to the object at which the human is pointing. In the demo deliverable, the robot will be able to touch the object at which the human pointed.
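A common way to estimate where the human is pointing (step 2) is to cast a ray from the head through the hand and intersect it with the work surface. The sketch below assumes a y-up coordinate frame and a horizontal table plane at a known height; both are assumptions for illustration, not details fixed by the system described here.

```python
def pointing_target(head, hand, plane_y=0.0):
    """Intersect the head-to-hand ray with the plane y = plane_y.

    head, hand: (x, y, z) joint positions in metres, y-up (assumed).
    Returns the (x, y, z) point indicated on the plane, or None when
    the ray runs parallel to the plane or points away from it.
    """
    hx, hy, hz = head
    dx, dy, dz = hand[0] - hx, hand[1] - hy, hand[2] - hz
    if abs(dy) < 1e-9:
        return None          # ray parallel to the plane
    t = (plane_y - hy) / dy
    if t <= 0:
        return None          # plane lies behind the pointing direction
    return (hx + t * dx, plane_y, hz + t * dz)
```

The head-hand ray is only one possible pointing model; elbow-hand or gaze-based rays are alternatives with different accuracy trade-offs.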
The M10 deliverable will be a first demo running on the Little Helper Plus, built directly on the present setup of our robot.
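Once the camera-arm calibration (step 3) has produced an extrinsic transform, applying it is a single matrix operation: a point seen in the Kinect frame is mapped into the robot base frame so the end effector can be sent there. The sketch below assumes the calibration result is available as a 4x4 homogeneous transform; the example values are illustrative only.

```python
import numpy as np

def to_robot_frame(p_cam, T_cam_to_base):
    """Map a 3-D point from the Kinect camera frame into the robot
    base frame using a 4x4 homogeneous transform obtained from an
    extrinsic camera-robot calibration (assumed given)."""
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous coords
    return (T_cam_to_base @ p_h)[:3]
```

In a ROS setup this mapping would typically be handled by the tf frame infrastructure rather than by hand, but the underlying arithmetic is the same.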
Tags:
public summary