Transferring Human Motion to Humanoid Robots
We have developed online algorithms to transfer motion from a human demonstrator to Honda's humanoid robot, ASIMO.
Transferring motion from a human demonstrator to a humanoid robot is an important step toward developing robots that are easily programmable and that can replicate or learn from observed human motion. This so-called motion retargeting problem has been well studied, and several off-line solutions exist based on optimization approaches that rely on pre-recorded human motion data collected with a marker-based motion capture system. From the perspective of human-robot interaction, there is growing interest in online motion transfer, particularly without the use of markers. These requirements place stringent demands on retargeting algorithms and limit the applicability of off-line, pre-recorded methods.

To address these limitations, we have developed an online, task-space, control-theoretic retargeting formulation that generates robot joint motions adhering to the robot's balance constraints, joint limit constraints, joint velocity constraints, and self-collision constraints. The inputs to the proposed method are low-dimensional, normalized human motion descriptors, detected and tracked with a vision-based key-point detection and tracking algorithm. The proposed vision algorithm does not rely on markers placed on anatomical landmarks, nor does it require special instrumentation or calibration. The implementation requires only a depth image sequence, collected from a single time-of-flight imaging device such as a Kinect sensor. The feasibility of the proposed approach is demonstrated through online experiments on the Honda humanoid robot ASIMO.
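To make the task-space idea concrete, the sketch below shows one simplified closed-loop inverse-kinematics step of the kind such retargeting controllers are commonly built on: a tracked keypoint position serves as the task descriptor, a damped pseudo-inverse of the task Jacobian maps the task-space error to joint velocities, and the result is clamped against joint-velocity and joint-position limits before integration. This is a minimal illustration under assumed values, not the actual ASIMO controller; the planar three-link chain, gains, and limits are stand-ins, and the balance and self-collision constraints of the full formulation are omitted.

import numpy as np

# Illustrative 3-link planar chain standing in for one limb of the robot (assumed values).
LINK_LENGTHS = np.array([0.3, 0.25, 0.15])            # link lengths in metres
Q_MIN, Q_MAX = np.radians(-150.0), np.radians(150.0)  # joint position limits
DQ_MAX = 2.0                                          # joint velocity limit, rad/s
DT = 1.0 / 30.0                                       # step matching a 30 Hz depth stream
K_P = 5.0                                             # task-space proportional gain


def forward_position(q):
    """End-point position of the planar chain for joint angles q."""
    angles = np.cumsum(q)
    return np.array([np.sum(LINK_LENGTHS * np.cos(angles)),
                     np.sum(LINK_LENGTHS * np.sin(angles))])


def jacobian(q):
    """2 x 3 task Jacobian of the chain's end-point position."""
    angles = np.cumsum(q)
    J = np.zeros((2, q.size))
    for j in range(q.size):
        # Joint j moves every link from j onward.
        J[0, j] = -np.sum(LINK_LENGTHS[j:] * np.sin(angles[j:]))
        J[1, j] = np.sum(LINK_LENGTHS[j:] * np.cos(angles[j:]))
    return J


def retarget_step(q, x_desired, damping=1e-2):
    """One closed-loop inverse-kinematics step toward a tracked keypoint target."""
    error = x_desired - forward_position(q)

    # Damped least-squares pseudo-inverse keeps the update well behaved near singularities.
    J = jacobian(q)
    J_pinv = J.T @ np.linalg.inv(J @ J.T + damping * np.eye(2))

    # Map the task-space error to joint velocities, then enforce velocity and position limits.
    dq = np.clip(J_pinv @ (K_P * error), -DQ_MAX, DQ_MAX)
    return np.clip(q + dq * DT, Q_MIN, Q_MAX)


# Example: drive the chain toward a normalized keypoint target over a few frames.
q = np.zeros(3)
target = np.array([0.4, 0.3])
for _ in range(100):
    q = retarget_step(q, target)
print(np.round(forward_position(q), 3))

In a complete retargeting controller, the simple clipping above would typically be replaced by constraint handling inside the control law itself (for example, through weighting or task prioritization that also accounts for balance and self-collision), and the Jacobian would stack several tracked keypoints rather than a single end-point, but the structure of the update step remains the same.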