Phase Estimation for Fast Action Recognition and Trajectory Generation in Human-Robot Collaboration


This paper proposes a method to achieve fast and fluid human-robot interaction by estimating the progress of the human's movement. The method allows the progress, also referred to as the phase of the movement, to be estimated even when observations of the human are partial and occluded, a problem typically encountered when using motion capture systems in cluttered environments. By leveraging the framework of Interaction Probabilistic Movement Primitives (ProMPs), phase estimation makes it possible to classify the human action and to generate a corresponding robot trajectory before the human finishes his or her movement. The method is therefore suited for semi-autonomous robots acting as assistants and coworkers. Since observations may be sparse, our method computes the probability of different phase candidates to find the phase that best aligns the Interaction ProMP with the current observations. This makes it fundamentally different from approaches based on Dynamic Time Warping (DTW), which must rely on a consistent stream of measurements at runtime. The phase estimation algorithm can be seamlessly integrated into Interaction ProMPs, so that robot trajectory coordination, phase estimation, and action recognition are all achieved in a single probabilistic framework. We evaluate the method using a 7-DoF lightweight robot arm equipped with a 5-finger hand in single- and multi-task collaborative experiments, and compare the accuracy of phase estimation with that of our previous DTW-based method.
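To illustrate the core idea of scoring phase candidates against sparse observations, the following is a minimal sketch, not the paper's actual algorithm. It assumes a ProMP mean trajectory represented by normalized radial basis functions over a phase variable z in [0, 1], and treats each candidate as a temporal scaling factor alpha mapping observation times to phases; the candidate maximizing the Gaussian log-likelihood of the observed positions is selected. All function names, the basis parameterization, and the candidate grid are illustrative assumptions.

```python
import numpy as np

def rbf_features(z, n_basis=10, width=0.02):
    """Normalized radial basis features over phase z in [0, 1]
    (a common ProMP parameterization; width is the squared kernel width)."""
    centers = np.linspace(0.0, 1.0, n_basis)
    phi = np.exp(-(z[:, None] - centers[None, :]) ** 2 / (2.0 * width))
    return phi / phi.sum(axis=1, keepdims=True)

def estimate_phase(w_mean, obs_t, obs_y, noise_var=0.01,
                   alphas=np.linspace(0.5, 2.0, 31)):
    """Pick the temporal-scaling candidate whose implied phase best explains
    the partial observations, by maximum Gaussian log-likelihood.

    w_mean : learned mean basis weights of the (Interaction) ProMP
    obs_t  : times of the sparse observations (normalized duration)
    obs_y  : observed positions at those times
    """
    best_alpha, best_ll = None, -np.inf
    for alpha in alphas:
        z = np.clip(alpha * obs_t, 0.0, 1.0)   # candidate phase at each obs time
        pred = rbf_features(z) @ w_mean        # predicted positions at that phase
        ll = -0.5 * np.sum((obs_y - pred) ** 2) / noise_var
        if ll > best_ll:
            best_alpha, best_ll = alpha, ll
    return best_alpha
```

Because each candidate is scored independently against whatever observations are available, the estimate degrades gracefully under occlusion, in contrast to DTW-style alignment, which needs a dense measurement stream.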

International Journal of Robotics Research (IJRR)