Learning manipulation graphs from demonstrations using multimodal sensory signals

Z Su, O Kroemer, GE Loeb… - 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018 - ieeexplore.ieee.org
Complex contact manipulation tasks can be decomposed into sequences of motor primitives. Individual primitives often end with a distinct contact state, such as inserting a screwdriver tip into a screw head or loosening it through twisting. To achieve robust execution, the robot should be able to verify that the primitive's goal has been reached as well as disambiguate it from erroneous contact states. In this paper, we introduce and evaluate a framework to autonomously construct manipulation graphs from manipulation demonstrations. Our manipulation graphs include sequences of motor primitives for performing a manipulation task as well as corresponding contact state information. The sensory models for the contact states allow the robot to verify the goal of each motor primitive as well as detect erroneous contact changes. The proposed framework was experimentally evaluated on grasping, unscrewing, and insertion tasks on a Barrett arm and hand equipped with two BioTacs. The results of our experiments indicate that the learned manipulation graphs achieve more robust manipulation executions by confirming sensory goals as well as discovering and detecting novel failure modes.
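The abstract describes a manipulation graph as a sequence of motor primitives whose transitions are verified against learned sensory models of contact states. The sketch below illustrates one plausible way such a structure could be organized; it is a minimal, hypothetical rendering of the idea, not the paper's implementation. The node labels, the `classify` callback (standing in for the learned multimodal sensory model), and `Edge`/`ManipulationGraph` are assumed names introduced only for illustration.

```python
# Minimal sketch of a manipulation graph: nodes are contact states,
# edges are motor primitives with an expected sensory goal state and
# known failure states. All names here are illustrative placeholders.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple


@dataclass
class Edge:
    primitive: Callable[[], None]   # motor primitive to execute
    goal_state: str                 # contact state expected on success
    failure_states: List[str]       # previously observed erroneous contact states


@dataclass
class ManipulationGraph:
    # Each contact state maps to the outgoing primitives learned from demonstrations.
    edges: Dict[str, List[Edge]] = field(default_factory=dict)

    def add_edge(self, start: str, edge: Edge) -> None:
        self.edges.setdefault(start, []).append(edge)

    def run(self, start: str, classify: Callable[[], str]) -> Tuple[bool, str]:
        """Execute primitives from `start`, verifying each sensory goal.

        `classify` stands in for a learned sensory model that labels the
        current contact state from tactile/proprioceptive signals.
        """
        state = start
        while self.edges.get(state):
            edge = self.edges[state][0]      # single demonstrated path, for simplicity
            edge.primitive()
            observed = classify()
            if observed == edge.goal_state:
                state = observed             # goal verified; continue along the graph
            elif observed in edge.failure_states:
                return False, observed       # known failure mode detected
            else:
                return False, observed       # novel contact state; flag for recovery
        return True, state
```

In this reading, verifying a primitive's goal and detecting erroneous or novel contact states both reduce to classifying the post-execution sensory signature and comparing it against the states attached to the graph edge just traversed.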