Imitating Task-oriented Grasps from Human Demonstrations with a Low-DoF Gripper
Authors:
Timothy Patten
Markus Vincze
Keywords: Robotic grasping; task-oriented grasping; learning from demonstration; imitation learning; deep learning
Abstract:
Task-oriented or semantic grasping is important in robotics because it enables objects to be manipulated appropriately and used for their intended purpose. Many objects are designed by humans; we therefore address the problem of learning task-oriented grasps by directly observing human behaviour. A person simply demonstrates the appropriate grasp, which is quick and convenient for any user in the real world. Our approach uses RGB images to track the object and hand pose, then employs a neural network to translate the human hand configuration to a robotic grasp with fewer degrees of freedom. Analysis shows that a variety of low-dimensional representations of the hand enable the mapping to be learned, and that the model generalises better to new demonstrators handling new objects when the training data is augmented. Experiments with a mobile manipulator show that the robot successfully observes grasps and imitates the action on objects in various poses. This is accomplished immediately, without additional learning, and is robust in real-world conditions.
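The abstract does not specify the network architecture or the hand and gripper parameterisations. As a minimal sketch of the kind of mapping it describes, the following assumes a tracked human hand given as 21 3D keypoints (63 values) and a parallel-jaw gripper parameterised by a 7-D configuration (3-D position, 3-D axis-angle orientation, jaw width); the class name, dimensions, and layer sizes are illustrative assumptions, not the authors' model.

```python
import torch
import torch.nn as nn

class HandToGripperNet(nn.Module):
    """Illustrative mapping from a high-DoF human hand pose to a
    low-DoF gripper configuration (dimensions are assumptions).

    Input:  21 hand keypoints (x, y, z) flattened to 63 values.
    Output: 7-D parallel-jaw grasp: position (3), axis-angle
            orientation (3), and jaw width (1).
    """
    def __init__(self, in_dim: int = 63, out_dim: int = 7, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, hand_pose: torch.Tensor) -> torch.Tensor:
        return self.net(hand_pose)

# Example: one demonstrated hand pose -> predicted gripper configuration.
model = HandToGripperNet()
hand_keypoints = torch.randn(1, 63)   # stand-in for tracked hand keypoints
grasp = model(hand_keypoints)         # shape (1, 7)
print(grasp.shape)
```

Such a regressor would be trained on pairs of tracked hand poses and corresponding gripper configurations from the demonstrations; the abstract's augmentation finding suggests perturbing the hand representations during training to improve generalisation to new demonstrators.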
Pages: 80 to 86
Copyright: Copyright (c) IARIA, 2020
Publication date: September 27, 2020
Published in: ICAS 2020, The Sixteenth International Conference on Autonomic and Autonomous Systems
ISSN: 2308-3913
ISBN: 978-1-61208-787-0
Location: Lisbon, Portugal
Dates: from September 27, 2020 to October 1, 2020