Grasp Transfer based on Self-Aligning Implicit Representations of Local Surfaces

Abstract

Objects we interact with and manipulate often share similar parts, e.g., handles, whose shared functionality lets us transfer our actions flexibly between them. This work addresses the problem of transferring grasp experience or a demonstration to a novel object that shares shape similarities with objects the robot has previously encountered. Existing approaches to this problem are typically restricted to a specific object category or a parametric shape. Our approach, in contrast, transfers grasps associated with implicit models of local shapes shared across object categories. Specifically, we employ a single expert grasp demonstration during training to learn a local implicit surface representation model. At inference time, this model is used to transfer grasps to novel objects by identifying the surfaces that most closely resemble the one on which the expert grasp was demonstrated. Our model is trained entirely in simulation and is evaluated on simulated and real-world objects that are not seen during training. Simulation results show that our method achieves higher spatial precision and grasp accuracy than the baselines. Moreover, our method successfully transfers grasps to unseen object categories, as shown in both simulation and real-world experiments.
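To make the core idea concrete, the sketch below illustrates one way a local implicit surface model and grasp transfer by surface alignment could be set up. It is a minimal PyTorch sketch under assumed design choices: the class and function names (LocalImplicitSurface, transfer_grasp, axis_angle_to_matrix), the signed-distance formulation, and the rigid-alignment optimization are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (PyTorch) of a local implicit surface model used for grasp
# transfer. All names and the alignment procedure are illustrative assumptions.
import torch
import torch.nn as nn


class LocalImplicitSurface(nn.Module):
    """Small MLP mapping a 3D point (expressed in the local grasp frame) plus a
    latent shape code to a signed distance to the local surface patch."""

    def __init__(self, latent_dim: int = 32, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, points: torch.Tensor, latent: torch.Tensor) -> torch.Tensor:
        # points: (N, 3); latent: (latent_dim,) broadcast to every query point.
        z = latent.expand(points.shape[0], -1)
        return self.net(torch.cat([points, z], dim=-1)).squeeze(-1)


def axis_angle_to_matrix(w: torch.Tensor) -> torch.Tensor:
    """Rodrigues' formula: rotation matrix from an axis-angle vector."""
    theta = w.norm() + 1e-8
    k = w / theta
    zero = torch.zeros((), dtype=w.dtype)
    K = torch.stack([
        torch.stack([zero, -k[2], k[1]]),
        torch.stack([k[2], zero, -k[0]]),
        torch.stack([-k[1], k[0], zero]),
    ])
    return torch.eye(3) + torch.sin(theta) * K + (1 - torch.cos(theta)) * (K @ K)


def transfer_grasp(model, expert_latent, novel_points, n_steps=200, lr=1e-2):
    """Align a local point cloud from a novel object to the expert's surface
    patch by optimizing a rigid transform (translation + axis-angle rotation,
    a simplification) that drives the implicit-surface residual to zero. The
    recovered transform maps the demonstrated grasp pose onto the novel object."""
    t = torch.zeros(3, requires_grad=True)
    w = torch.zeros(3, requires_grad=True)
    opt = torch.optim.Adam([t, w], lr=lr)
    for _ in range(n_steps):
        opt.zero_grad()
        R = axis_angle_to_matrix(w)
        aligned = novel_points @ R.T + t
        # Points on the novel surface should lie on the zero level set of the
        # expert's local implicit model once correctly aligned.
        sdf = model(aligned, expert_latent)
        loss = (sdf ** 2).mean()
        loss.backward()
        opt.step()
    return t.detach(), w.detach()
```

Under these assumptions, the recovered rotation and translation place the novel object's local surface in the expert's grasp frame, so the demonstrated gripper pose can be mapped back onto the novel object by the inverse transform.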

Publication
IEEE Robotics and Automation Letters (RA-L)
Yasemin Bekiroğlu
Senior Research Fellow (08/2021 - 07/2023)