Object and relation centric representations for push effect prediction
Journal article, 2024

Pushing is an essential non-prehensile manipulation skill used for tasks ranging from pre-grasp manipulation to scene rearrangement and reasoning about object relations in the scene, and thus pushing actions have been widely studied in robotics. Effective use of pushing actions often requires an understanding of the dynamics of the manipulated objects and adaptation to discrepancies between prediction and reality. For this reason, effect prediction and parameter estimation with pushing actions have been heavily investigated in the literature. However, current approaches are limited: they either model systems with a fixed number of objects or use image-based representations whose outputs are hard to interpret and quickly accumulate errors. In this paper, we propose a graph neural network based framework for effect prediction and parameter estimation of pushing actions, modeling object relations based on contacts or articulations. Our framework is validated in both real and simulated environments containing multi-part objects of different shapes connected via different types of joints, as well as objects with different masses, and it outperforms image-based representations on physics prediction. Our approach enables the robot to predict the effect of a pushing action and to adapt its predictions as it observes the scene. It can also be used for tool manipulation with never-before-seen tools. Further, we demonstrate 6D effect prediction for the lever-up action in the context of robot-based hard-disk disassembly.
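The abstract's core idea, representing a pushing scene as a graph whose nodes are object parts and whose edges are contacts or articulations, can be illustrated with a minimal sketch. The code below is purely hypothetical and is not the authors' implementation: the node features, edge types, and coefficients are invented for illustration, and a single hand-rolled message-passing step stands in for a learned graph neural network.

```python
# Illustrative sketch (NOT the paper's implementation): a pushing scene
# as a graph with object parts as nodes and contacts/joints as edges.
# One message-passing step propagates the push's motion feature from
# the pushed part to connected parts, weighted by edge type.

def message_passing_step(nodes, edges):
    """One propagation step: each node aggregates its neighbours'
    features, weighted by a simple per-edge-type coefficient."""
    # Hypothetical coefficients: a fixed joint transmits motion fully,
    # a revolute joint partially, a mere contact only weakly.
    coeff = {"fixed": 1.0, "revolute": 0.5, "contact": 0.2}
    updated = {}
    for name, feat in nodes.items():
        agg = list(feat)  # start from the node's own feature
        for src, dst, etype in edges:
            if dst == name:  # incoming message from neighbour `src`
                w = coeff[etype]
                agg = [a + w * m for a, m in zip(agg, nodes[src])]
        updated[name] = agg
    return updated

# Two-part object: the robot pushes part A; part B is rigidly attached.
nodes = {"A": [1.0, 0.0], "B": [0.0, 0.0]}  # 2-D push-velocity features
edges = [("A", "B", "fixed")]               # B attached to A via a fixed joint

print(message_passing_step(nodes, edges))
# B inherits A's motion through the fixed joint; A keeps its own feature
```

Because the graph is built per scene, the same propagation rule applies regardless of how many parts the objects have, which is the advantage over fixed-size or image-based state representations highlighted in the abstract.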

Effect prediction

Interactive perception

Articulation prediction

Graph neural networks

Push manipulation

Parameter estimation

Authors

Ahmet Ercan Tekden

Chalmers University of Technology, Electrical Engineering, Systems and Control

Aykut Erdem

Koç University

Erkut Erdem

Hacettepe University

Tamim Asfour

Karlsruhe Institute of Technology (KIT)

Emre Ugur

Bogazici University

Robotics and Autonomous Systems

0921-8890 (ISSN)

Vol. 174, Article 104632

Subject Categories

Robotics

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1016/j.robot.2024.104632

More information

Latest update

3/26/2024