Learning Predictive State Representation for In-Hand Manipulation
Paper in proceedings, 2015

We study the use of Predictive State Representation (PSR) for modeling an in-hand manipulation task through interaction with the environment. We extend the original PSR model to the new domain of in-hand manipulation and address the problem of partial observability by introducing new kernel-based features that integrate both actions and observations. The model is learned directly from haptic data and is used to plan a series of actions that rotate the object in the hand to a specific configuration by pushing it against a table. Further, we analyze the model's belief states using additional visual data and enable planning of action sequences when the observations are ambiguous. We show that the learned representation is geometrically meaningful by embedding labeled action-observation traces. Suitability for planning is demonstrated by a post-grasp manipulation example that changes the object state to multiple specified target configurations.
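As a rough illustration of the kind of kernel-based features over action-observation traces mentioned in the abstract, the Python sketch below builds synthetic traces of one-hot actions paired with noisy stand-in "haptic" observations, compares traces with a Gaussian kernel, and embeds them in a low-dimensional space via an SVD of the kernel matrix. All names, parameters, and the synthetic data are illustrative assumptions; this is not the paper's actual feature construction or PSR learning procedure.

import numpy as np

rng = np.random.default_rng(0)

def make_trace(length, n_act=3, n_obs=4):
    # Synthetic trace: each step concatenates a one-hot action with a
    # noisy observation vector (a stand-in for haptic readings).
    steps = []
    for _ in range(length):
        a = np.eye(n_act)[rng.integers(n_act)]
        o = rng.normal(size=n_obs)
        steps.append(np.concatenate([a, o]))
    return np.concatenate(steps)

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Fixed-length action-observation traces.
traces = np.stack([make_trace(length=5) for _ in range(200)])
K = rbf_kernel(traces, traces)

# Low-rank embedding of the kernel matrix: each trace gets a compact
# coordinate, loosely analogous to embedding labeled traces for inspection.
U, S, _ = np.linalg.svd(K, full_matrices=False)
embedding = U[:, :3] * S[:3]
print(embedding.shape)  # (200, 3)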

Keywords

Planning

Training

Grippers

History

Robot sensing systems

Kernel

Authors

Johannes Stork

Royal Institute of Technology (KTH)

Carl Henrik Ek

University of Cambridge

Yasemin Bekiroglu

Chalmers, Signals and Systems, Systems and control, Automatic Control

Danica Kragic

Royal Institute of Technology (KTH)

Proceedings - IEEE International Conference on Robotics and Automation

1050-4729 (ISSN)

IEEE International Conference on Robotics and Automation
Seattle, USA

Subject Categories

Robotics

Computer Science
