Learning Predictive State Representation for In-Hand Manipulation
Paper in proceedings, 2015

We study the use of Predictive State Representation (PSR) for modeling an in-hand manipulation task through interaction with the environment. We extend the original PSR model to the new domain of in-hand manipulation and address the problem of partial observability by introducing new kernel-based features that integrate both actions and observations. The model is learned directly from haptic data and is used to plan a series of actions that rotate the object in the hand to a specific configuration by pushing it against a table. Further, we analyze the model's belief states using additional visual data and enable planning of action sequences when the observations are ambiguous. We show that the learned representation is geometrically meaningful by embedding labeled action-observation traces. Suitability for planning is demonstrated by a post-grasp manipulation example that changes the object state to multiple specified target configurations.
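
The abstract outlines a pipeline in which predictive states are estimated from action-observation traces and then used for filtering and planning. As a rough illustration of the general PSR learning recipe (not the paper's method), the sketch below learns a transformed PSR with standard SVD-based spectral estimation, assuming discrete action-observation symbols, one-step indicator tests and histories in place of the paper's kernel-based features, and synthetic Markov-chain data in place of the haptic recordings; all function names and parameters are illustrative.

```python
# Minimal sketch of spectral PSR learning from action-observation traces.
# Assumptions (not from the paper): discrete symbols z_t encoding joint
# action-observation pairs, one-step indicator tests and histories instead of
# the paper's kernel-based features, and a small synthetic Markov-chain data
# set in place of the haptic recordings. All names here are illustrative.
import numpy as np

def learn_psr(traces, n_symbols, rank):
    """Estimate a transformed PSR (b1, binf, {Bz}) from symbol sequences."""
    P1 = np.zeros(n_symbols)                             # Pr(z_t)
    P21 = np.zeros((n_symbols, n_symbols))               # Pr(z_{t+1}, z_t)
    P3z1 = np.zeros((n_symbols, n_symbols, n_symbols))   # Pr(z_{t+2}, z_{t+1}=z, z_t)
    for tr in traces:
        for t in range(len(tr) - 2):
            z1, z2, z3 = tr[t], tr[t + 1], tr[t + 2]
            P1[z1] += 1
            P21[z2, z1] += 1
            P3z1[z2][z3, z1] += 1
    P1, P21, P3z1 = P1 / P1.sum(), P21 / P21.sum(), P3z1 / P3z1.sum()

    # Low-dimensional predictive-state subspace from an SVD of the
    # test-history co-occurrence matrix.
    U = np.linalg.svd(P21)[0][:, :rank]
    pinv = np.linalg.pinv(U.T @ P21)

    b1 = U.T @ P1                                        # initial predictive state
    binf = np.linalg.pinv(P21.T @ U) @ P1                # normalization vector
    Bz = [U.T @ P3z1[z] @ pinv for z in range(n_symbols)]  # one operator per symbol
    return b1, binf, Bz

def filter_step(b, z, binf, Bz):
    """Update the predictive state after executing/observing symbol z."""
    b = Bz[z] @ b
    return b / (binf @ b)

def predict_next(b, binf, Bz):
    """Approximate probability of each possible next symbol."""
    return np.array([binf @ (B @ b) for B in Bz])

# Demo on synthetic traces from a 4-symbol Markov chain (illustrative only).
rng = np.random.default_rng(0)
T = np.array([[0.7, 0.2, 0.1, 0.0],
              [0.1, 0.7, 0.2, 0.0],
              [0.0, 0.1, 0.7, 0.2],
              [0.2, 0.0, 0.1, 0.7]])

def sample_trace(length=60):
    z, out = 0, []
    for _ in range(length):
        z = rng.choice(4, p=T[z])
        out.append(z)
    return out

traces = [sample_trace() for _ in range(300)]
b1, binf, Bz = learn_psr(traces, n_symbols=4, rank=4)
b = b1
for z in traces[0][:5]:
    print(np.round(predict_next(b, binf, Bz), 3))
    b = filter_step(b, z, binf, Bz)
```

Replacing the indicator tests and histories with kernel-based features of action-observation sequences, and the synthetic data with haptic traces, is where the paper departs from this generic recipe; that extension is not reproduced here.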

Keywords

Training

Grippers

Planning

Robot sensing systems

Kernel

History

Authors

Johannes Stork

Royal Institute of Technology (KTH)

Carl Henrik Ek

Royal Institute of Technology (KTH)

Yasemin Bekiroglu

Royal Institute of Technology (KTH)

Danica Kragic

Royal Institute of Technology (KTH)

Proceedings - IEEE International Conference on Robotics and Automation

1050-4729 (ISSN)

3207-3214
978-1-4799-6923-4 (ISBN)

IEEE International Conference on Robotics and Automation
Seattle, USA, 2015

Subject Categories

Robotics

Computer Science

DOI

10.1109/ICRA.2015.7139641
