Learning Tactile Characterizations of Object- and Pose-Specific Grasps
Paper in proceedings, 2011

Our aim is to predict the stability of a grasp from the perceptions available to a robot before it attempts to lift up and transport an object. The percepts we consider consist of the tactile imprints and the object-gripper configuration recorded before and up to the moment the robot's manipulator is fully closed around the object. Our robot is equipped with multiple tactile sensing arrays and is able to track the pose of an object while a grasp is applied. We present a kernel-logistic-regression model of pose- and touch-conditional grasp success probability. The model is trained on grasp data collected by letting the robot experience the effect of teacher-suggested grasps on its tactile and visual signals, and verify which of those grasps can be used to rigidly control the object. We consider models defined on several subspaces of our input data, e.g., on tactile perceptions or pose information alone. Our experiment demonstrates that joint tactile and pose-based perceptions carry valuable grasp-related information: models trained on both hand poses and tactile data perform better than models trained on a single perceptual input.
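
The model described above is a kernel logistic regression over joint tactile and pose percepts. As a rough illustration of the idea, the sketch below (in Python with scikit-learn; not the authors' implementation, and all feature names and dimensions are illustrative assumptions) fits a logistic model on an RBF Gram matrix, which by the representer theorem behaves as a kernel logistic regression, and outputs a grasp success probability.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)

# Hypothetical training set: for each of n recorded grasps, a tactile
# imprint flattened to 64 taxel pressures and a 6-D object-gripper pose.
n = 200
X_tactile = rng.random((n, 64))      # tactile sensing arrays (assumed size)
X_pose = rng.random((n, 6))          # object-gripper configuration
X = np.hstack([X_tactile, X_pose])   # joint percept
y = rng.integers(0, 2, size=n)       # 1 = object rigidly controlled, 0 = not

# Gram matrix of an RBF kernel over the joint percepts. Fitting a linear
# logistic model on K is kernel logistic regression in its dual form
# (the L2 penalty on the coefficients is a common approximation of the
# RKHS-norm penalty).
gamma = 1.0 / X.shape[1]
K = rbf_kernel(X, gamma=gamma)
clf = LogisticRegression(C=1.0, max_iter=1000).fit(K, y)

# Success probability of a previously unseen grasp percept.
x_new = rng.random((1, X.shape[1]))
p = clf.predict_proba(rbf_kernel(x_new, X, gamma=gamma))[0, 1]
print(f"predicted grasp success probability: {p:.2f}")

The subspace models compared in the paper (tactile-only or pose-only) would correspond to building the kernel on X_tactile or X_pose alone.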

Keywords

Stability analysis
Visualization
Tactile sensors
Grasping
Kernel

Authors

Yasemin Bekiroglu

Royal Institute of Technology (KTH)

Renaud Detry

Royal Institute of Technology (KTH)

Danica Kragic

Royal Institute of Technology (KTH)

Published in

IEEE/RSJ International Conference on Intelligent Robots and Systems
2153-0858 (ISSN), 2153-0866 (eISSN)
1554-1560 (pages)
9781612844541 (ISBN)

Conference

IEEE/RSJ International Conference on Intelligent Robots and Systems
San Francisco, CA, USA

Subject Categories

Robotics

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1109/IROS.2011.6094878
