Learning Tactile Characterizations Of Object- And Pose-specific Grasps
Paper in proceedings, 2011

Our aim is to predict the stability of a grasp from the percepts available to a robot before it attempts to lift and transport an object. The percepts we consider consist of the tactile imprints and the object-gripper configuration recorded before and while the robot's manipulator closes around an object. Our robot is equipped with multiple tactile sensing arrays and is able to track the pose of an object during the application of a grasp. We present a kernel-logistic-regression model of pose- and touch-conditional grasp success probability, which we train on grasp data collected by letting the robot experience the effect on tactile and visual signals of grasps suggested by a teacher, and letting the robot verify which grasps can be used to rigidly control the object. We consider models defined on several subspaces of our input data, e.g., using tactile perceptions or pose information only. Our experiment demonstrates that joint tactile and pose-based perceptions carry valuable grasp-related information, as models trained on both hand poses and tactile parameters perform better than models trained on a single perceptual input.
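The model class named in the abstract, kernel logistic regression mapping joint tactile and pose features to a grasp success probability, can be sketched as follows. This is a minimal illustration, not the paper's training setup: the RBF kernel, the plain gradient-descent fit, and the synthetic feature vectors (stand-ins for tactile descriptors concatenated with gripper-pose parameters) are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gram matrix of the RBF kernel between the rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def fit_klr(X, y, gamma=1.0, lam=1e-2, lr=0.1, iters=500):
    """Fit kernel logistic regression by gradient descent on dual weights."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(len(y))
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-K @ alpha))             # P(success | percepts)
        grad = K @ (p - y) / len(y) + lam * (K @ alpha)  # log-loss + L2 penalty
        alpha -= lr * grad
    return alpha

def predict_success_proba(X_train, alpha, X_new, gamma=1.0):
    """Predicted grasp-success probability for new percepts."""
    return 1.0 / (1.0 + np.exp(-rbf_kernel(X_new, X_train, gamma) @ alpha))

# Synthetic stand-in data: each row concatenates hypothetical tactile
# descriptors and gripper-pose parameters; labels mark grasp success.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.3, (20, 4)),   # failed grasps
               rng.normal(+1.0, 0.3, (20, 4))])  # successful grasps
y = np.concatenate([np.zeros(20), np.ones(20)])

alpha = fit_klr(X, y)
proba = predict_success_proba(X, alpha, X)
```

Restricting the feature columns before fitting gives the tactile-only or pose-only variants that the abstract compares against the joint model.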

Stability analysis

Visualization

Tactile sensors

Grasping

Kernel

Authors

Yasemin Bekiroglu

Kungliga Tekniska Högskolan (KTH)

Renaud Detry

Kungliga Tekniska Högskolan (KTH)

Danica Kragic

Kungliga Tekniska Högskolan (KTH)

IEEE/RSJ International Conference on Intelligent Robots and Systems

2153-0858 (ISSN) 2153-0866 (eISSN)

1554-1560
9781612844541 (ISBN)

IEEE/RSJ International Conference on Intelligent Robots and Systems
San Francisco

Subject categories

Robotics and automation

Computer vision and robotics (autonomous systems)

DOI

10.1109/IROS.2011.6094878

More information

Last updated

2022-03-14