Joint Observation of Object Pose and Tactile Imprints for Online Grasp Stability Assessment
Conference poster, 2011
This paper studies the viability of concurrent object pose tracking and tactile sensing for assessing grasp stability on a physical robotic platform. We present a kernel logistic regression model of pose- and touch-conditional grasp success probability. Models are trained on grasp data consisting of (1) the pose of the gripper relative to the object, (2) a tactile description of the contacts between the object and the fully closed gripper, and (3) a binary label of grasp feasibility, which indicates whether the grasp can be used to rigidly control the object. The data are collected by executing human-demonstrated grasps on a robotic platform composed of an industrial arm, a three-finger gripper equipped with tactile sensing arrays, and a vision-based object pose tracking system. The robot is able to track the pose of an object while grasping it, and it can acquire tactile imprints of the grasp via pressure sensor arrays mounted on its gripper's fingers. We consider models defined on several subspaces of our input data, using tactile perceptions or gripper poses alone. Models are optimized and evaluated with f-fold cross-validation. Our preliminary results show that stability assessments based on both tactile and pose data achieve better classification rates than assessments based on tactile data alone.
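The abstract does not spell out the model's implementation, but the approach can be illustrated with a minimal sketch of kernel logistic regression trained by gradient descent and evaluated with fold-based cross-validation. All function names, hyperparameters (RBF bandwidth, regularization, learning rate), and the synthetic "grasp feature" data below are illustrative assumptions, not the authors' code; real inputs would be gripper-pose and tactile-imprint features with binary stability labels.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances -> RBF Gram matrix
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_klr(X, y, gamma=1.0, lam=1e-3, steps=500, lr=0.5):
    """Kernel logistic regression: fit dual weights alpha by gradient
    descent on the L2-regularized negative log-likelihood."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(len(X))
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(K @ alpha)))           # P(stable | features)
        grad = K @ (p - y) / len(X) + lam * (K @ alpha)  # log-loss gradient + ridge term
        alpha -= lr * grad
    return alpha

def predict(alpha, X_train, X_test, gamma=1.0):
    # Success probability for unseen grasps
    return 1.0 / (1.0 + np.exp(-(rbf_kernel(X_test, X_train, gamma) @ alpha)))

# Synthetic stand-in for pose/tactile features: stable grasps cluster
# near +1, unstable grasps near -1 (purely illustrative data).
rng = np.random.default_rng(0)
n = 40
X = np.vstack([rng.normal(1.0, 0.3, (n, 2)), rng.normal(-1.0, 0.3, (n, 2))])
y = np.concatenate([np.ones(n), np.zeros(n)])
perm = rng.permutation(len(X))
X, y = X[perm], y[perm]

# f-fold cross-validation of classification accuracy
folds = 5
idx = np.array_split(np.arange(len(X)), folds)
accs = []
for i in range(folds):
    test = idx[i]
    train = np.concatenate([idx[j] for j in range(folds) if j != i])
    alpha = train_klr(X[train], y[train])
    pred = predict(alpha, X[train], X[test]) > 0.5
    accs.append(float((pred == y[test].astype(bool)).mean()))
print("mean CV accuracy:", round(float(np.mean(accs)), 2))
```

The same cross-validation loop would let one compare models restricted to pose-only or tactile-only feature subspaces, as the paper does, simply by slicing the corresponding columns out of `X`.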
grasp planning, grasp stability, kernel logistic regression