Simultaneous Tactile Exploration and Grasp Refinement for Unknown Objects
Journal article, 2021
This paper addresses the problem of simultaneously exploring the shape of a partially unknown object, using tactile sensors on robotic fingers, while also improving finger placement to optimise grasp stability. In many situations, a robot has only a partial camera view of the near side of an observed object, while the far side remains occluded. We show how an initial grasp attempt, based on an initial guess of the overall object shape, yields tactile glances of the far side of the object that enable the shape estimate, and consequently the successive grasps, to be improved. We propose a grasp exploration approach using a probabilistic representation of shape based on Gaussian Process Implicit Surfaces. This representation enables initial partial vision data to be augmented with additional data from successive tactile glances, and is combined with a probabilistic estimate of grasp quality to refine grasp configurations. When choosing the next set of finger placements, a bi-objective optimisation method is used to jointly reduce shape uncertainty and maximise grasp quality during successive grasp attempts. Experimental results show that the proposed approach yields stable grasp configurations more efficiently than baseline methods, while also yielding improved estimates of the shapes of grasped objects.
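The core idea can be sketched in code: fit a Gaussian Process implicit surface to partial contact/vision data (observed points labelled 0 on the surface, with interior/exterior anchor points), then score candidate finger placements with a weighted sum of a grasp-quality term and the GP's predictive uncertainty. This is a minimal illustrative sketch, not the paper's implementation; the 2-D circular object, RBF kernel hyperparameters, the antipodal quality proxy, and the trade-off weight `w` are all assumptions made here for demonstration.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.4):
    # Squared-exponential kernel between two point sets (n, d) x (m, d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

# Observed contact points on a unit-circle "object" (value 0 = on surface).
# Only the near side is seen; one interior (-1) and one exterior (+1)
# anchor point pin down the sign of the implicit function.
angles = np.linspace(0, np.pi, 6)
X = np.c_[np.cos(angles), np.sin(angles)]
y = np.zeros(len(X))
X = np.vstack([X, [0.0, 0.0], [2.0, 2.0]])
y = np.r_[y, -1.0, 1.0]

# Standard GP regression via Cholesky factorisation.
K = rbf_kernel(X, X) + 1e-4 * np.eye(len(X))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

def gpis_predict(Xq):
    """Posterior mean (implicit-surface value) and variance at query points."""
    Ks = rbf_kernel(Xq, X)
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - (v ** 2).sum(0)
    return mean, np.maximum(var, 0.0)

# Candidate finger placements around the full circle.
cand_angles = np.linspace(0, 2 * np.pi, 60, endpoint=False)
C = np.c_[np.cos(cand_angles), np.sin(cand_angles)]
mean, var = gpis_predict(C)

# Toy grasp-quality proxy: prefer placements opposite the touched near
# side (a crude antipodal heuristic; real quality metrics differ).
quality = np.clip(-np.sin(cand_angles), 0.0, 1.0)

# Bi-objective score: trade off expected grasp quality against shape
# uncertainty, so each attempt both explores and refines the grasp.
w = 0.5  # assumed trade-off weight
score = w * quality + (1 - w) * var / var.max()
best = C[np.argmax(score)]
print("next finger placement:", np.round(best, 2))
```

Because the uncertainty term is largest on the unobserved far side, the selected placement probes exactly where a tactile glance is most informative, which is the mechanism the abstract describes for jointly improving the shape estimate and the grasp.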
Force and tactile sensing
Gaussian processes
Bayesian optimisation
Perception for grasping and manipulation
Grasping