Simultaneous Tactile Exploration and Grasp Refinement for Unknown Objects
Journal article, 2021

This letter addresses the problem of simultaneously exploring an unknown object to model its shape, using tactile sensors on robotic fingers, while also improving finger placement to optimise grasp stability. In many situations, a robot has only a partial camera view of the near side of an observed object, while the far side remains occluded. We show how an initial grasp attempt, based on an initial guess of the overall object shape, yields tactile glances of the far side of the object that enable the shape estimate, and consequently the successive grasps, to be improved. We propose a grasp exploration approach using a probabilistic representation of shape based on Gaussian Process Implicit Surfaces. This representation enables the initial partial vision data to be augmented with additional data from successive tactile glances, and is combined with a probabilistic estimate of grasp quality to refine grasp configurations. When choosing the next set of finger placements, a bi-objective optimisation method is used to jointly maximise grasp quality and improve the shape representation across successive grasp attempts. Experimental results show that the proposed approach yields stable grasp configurations more efficiently than a baseline method, while also yielding an improved shape estimate of the grasped object.
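The following is a minimal Python sketch, not the authors' implementation, of the two ingredients the abstract describes: a Gaussian Process Implicit Surface (GPIS) fused from partial vision data and tactile contact points, and a scalarised bi-objective score that trades off a grasp-quality proxy against shape-uncertainty reduction. All class names, placeholder data, and the weighting scheme are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel


class GPISModel:
    """Implicit surface f(x) = 0 learned from labelled 3D points."""

    def __init__(self):
        kernel = RBF(length_scale=0.05) + WhiteKernel(noise_level=1e-4)
        self.gp = GaussianProcessRegressor(kernel=kernel)
        self.X = np.empty((0, 3))
        self.y = np.empty(0)

    def add_points(self, pts, labels):
        """labels follow the usual GPIS convention:
        0 on the surface, -1 inside the object, +1 outside."""
        self.X = np.vstack([self.X, pts])
        self.y = np.hstack([self.y, labels])
        self.gp.fit(self.X, self.y)

    def score_fingers(self, finger_pts, w=0.5):
        """Bi-objective score for candidate finger placements: prefer points
        the model believes lie on the surface (|mean| small, a crude stand-in
        for grasp feasibility/quality) and points with high predictive
        variance (informative for shape exploration)."""
        mean, std = self.gp.predict(finger_pts, return_std=True)
        quality_proxy = -np.abs(mean).mean()
        information_proxy = std.mean()
        return w * quality_proxy + (1.0 - w) * information_proxy


# Usage: initialise from the partial camera view, then fuse tactile glances.
gpis = GPISModel()
vision_pts = np.random.rand(200, 3) * 0.1            # placeholder point cloud
inside = vision_pts.mean(axis=0, keepdims=True)      # one interior anchor
outside = inside + np.array([[0.0, 0.0, 0.3]])       # one exterior anchor
gpis.add_points(np.vstack([vision_pts, inside, outside]),
                np.hstack([np.zeros(200), [-1.0], [1.0]]))

tactile_contacts = np.random.rand(3, 3) * 0.1        # placeholder far-side contacts
gpis.add_points(tactile_contacts, np.zeros(3))       # contacts lie on the surface

candidate_fingertips = np.random.rand(3, 3) * 0.1
print(gpis.score_fingers(candidate_fingertips))

A candidate grasp would be chosen by evaluating this score over a set of reachable fingertip configurations; the paper's actual grasp-quality metric and optimisation strategy differ from this simplified scalarisation.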

Keywords

Force and tactile sensing

Perception for grasping and manipulation

Grasping

Authors

Cristiana De Farias

University of Birmingham

Naresh Marturi

University of Birmingham

Rustam Stolkin

University of Birmingham

Yasemin Bekiroglu

Chalmers, Electrical Engineering, Systems and Control

University College London (UCL)

IEEE Robotics and Automation Letters

2377-3766 (eISSN)

Vol. 6, Issue 2, pp. 3349-3356, Article no. 9366782

Subject Categories

Aerospace Engineering

Robotics

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1109/LRA.2021.3063074

More information

Latest update

5/4/2021