Enhancing Visual Perception of Shape through Tactile Glances
Paper in proceedings, 2013

Object shape information is an important parameter in robot grasping tasks. However, it may be difficult to obtain accurate models of novel objects due to incomplete and noisy sensory measurements. Moreover, object shape may change through frequent interaction (e.g., cereal boxes). In this paper, we present a probabilistic approach for learning object models from visual and tactile perception through physical interaction with an object. Our robot explores unknown objects by touching them strategically at parts whose shape is uncertain. The robot starts from visual features alone to form an initial hypothesis about the object shape, then gradually adds tactile measurements to refine the object model. Our experiments involve ten objects of varying shapes and sizes in a real setup. The results show that our method can choose a small number of touches to construct object models that closely match the real object shapes, and to determine similarities among the acquired models.
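The touch-selection idea described in the abstract — probe the object where the current shape estimate is most uncertain — can be illustrated with a Gaussian-process sketch. This is a hypothetical minimal example, not the paper's exact model: the kernel, hyperparameters, and the 2D setting are illustrative assumptions, and the next touch is simply the candidate point with the highest predictive variance.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.5, variance=1.0):
    """Squared-exponential kernel between point sets A (n,d) and B (m,d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

def next_touch(observed_pts, candidates, noise=1e-3):
    """Pick the candidate point where the GP shape estimate is most uncertain.

    The GP predictive variance depends only on where measurements were taken,
    not on their values, so the observed surface values are not needed here.
    """
    n = len(observed_pts)
    K = rbf_kernel(observed_pts, observed_pts) + noise * np.eye(n)
    Ks = rbf_kernel(candidates, observed_pts)          # (m, n)
    L = np.linalg.cholesky(K)
    v = np.linalg.solve(L, Ks.T)                       # (n, m)
    var = 1.0 - (v ** 2).sum(axis=0)                   # predictive variance per candidate
    return candidates[np.argmax(var)], var

# Toy setup: a few surface points "seen" by vision, candidates on the object.
rng = np.random.default_rng(0)
obs = rng.uniform(-1, 1, size=(8, 2))     # visually observed surface points
cand = rng.uniform(-1, 1, size=(50, 2))   # candidate touch locations
touch_point, var = next_touch(obs, cand)
```

After each physical touch, the measured contact point would be appended to `obs` and the selection repeated, so uncertainty shrinks around explored regions and the model is refined incrementally.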

Visualization

Stereo vision

Three-dimensional displays

Cameras

Shape

Robot sensing systems

Authors

Mårten Björkman

Kungliga Tekniska Högskolan (KTH)

Yasemin Bekiroglu

Kungliga Tekniska Högskolan (KTH)

Virgile Högman

Kungliga Tekniska Högskolan (KTH)

Danica Kragic

Kungliga Tekniska Högskolan (KTH)

IEEE/RSJ International Conference on Intelligent Robots and Systems

2153-0858 (ISSN) 2153-0866 (eISSN)

3180-3186

Tokyo, Japan

Subject categories

Other Computer and Information Science

Robotics and Automation

Computer Vision and Robotics (Autonomous Systems)

More information

Last updated

2022-03-09