Enhancing Visual Perception of Shape through Tactile Glances
Paper in proceedings, 2013

Object shape information is an important parameter in robot grasping tasks. However, it may be difficult to obtain accurate models of novel objects due to incomplete and noisy sensory measurements. In addition, object shape may change as a result of frequent interaction with the object (e.g., cereal boxes). In this paper, we present a probabilistic approach for learning object models based on visual and tactile perception through physical interaction with an object. Our robot explores unknown objects by touching them strategically at parts whose shape is uncertain. The robot starts by using only visual features to form an initial hypothesis about the object shape, then gradually adds tactile measurements to refine the object model. Our experiments involve ten objects of varying shapes and sizes in a real setup. The results show that our method is capable of choosing a small number of touches to construct object models similar to the real object shapes and to determine similarities among the acquired models.
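The exploration loop described in the abstract (start from a visual shape hypothesis, then repeatedly touch the most uncertain part and refine the model) can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the paper's probabilistic shape model is replaced here by a hypothetical per-cell uncertainty grid over a discretized object surface, and a touch simply shrinks uncertainty in a small neighborhood of the contact point.

```python
import numpy as np

def explore(variance, touch_budget, reduction=0.8, radius=1):
    """Greedy tactile exploration: repeatedly touch the cell with the
    highest shape uncertainty and shrink uncertainty around the contact."""
    var = variance.copy()
    touches = []
    for _ in range(touch_budget):
        # pick the currently most uncertain surface point (toy selection rule)
        r0, c0 = np.unravel_index(np.argmax(var), var.shape)
        touches.append((r0, c0))
        # a tactile glance reduces uncertainty locally around the contact
        for r in range(max(0, r0 - radius), min(var.shape[0], r0 + radius + 1)):
            for c in range(max(0, c0 - radius), min(var.shape[1], c0 + radius + 1)):
                var[r, c] *= (1.0 - reduction)
    return touches, var

# toy uncertainty field standing in for the visual shape hypothesis
rng = np.random.default_rng(0)
field = rng.random((8, 8))
touches, refined = explore(field, touch_budget=5)
```

After a handful of simulated touches, overall uncertainty drops while no cell's uncertainty ever increases, mirroring the idea that a small number of well-chosen touches suffices to refine the model.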

Keywords

Visualization

Stereo vision

Three-dimensional displays

Cameras

Shape

Robot sensing systems

Authors

Mårten Björkman

Royal Institute of Technology (KTH)

Yasemin Bekiroglu

Royal Institute of Technology (KTH)

Virgile Högman

Royal Institute of Technology (KTH)

Danica Kragic

Royal Institute of Technology (KTH)

IEEE/RSJ International Conference on Intelligent Robots and Systems

2153-0858 (ISSN) 2153-0866 (eISSN)

3180-3186

Tokyo, Japan

Subject Categories

Other Computer and Information Science

Robotics

Computer Vision and Robotics (Autonomous Systems)

More information

Latest update

3/9/2022