Dexterous robot assistant for everyday physical object manipulation
Research Project, 2020 –
Although an abundance of basic research has been done on sensor modeling, planning and motion control, building systems that demonstrate human-like grasping and manipulation capabilities in realistic environments remains a major challenge. Existing systems show severe limitations in dealing with novelty, uncertainty and unforeseen situations. A key technical difficulty is how to encode, reason about and overcome the uncertainties inherent both in robotic perception and in the robot's physical interactions with objects and surfaces. This project focuses on designing efficient representations that allow high-level planning for complex grasping and manipulation scenarios involving objects of various sizes, shapes and materials. In addition, inspired by human studies showing efficient use of multi-modal sensing, the project leverages data from different sources, such as vision and tactile sensing. We study how to bridge the gap between analytical and data-driven approaches in a principled way to achieve robustness, high success rates, and computational and data efficiency using real sensory data.
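As a rough illustration of the multi-modal idea (not the project's actual pipeline), the sketch below fuses hypothetical vision and tactile feature vectors by simple concatenation and trains a classifier to predict grasp stability; all feature names, dimensions and data are placeholders chosen for the example.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Placeholder features: a visual object descriptor and a tactile summary
# recorded at grasp time (dimensions are arbitrary, for illustration only).
n_grasps = 200
vision_feat = rng.normal(size=(n_grasps, 16))   # e.g. shape/pose descriptor
tactile_feat = rng.normal(size=(n_grasps, 8))   # e.g. pressure-array summary
labels = rng.integers(0, 2, size=n_grasps)      # 1 = stable grasp, 0 = slip

# Early fusion: concatenate the two modalities into one feature vector.
fused = np.hstack([vision_feat, tactile_feat])

# A simple data-driven predictor of grasp stability from the fused features.
clf = LogisticRegression(max_iter=1000).fit(fused, labels)
print("training accuracy:", clf.score(fused, labels))

In practice the visual and tactile descriptors, the fusion scheme and the predictor would be replaced by the representations developed in the project; the sketch only shows where the two sensing modalities enter a learning-based grasp-stability estimate.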
Participants
Yasemin Bekiroglu (contact)
Chalmers, Electrical Engineering, Systems and control
Funding
Chalmers AI Research Centre (CHAIR)
Funding Chalmers participation during 2021–2023