Data-efficient learning for mobile manipulation tasks using multi-modal data
Research Project, 2026–2031

This project investigates data-efficient learning methods for autonomous mobile manipulation in unstructured, real-world environments, using multi-modal sensory data such as vision, touch, force, and proprioception. It aims to enable mobile robots to perform complex manipulation tasks safely and robustly while adapting to uncertainty, novel objects, and changing environments. The work bridges analytical modeling and data-driven learning to improve generalization, interpretability, and sample efficiency, with a focus on contact-rich tasks. By integrating perception, planning, and control, the project advances continuous skill learning for mobile manipulators operating beyond structured industrial settings.

Participants

Yasemin Bekiroglu (contact)

Chalmers, Electrical Engineering, Systems and control

Francesco Gigante

Chalmers, Electrical Engineering, Systems and control

Nikolce Murgovski

Chalmers, Electrical Engineering

Changfu Zou

Chalmers, Electrical Engineering, Systems and control

Funding

Wallenberg AI, Autonomous Systems and Software Program

Funding Chalmers' participation during 2026–2031

Latest update

1/26/2026