Data-efficient learning for mobile manipulation tasks using multimodal data
Research project, 2026–2031
This project investigates data-efficient learning methods for autonomous mobile manipulation in unstructured, real-world environments using multimodal sensory data such as vision, touch, force, and proprioception. It aims to enable mobile robots to safely and robustly perform complex manipulation tasks while adapting to uncertainty, novel objects, and changing environments. The work bridges analytical modeling and data-driven learning to improve generalization, interpretability, and sample efficiency, with a focus on contact-rich tasks. Through integrated perception, planning, and control, the project advances continuous skill learning for mobile manipulators operating beyond structured industrial settings.
Participants
Yasemin Bekiroglu (contact)
Chalmers, Electrical Engineering, Systems and Control
Francesco Gigante
Chalmers, Electrical Engineering, Systems and Control
Nikolce Murgovski
Chalmers, Electrical Engineering
Changfu Zou
Chalmers, Electrical Engineering, Systems and Control
Funding
Wallenberg AI, Autonomous Systems and Software Program
Funds Chalmers' participation during 2026–2031