Augmented Reality interface to verify Robot Learning
Paper in proceedings, 2020

Teaching robots new skills is an important aspect of Human-Robot Collaboration (HRC). One challenge is that robots cannot communicate feedback in the same way humans do. This decreases trust in robots, since it is difficult to judge, before the actual execution, whether the robot has learned the task correctly. In this paper, we introduce an Augmented Reality (AR)-based visualization tool that allows humans to verify the taught behavior before its execution. Our verification interface displays a virtual simulation embedded in the real environment, temporally coupled with a semantic description of the current action. We developed three designs based on different interface/visualization-technology combinations to explore the potential benefits of AR-enhanced simulations over traditional simulation environments such as RViz. We conducted a user study with 18 participants to assess the effectiveness of the proposed visualization tools with respect to error-detection capabilities. One advantage of the AR interfaces is that they provide more realistic feedback than traditional simulations, at a lower cost, since the entire environment does not have to be modeled.


Maximilian Diehl

Chalmers, Electrical Engineering, Systems and Control, Mechatronics

Alexander Plopski

University of Otago

Hirokazu Kato

Nara Institute of Science and Technology

Karinne Ramirez-Amaro

Chalmers, Electrical Engineering, Systems and Control, Mechatronics

29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020

pp. 378-383, 9223502

Virtual, Naples, Italy

Learning & Understanding Human-Centered Robotic Manipulation Strategies

Chalmers AI Research Centre (CHAIR), 2020-01-13 -- 2025-01-14.



Human-Computer Interaction (Interaction Design)

Robotics and Automation


