A Framework for User-Defined Body Gestures to Control a Humanoid Robot
Journal article, 2014

This paper presents a framework that allows users to interact with and navigate a humanoid robot using body gestures. The first part of the paper describes a study to define intuitive gestures for eleven navigational commands, based on an analysis of 385 gestures performed by 35 participants. From the study results, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, and the time performance of the gesture motions. The second part of the paper presents a full-body interaction system for recognizing the user-defined gestures. We evaluate the system by recruiting 22 participants to test the recognition accuracy of the proposed system. The results show that most of the defined gestures can be successfully recognized with a precision of 86-100 % and an accuracy of 73-96 %. We discuss the limitations of the system and present future work improvements.
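The agreement scores mentioned in the abstract are most likely computed with the standard agreement formula used in gesture-elicitation studies (Wobbrock et al.), where proposals for each command are grouped by identical gestures. The Python sketch below illustrates that formula under this assumption; the command and gesture labels are hypothetical and not taken from the paper.

    from collections import Counter

    def agreement_score(proposals):
        """Agreement score for one referent (navigation command):
        A_r = sum over groups of identical proposals P_i of (|P_i| / |P_r|)^2.
        `proposals` is a list of gesture labels, one per participant."""
        total = len(proposals)
        groups = Counter(proposals)  # group identical gesture proposals together
        return sum((count / total) ** 2 for count in groups.values())

    # Hypothetical proposals from 35 participants for a "move forward" command
    proposals = ["point_forward"] * 20 + ["lean_forward"] * 10 + ["step_forward"] * 5
    print(round(agreement_score(proposals), 3))  # -> 0.429

Values closer to 1 indicate that participants converged on the same gesture for a given command; values near 1/|P_r| indicate that nearly every participant proposed a different gesture.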

Keywords

Interface, Robot, Robot navigation, Nao, Gesture, Humanoid robot, User-defined, User-defined gestures

Authors

Mohammad Obaid

Chalmers University of Technology, Applied Information Technology, Interaction Design

F. Kistler

University of Augsburg

M. Häring

University of Augsburg

R. Bühling

University of Augsburg

E. André

University of Augsburg

International Journal of Social Robotics

1875-4791 (ISSN) 1875-4805 (eISSN)

Vol. 6, Issue 3, pp. 383-396

Subject Categories

Interaction Technologies

DOI

10.1007/s12369-014-0233-3
