How would you gesture navigate a drone? A user-centered approach to control a drone
Paper in proceedings, 2016

Gestural interaction with flying drones is on the rise; however, little work has been done to elicit gestural preferences directly from users. In this paper, we present an elicitation study aimed at deriving user-defined gestures for drone navigation. We apply a user-centered approach in which we collected data from 25 participants performing gestural interactions for twelve drone actions, ten of which are navigational. The analysis of the 300 gestures collected from our participants reveals a user-defined set of suitable gestures for controlling a drone. We report results that can be used by software developers, engineers, or designers, including a taxonomy of the user-defined gestures, gestural agreement scores, time performances, and subjective ratings for each action. Finally, we discuss the gestural set with implementation insights and conclude with future directions.

Drone

Quadcopter

User-defined

Gesture

Study

Interaction

Authors

Mohammad Obaid

Uppsala University

F. Kistler

University of Augsburg

Gabriele Kasparaviciute

Chalmers University of Technology, Applied Information Technology, Interaction Design

Asim Evren Yantaç

Koç University

Morten Fjeld

Chalmers University of Technology, Applied Information Technology, Interaction Design

AcademicMindtrek 2016 - Proceedings of the 20th International Academic Mindtrek Conference

113-121

Subject Categories

Human-Computer Interaction

DOI

10.1145/2994310.2994348

ISBN

978-1-4503-4367-1

More information

Latest update

3/5/2018