Human Movement Direction Prediction using Virtual Reality and Eye Tracking
Paper in proceedings, 2021

One way of potentially improving the use of robots in a collaborative environment is through prediction of human intention, which would give the robots insight into how the operators are about to behave. An important part of human behaviour is arm movement, and this paper presents a method to predict arm movement based on the operator's eye gaze. A test scenario has been designed in order to gather coordinate-based hand movement data in a virtual reality environment. The results show that the eye gaze data can successfully be used to train an artificial neural network that is able to predict the direction of movement ~500 ms ahead of time.
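As an illustrative sketch of the idea in the abstract (not the authors' implementation, whose network architecture and data format are not given here): a short window of gaze coordinates can be flattened into a feature vector and fed to a small neural network that classifies the upcoming movement direction. The data below is synthetic; in the paper the gaze and hand coordinates come from the VR test scenario.

```python
# Illustrative sketch: classify movement direction from a gaze window.
# Synthetic data stands in for the paper's VR eye-tracking recordings.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N, WINDOW = 1000, 10   # number of samples, gaze points per window
DIRECTIONS = 4         # assumed number of possible movement directions

# Synthetic gaze windows: (x, y) points drifting toward the target direction,
# plus measurement noise.
labels = rng.integers(0, DIRECTIONS, size=N)
t = np.linspace(0.0, 1.0, WINDOW)
gx = np.cos(labels * 2 * np.pi / DIRECTIONS)[:, None] * t \
     + rng.normal(0.0, 0.2, (N, WINDOW))
gy = np.sin(labels * 2 * np.pi / DIRECTIONS)[:, None] * t \
     + rng.normal(0.0, 0.2, (N, WINDOW))
X = np.hstack([gx, gy])  # flatten each window into one feature vector

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"direction-prediction accuracy: {acc:.2f}")
```

Predicting ~500 ms ahead would correspond to labelling each gaze window with the movement direction observed 500 ms later in the recording, rather than the synthetic labels used here.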

Virtual reality (VR)

collaborative robots

human intention prediction

movement prediction

eye tracking

Authors

Julius Pettersson

Chalmers, Electrical Engineering, Systems and control, Automation

Petter Falkman

Chalmers, Electrical Engineering, Systems and control, Automation

IEEE International Conference on Industrial Technology (ICIT)

Vol. 2021-March, pp. 889–894, article no. 9453581

2021 22nd IEEE International Conference on Industrial Technology (ICIT), Valencia, Spain

Areas of Advance

Production

Subject Categories

Interaction Technologies

Human Computer Interaction

Robotics

DOI

10.1109/ICIT46573.2021.9453581

Latest update

11/17/2021