Comparison of LSTM, Transformers, and MLP-mixer neural networks for gaze based human intention prediction
Journal article, 2023

Collaborative robots have gained popularity in industry, providing flexibility and increased productivity for complex tasks. However, their ability to interact with humans and adapt to their behavior is still limited. Prediction of human movement intentions is one way to improve the robots' adaptation. This paper investigates the performance of Transformer- and MLP-Mixer-based neural networks for predicting the intended human arm movement direction, based on gaze data obtained in a virtual reality environment, and compares the results to an LSTM network. The comparison evaluates the networks on several accuracy metrics, time ahead of movement completion, and execution time. The paper shows that there exist several network configurations and architectures that achieve comparable accuracy scores. The best-performing Transformer encoder presented in this paper achieved an accuracy of 82.74%, for predictions with high certainty, on continuous data and correctly classifies 80.06% of the movements at least once. In 99% of the cases, the movements are correctly predicted the first time before the hand reaches the target, and in 75% of the cases more than 19% ahead of movement completion. The results show that there are multiple ways to utilize neural networks to perform gaze-based arm movement intention prediction, and it is a promising step toward enabling efficient human-robot collaboration.
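The classification idea summarized in the abstract (a window of gaze samples fed to a Transformer encoder that outputs an intended movement direction, with predictions accepted only at high certainty) can be sketched roughly as below. This is a minimal, untrained single-head self-attention illustration in NumPy; the dimensions, mean-pooling readout, and 0.9 certainty threshold are illustrative assumptions, not the authors' configuration:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Assumed dimensions: a window of T gaze samples, each with d features
# (e.g. gaze direction components); C candidate movement directions.
T, d, C = 50, 3, 4
X = rng.standard_normal((T, d))           # one gaze time-series window

# Randomly initialised single-head self-attention weights (untrained sketch).
d_model = 16
W_in = rng.standard_normal((d, d_model))
Wq = rng.standard_normal((d_model, d_model))
Wk = rng.standard_normal((d_model, d_model))
Wv = rng.standard_normal((d_model, d_model))
W_out = rng.standard_normal((d_model, C))

H = X @ W_in                              # embed gaze samples
Q, K, V = H @ Wq, H @ Wk, H @ Wv
A = softmax(Q @ K.T / np.sqrt(d_model))   # attention over time steps
Z = A @ V                                 # context-mixed representation
logits = Z.mean(axis=0) @ W_out           # mean-pool over time, classify
probs = softmax(logits)                   # class probabilities
pred = int(np.argmax(probs))              # predicted movement direction

# Only act on high-certainty predictions (threshold is an assumption,
# standing in for the paper's "high certainty" criterion).
confident = probs.max() > 0.9
```

A trained model would learn the weight matrices from labeled gaze windows; the point here is only the data flow from a gaze window to a direction class with a certainty gate.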

Keywords: human intention prediction, eye tracking, Transformers, time series prediction, collaborative robots

Authors

Julius Pettersson

Chalmers, Electrical Engineering, Systems and control

Petter Falkman

Chalmers, Electrical Engineering, Systems and control

Frontiers in Neurorobotics

1662-5218 (eISSN)

Vol. 17, Article 1157957

Subject Categories

Computer Science

Computer Vision and Robotics (Autonomous Systems)

DOI

10.3389/fnbot.2023.1157957

PubMed

37304663

More information

Latest update

6/21/2023