Human Movement Direction Classification using Virtual Reality and Eye Tracking
Paper in proceedings, 2020

Collaborative robots are becoming increasingly popular in industry, providing flexibility and increased productivity for complex tasks. However, these robots are not yet truly interactive, since they cannot interpret humans and adapt to their behaviour, mainly due to limited sensory input. Rapidly expanding research fields that could make collaborative robots smarter through an understanding of the operator's intentions are virtual reality, eye tracking, big data, and artificial intelligence. Prediction of human movement intentions could be one way to improve these robots. This can be broken down into three stages: Stage One, Movement Direction Classification; Stage Two, Movement Phase Classification; and Stage Three, Movement Intention Prediction. This paper defines these stages and presents a solution to Stage One, showing that it is possible to collect gaze data and use it to classify a person's movement direction. The natural next step is to develop the remaining two stages.
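To make the Stage One idea concrete, the sketch below shows one plausible way to classify movement direction from windowed gaze data. Everything in it is an illustrative assumption rather than the authors' actual pipeline: the feature layout (flattened (x, y) gaze coordinates per window), the three-class label set, the random-forest model, and the synthetic data are all stand-ins chosen only to demonstrate the data flow from gaze samples to a direction label.

```python
# Hypothetical sketch of Stage One: classifying movement direction
# from short windows of gaze samples. The window length, feature
# layout, label set, classifier choice, and synthetic data are all
# assumptions for illustration, not the paper's actual method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

N_WINDOWS = 600                             # gaze windows (assumed)
WINDOW = 30                                 # gaze points per window (assumed)
DIRECTIONS = ["left", "forward", "right"]   # assumed label set

# Each window: WINDOW (x, y) gaze coordinates flattened into one vector.
X = rng.normal(size=(N_WINDOWS, WINDOW * 2))
y = rng.integers(0, len(DIRECTIONS), size=N_WINDOWS)

# Shift the mean gaze x-coordinate per class so the toy data is learnable:
# the premise is that gaze drifts toward the upcoming movement direction.
X[:, ::2] += (y[:, None] - 1) * 1.5

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

In a real setup the feature vectors would come from an eye tracker embedded in the VR headset, and the classifier and its hyperparameters would be chosen as described in the full text.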

Keywords

virtual reality

movement classification

human intention prediction

collaborative manufacturing

eye tracking

Authors

Julius Pettersson

Chalmers, Electrical Engineering, Systems and Control

Petter Falkman

Chalmers, Electrical Engineering, Systems and Control

Procedia Manufacturing

2351-9789 (eISSN)

Vol. 51 (2020), pp. 95-102

30th International Conference on Flexible Automation and Intelligent Manufacturing, FAIM 2021, Athens, Greece

Areas of Advance

Information and Communication Technology

Production

Subject Categories

Interaction Technologies

Human Computer Interaction

Robotics

DOI

10.1016/j.promfg.2020.10.015

Latest update

5/7/2021