Human Movement Direction Classification using Virtual Reality and Eye Tracking
Paper in proceedings, 2020

Collaborative robots are becoming increasingly popular in industry, providing flexibility and increased productivity for complex tasks. However, these robots are not yet truly interactive, since they cannot interpret humans and adapt to their behaviour, mainly due to limited sensory input. Rapidly expanding research fields that could make collaborative robots smarter through an understanding of the operator's intentions include virtual reality, eye tracking, big data, and artificial intelligence. Prediction of human movement intentions could be one way to improve these robots. This can be broken down into three stages: Stage One, movement direction classification; Stage Two, movement phase classification; and Stage Three, movement intention prediction. This paper defines these stages and presents a solution to Stage One, showing that it is possible to collect gaze data and use it to classify a person's movement direction. The natural next step is to develop the remaining two stages.
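To make Stage One concrete, the sketch below shows one possible formulation of movement direction classification from gaze data. It is an illustrative assumption, not the paper's implementation: the feature layout (fixed-length windows of gaze coordinates), the four direction labels, the random-forest classifier, and the synthetic data standing in for real eye-tracking recordings are all hypothetical.

    # Minimal sketch of Stage One: classifying movement direction from gaze data.
    # Features, labels, and classifier choice are assumptions for illustration,
    # not the method described in the paper.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Hypothetical dataset: each sample is a short window of gaze samples,
    # flattened into one feature vector (30 time steps x (x, y) gaze point).
    n_samples, window, dims = 600, 30, 2
    X = rng.normal(size=(n_samples, window * dims))  # stand-in for recorded gaze
    y = rng.integers(0, 4, size=n_samples)           # 0=left, 1=right, 2=up, 3=down

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

With real gaze recordings, the feature windows would be aligned to the onset of each movement, and the labels would come from the measured movement direction rather than random draws.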

Keywords

virtual reality

movement classification

human intention prediction

collaborative manufacturing

eye tracking

Authors

Julius Pettersson

Chalmers, Electrical Engineering, Systems and Control

Petter Falkman

Chalmers, Electrical Engineering, Systems and Control

Procedia Manufacturing

2351-9789 (eISSN)

Vol. 51 (2020), p. 95-102

30th International Conference on Flexible Automation and Intelligent Manufacturing, FAIM 2021, Athens, Greece

Areas of Advance

Information and Communication Technology

Production

Subject Categories

Interaction Technologies

Human-Computer Interaction (Interaction Design)

Robotics and Automation

DOI

10.1016/j.promfg.2020.10.015

More information

Last updated

2021-05-07