Predicting Missing Markers in Real-Time Optical Motion Capture
Journal article, 2009

A common problem in optical motion capture of human-body movement is the so-called missing-marker problem. Occlusion of markers degrades tracking accuracy unless a continuous flow of data is guaranteed by interpolation or extrapolation algorithms. Since interpolation algorithms require data sampled both before and after an occlusion, they cannot be used in real-time applications; extrapolation algorithms require only data sampled before an occlusion. Other algorithms require statistical data and are designed for post-processing. In order to bridge sampling gaps caused by occluded markers, and hence to improve real-time 3D motion capture, we suggest a computationally efficient extrapolation algorithm, partly combined with a so-called constraint matrix. The prediction algorithm requires neither statistical data nor an underlying kinematic human model with pre-defined marker distances. Under the assumption that human motion can be linear, circular, or a linear combination of both, a prediction method is realized. The paper presents measurements of a circular movement in which a marker is briefly lost. The suggested extrapolation method behaves well for a reasonable number of frames, not exceeding around two seconds.
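The abstract's motion model (locally linear or circular motion) admits a simple extrapolation sketch. The function below is a hypothetical illustration, not the paper's actual algorithm: given the last three observed 3D positions of a marker, it estimates the rotation between successive displacement vectors and applies that same rotation once more, which reproduces uniform circular motion exactly and falls back to constant-velocity (linear) extrapolation when the displacements are parallel. The function name and tolerance are assumptions for this sketch.

```python
import numpy as np

def predict_next(p0, p1, p2):
    """Extrapolate the next 3D marker position from the last three samples,
    assuming locally constant-speed linear or circular motion (a sketch of
    the motion model named in the abstract, not the paper's algorithm)."""
    v1, v2 = p1 - p0, p2 - p1          # successive displacement vectors
    axis = np.cross(v1, v2)            # rotation axis between displacements
    norm = np.linalg.norm(axis)
    if norm < 1e-12:                   # displacements parallel: linear case
        return p2 + v2                 # constant-velocity extrapolation
    axis /= norm
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.arccos(np.clip(cos_a, -1.0, 1.0))
    # Rodrigues' rotation formula: rotate v2 by the same angle about the axis
    v3 = (v2 * np.cos(angle)
          + np.cross(axis, v2) * np.sin(angle)
          + axis * np.dot(axis, v2) * (1.0 - np.cos(angle)))
    return p2 + v3
```

During an occlusion, such a predictor could be applied recursively, feeding each prediction back in as the newest sample; consistent with the abstract, the error of any pure extrapolation grows with gap length, so predictions remain usable only for a limited number of frames.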

Keywords

nonlinear approximation, linear approximation, extrapolation

Author

Tommaso Piazza

Chalmers, Applied Information Technology (Chalmers), Interaction Design (Chalmers)

Johan Lundström

Chalmers, Computer Science and Engineering (Chalmers)

Andreas Kunz

Swiss Federal Institute of Technology in Zürich (ETH)

Morten Fjeld

Chalmers, Applied Information Technology (Chalmers), Interaction Design (Chalmers)

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

0302-9743 (ISSN), 1611-3349 (eISSN)

Vol. 5903, pp. 125-136

Subject Categories

Computer and Information Science

DOI

10.1007/978-3-642-10470-1_11

ISBN

3642104681

Latest update

3/19/2018