Exploiting Riemannian Manifolds for Daily Activity Classification in Video Towards Health Care
Paper in proceeding, 2016

This paper addresses the problem of classifying activities of daily living in video. The proposed method uses a two-layer tree structure, where each node of the tree holds a Riemannian manifold corresponding to a different part-based covariance feature. In the first layer, activities are classified according to the dynamics of upper body parts. In the second layer, activities are further classified according to the appearance of local image patches around the hands in key frames, where interacting objects are likely to be attached. The novelties of this paper include: (i) characterizing the motion of upper body parts by a covariance matrix of the distances between each pair of key points and the orientations of the lines that connect them; (ii) describing human-object interaction by the appearance of local regions around the hands in key frames, selected based on the proximity of the hands to other key points; (iii) formulating a pairwise geodesics-based kernel for activity classification on Riemannian manifolds under the log-Euclidean metric. Experiments were conducted on a video dataset containing a total of 426 video events (activities) from 4 classes. The proposed method is shown to be effective, achieving a high classification accuracy (93.79% on average) and a low false alarm rate (1.99% on average) overall, as well as for each individual class.
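The log-Euclidean metric mentioned in novelty (iii) measures the distance between two symmetric positive-definite (SPD) covariance matrices as the Frobenius norm of the difference of their matrix logarithms, and a Gaussian kernel can then be built on top of that distance. The sketch below is not the paper's implementation; it is a minimal illustration of the general technique, with the regularization constant and the kernel width `gamma` chosen arbitrarily for demonstration.

```python
import numpy as np
from scipy.linalg import logm


def covariance_descriptor(features):
    """Covariance descriptor of per-frame feature vectors.

    features: (n_frames, d) array. A small ridge is added so the
    result is strictly positive definite (illustrative value).
    """
    d = features.shape[1]
    C = np.cov(features, rowvar=False)
    return C + 1e-6 * np.eye(d)


def log_euclidean_dist(C1, C2):
    """Distance under the log-Euclidean metric:
    Frobenius norm of the difference of matrix logarithms."""
    return np.linalg.norm(logm(C1) - logm(C2), ord="fro")


def geodesic_rbf_kernel(C1, C2, gamma=0.1):
    """Gaussian kernel on the log-Euclidean distance
    (gamma is a hypothetical bandwidth, not from the paper)."""
    return np.exp(-gamma * log_euclidean_dist(C1, C2) ** 2)
```

Such a kernel can be plugged into any kernel classifier (e.g. an SVM) by evaluating it between the covariance descriptors of a query sequence and the training sequences.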

Authors

Yixiao Yun

Chalmers, Signals and Systems, Signal Processing and Biomedical Engineering

Irene Yu-Hua Gu

Chalmers, Signals and Systems, Signal Processing and Biomedical Engineering

IEEE International Conference on E-health Networking, Application & Services (HealthCom 2016), Munich, Germany, Sept. 14-17, 2016.

363-368
978-1-5090-3370-6 (ISBN)

Areas of Advance

Information and Communication Technology

Transport

Subject Categories

Robotics and Automation

Signal Processing

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1109/HealthCom.2016.7749487

More information

Created

2017-10-07