DyGLIP: A dynamic graph model with link prediction for accurate multi-camera multiple object tracking
Paper in proceedings, 2021

Multi-Camera Multiple Object Tracking (MC-MOT) is a significant computer vision problem due to its emerging applicability in several real-world applications. Despite a large number of existing works, solving the data association problem in any MC-MOT pipeline is arguably one of the most challenging tasks, and developing a robust MC-MOT system remains difficult due to many practical issues such as inconsistent lighting conditions, varying object movement patterns, or trajectory occlusions of objects between the cameras. To address these problems, this work proposes a new Dynamic Graph Model with Link Prediction (DyGLIP) approach to solve the data association task. Compared to existing methods, our new model offers several advantages, including better feature representations and the ability to recover from lost tracks during camera transitions. Moreover, our model works gracefully regardless of the overlapping ratios between the cameras. Experimental results show that we outperform existing MC-MOT algorithms by a large margin on several practical datasets. Notably, our model works favorably in online settings and can be extended to an incremental approach for large-scale datasets.
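To make the abstract's framing concrete, the sketch below illustrates, in a minimal and purely hypothetical form, what "data association as link prediction on a dynamic graph" can look like: nodes stand for per-camera tracklets with appearance embeddings, and cross-camera edges are predicted from a similarity score. The class, the cosine-similarity link scorer, and the threshold are placeholder assumptions for illustration only; they are not DyGLIP's actual attention-based graph model.

```python
# Illustrative sketch only: cross-camera data association as link prediction on
# a graph of per-camera tracklets. The embedding-based scorer and threshold are
# hypothetical placeholders, not the paper's actual model.
import numpy as np


class TrackletGraph:
    """Nodes are tracklets (camera_id, track_id, appearance embedding);
    edges link tracklets believed to be the same object across cameras."""

    def __init__(self):
        self.nodes = []  # list of (camera_id, track_id, embedding)
        self.edges = []  # list of (node_index_a, node_index_b, score)

    def add_tracklet(self, camera_id, track_id, embedding):
        self.nodes.append((camera_id, track_id, np.asarray(embedding, dtype=float)))
        return len(self.nodes) - 1

    @staticmethod
    def _link_score(emb_a, emb_b):
        # Placeholder link predictor: cosine similarity of appearance embeddings.
        denom = np.linalg.norm(emb_a) * np.linalg.norm(emb_b) + 1e-12
        return float(emb_a @ emb_b / denom)

    def predict_links(self, threshold=0.7):
        """Connect cross-camera tracklet pairs whose predicted link score
        exceeds a (hypothetical) threshold."""
        self.edges.clear()
        for i, (cam_i, _, emb_i) in enumerate(self.nodes):
            for j in range(i + 1, len(self.nodes)):
                cam_j, _, emb_j = self.nodes[j]
                if cam_i == cam_j:
                    continue  # only associate tracklets from different cameras
                score = self._link_score(emb_i, emb_j)
                if score >= threshold:
                    self.edges.append((i, j, score))
        return self.edges


# Usage: two cameras observing the same object yield similar embeddings, so the
# predicted link merges their tracklets into one global identity.
g = TrackletGraph()
g.add_tracklet(camera_id=0, track_id=3, embedding=[0.9, 0.1, 0.0])
g.add_tracklet(camera_id=1, track_id=7, embedding=[0.88, 0.12, 0.05])
print(g.predict_links())  # e.g. [(0, 1, 0.99...)]
```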

Authors

Kha Gia Quach

Concordia University

Pha Nguyen

VinAI

Huu Le

Computer vision and medical image analysis

Thanh-Dat Truong

University of Arkansas

Chi Nhan Duong

Concordia University

Minh-Triet Tran

Vietnam National University

Khoa Luu

University of Arkansas

Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition

1063-6919 (ISSN)

13779-13788
978-1-6654-4509-2 (ISBN)

2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021
Virtual, Online, USA

Subject Categories

Robotics

Computer Science

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1109/CVPR46437.2021.01357

More information

Latest update

4/27/2022