DoubleMatch: Improving Semi-Supervised Learning with Self-Supervision
Paper in proceedings, 2022

Following the success of supervised learning, semi-supervised learning (SSL) is now becoming increasingly popular. SSL is a family of methods that, in addition to a labeled training set, also use a sizable collection of unlabeled data for fitting a model. Most recent successful SSL methods are based on pseudo-labeling: letting confident model predictions act as training labels. While these methods have shown impressive results on many benchmark datasets, a drawback of this approach is that not all unlabeled data are used during training. We propose a new SSL algorithm, DoubleMatch, which combines the pseudo-labeling technique with a self-supervised loss, enabling the model to utilize all unlabeled data in the training process. We show that this method achieves state-of-the-art accuracies on multiple benchmark datasets while also reducing training times compared to existing SSL methods. Code is available at https://github.com/walline/doublematch.
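The abstract describes combining a confidence-masked pseudo-labeling loss with a self-supervised loss that covers all unlabeled samples. Below is a minimal NumPy sketch of what such a combined unlabeled objective could look like; the function name, the threshold `tau`, and the weight `lam` are illustrative assumptions, and details of the paper's actual self-supervised term (projection head, stop-gradient) are simplified to a plain cosine-similarity loss.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def doublematch_unlabeled_loss(logits_w, logits_s, feat_w, feat_s,
                               tau=0.95, lam=1.0):
    """Illustrative sketch of a DoubleMatch-style unlabeled objective.

    logits_w / feat_w: predictions and features for weakly augmented views.
    logits_s / feat_s: predictions and features for strongly augmented views.
    """
    probs_w = softmax(logits_w)
    conf = probs_w.max(axis=1)
    pseudo = probs_w.argmax(axis=1)
    mask = conf >= tau  # only confident predictions produce pseudo-labels

    # FixMatch-style term: cross-entropy of strong view against pseudo-labels,
    # masked so that low-confidence samples contribute nothing.
    log_probs_s = np.log(softmax(logits_s) + 1e-12)
    ce = -log_probs_s[np.arange(len(pseudo)), pseudo]
    loss_pseudo = (ce * mask).mean()

    # Self-supervised term: cosine-similarity loss between the two views'
    # normalized features, applied to ALL unlabeled samples (no mask).
    fw = feat_w / np.linalg.norm(feat_w, axis=1, keepdims=True)
    fs = feat_s / np.linalg.norm(feat_s, axis=1, keepdims=True)
    loss_self = (1.0 - (fw * fs).sum(axis=1)).mean()

    return loss_pseudo + lam * loss_self
```

The key point the sketch illustrates is that the self-supervised term is unmasked, so every unlabeled sample contributes gradient signal even when its prediction falls below the confidence threshold.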

Authors

Erik Wallin

Saab

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Lennart Svensson

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Fredrik Kahl

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Lars Hammarstrand

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Proceedings - International Conference on Pattern Recognition

1051-4651 (ISSN)

Vol. 2022-August, pp. 2871-2877
978-1-6654-9062-7 (ISBN)

26th International Conference on Pattern Recognition, ICPR 2022
Montreal, Canada

Subject Categories

Signal Processing

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1109/ICPR56361.2022.9956182

More information

Latest update

1/3/2024