Improving Open-Set Semi-Supervised Learning with Self-Supervision
Paper in proceedings, 2024

Open-set semi-supervised learning (OSSL) embodies a practical scenario within semi-supervised learning, wherein the unlabeled training set encompasses classes absent from the labeled set. Many existing OSSL methods assume that these out-of-distribution data are harmful and put effort into excluding data belonging to unknown classes from the training objective. In contrast, we propose an OSSL framework that facilitates learning from all unlabeled data through self-supervision. Additionally, we utilize an energy-based score to accurately recognize data belonging to the known classes, making our method well-suited for handling uncurated data in deployment. We show through extensive experimental evaluations that our method yields state-of-the-art results on many of the evaluated benchmark problems in terms of closed-set accuracy and open-set recognition when compared with existing methods for OSSL. Our code is available at https://github.com/walline/ssl-tf2-sefoss.
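The energy-based score mentioned in the abstract typically follows the free-energy formulation over classifier logits, where a higher (negative-energy) score indicates a sample is more likely to belong to the known classes. A minimal sketch, assuming access to the model's logits; the function name, temperature default, and example values are illustrative, not taken from the paper:

```python
import numpy as np

def energy_score(logits, temperature=1.0):
    """Negative free energy: T * logsumexp(logits / T).

    Higher values suggest the sample belongs to the known
    (in-distribution) classes; a threshold on this score can
    separate known from unknown-class data.
    """
    z = np.asarray(logits, dtype=np.float64) / temperature
    # numerically stable logsumexp over the class dimension
    m = z.max(axis=-1, keepdims=True)
    lse = np.squeeze(m, axis=-1) + np.log(np.exp(z - m).sum(axis=-1))
    return temperature * lse

# A confidently classified sample scores higher than an ambiguous one:
print(energy_score([10.0, 0.0, 0.0]) > energy_score([1.0, 1.0, 1.0]))
```

In deployment, one would compare this score against a threshold calibrated on labeled known-class data to decide whether to trust the closed-set prediction.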

Algorithms: Machine learning architectures, formulations, and algorithms

Authors

Erik Wallin

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Saab

Lennart Svensson

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Fredrik Kahl

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Lars Hammarstrand

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Proceedings - 2024 IEEE Winter Conference on Applications of Computer Vision, WACV 2024

2345-2354 (pages)
9798350318920 (ISBN)

2024 IEEE Winter Conference on Applications of Computer Vision, WACV 2024
Waikoloa, USA

Subject Categories

Computer Science

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1109/WACV57701.2024.00235

More information

Latest update

7/2/2024