Improving Open-Set Semi-Supervised Learning with Self-Supervision
Paper in proceedings, 2024

Open-set semi-supervised learning (OSSL) embodies a practical scenario within semi-supervised learning, wherein the unlabeled training set encompasses classes absent from the labeled set. Many existing OSSL methods assume that these out-of-distribution data are harmful and put effort into excluding data belonging to unknown classes from the training objective. In contrast, we propose an OSSL framework that facilitates learning from all unlabeled data through self-supervision. Additionally, we utilize an energy-based score to accurately recognize data belonging to the known classes, making our method well-suited for handling uncurated data in deployment. We show through extensive experimental evaluations that our method yields state-of-the-art results on many of the evaluated benchmark problems in terms of closed-set accuracy and open-set recognition when compared with existing methods for OSSL. Our code is available at https://github.com/walline/ssl-tf2-sefoss.
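As an illustration of the energy-based recognition mentioned in the abstract, the following is a minimal sketch of the energy score commonly used for out-of-distribution detection, computed from classifier logits. It is not taken from the paper: the temperature and the threshold tau are hypothetical placeholders, and the exact scoring rule used by the method (see the linked repository) may differ.

# Minimal sketch of an energy-based open-set score computed from classifier
# logits. Illustrative only: temperature and threshold are hypothetical,
# not values from the paper.
import numpy as np

def energy_score(logits: np.ndarray, temperature: float = 1.0) -> np.ndarray:
    """Negative free energy, -E(x) = T * logsumexp(logits / T).

    Higher scores indicate that a sample is more likely to belong to
    the known (in-distribution) classes.
    """
    z = logits / temperature
    m = z.max(axis=-1, keepdims=True)                  # for numerical stability
    lse = m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1))
    return temperature * lse

# Example: flag unlabeled samples whose score exceeds a threshold as known-class.
logits = np.random.randn(8, 10)   # 8 unlabeled samples, 10 known classes
tau = 5.0                         # hypothetical threshold, tuned per dataset
is_known = energy_score(logits) > tau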

Algorithms

Machine learning architectures

Authors

Erik Wallin

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Saab

Lennart Svensson

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Fredrik Kahl

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Lars Hammarstrand

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Proceedings - 2024 IEEE Winter Conference on Applications of Computer Vision, WACV 2024

2345-2354 (pages)
9798350318920 (ISBN)

2024 IEEE Winter Conference on Applications of Computer Vision, WACV 2024
Waikoloa, USA

Subject categories

Computer Science

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1109/WACV57701.2024.00235

More information

Last updated

2024-07-02