Semi-supervised learning with self-supervision for closed and open sets
Licentiate thesis, 2023

Semi-supervised learning (SSL) is a learning framework that enables learning from unlabeled data alongside labeled data. Such methods play a crucial role in reducing the burden of human labeling when training deep learning models. Many SSL methods learn from unlabeled data through confidence-based pseudo-labeling: unlabeled samples are assigned artificial labels based on model predictions, provided those predictions exceed a confidence threshold. A drawback of this approach is that large portions of the unlabeled data may be ignored. This work proposes a self-supervised component for these frameworks that enables learning from all unlabeled data. The proposed self-supervision aligns feature predictions across weak and strong data augmentations of each unlabeled sample. We show that the resulting method, DoubleMatch, improves training speed and accuracy on many benchmark datasets.
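
As a rough illustration of how such a combined objective can look, the PyTorch-style sketch below pairs confidence-based pseudo-labeling with a cosine-similarity alignment term across weak and strong views. The function names, threshold, and loss weighting are illustrative assumptions, not the exact DoubleMatch implementation.

```python
import torch
import torch.nn.functional as F

def unlabeled_loss(logits_weak, logits_strong, feat_weak, feat_strong,
                   threshold=0.95, w_self=1.0):
    """Sketch: pseudo-labeling plus a self-supervised alignment term.

    logits_*: class predictions for weak/strong augmentations of a batch.
    feat_*:   feature predictions for the same two views.
    threshold and w_self are illustrative hyperparameters.
    """
    # Confidence-based pseudo-labeling: keep only confident predictions.
    probs = torch.softmax(logits_weak.detach(), dim=-1)
    conf, pseudo = probs.max(dim=-1)
    mask = (conf >= threshold).float()
    loss_pseudo = (F.cross_entropy(logits_strong, pseudo,
                                   reduction="none") * mask).mean()

    # Self-supervision: align feature predictions across the two views
    # for every sample, so low-confidence data still contribute.
    loss_self = (1.0 - F.cosine_similarity(feat_strong,
                                           feat_weak.detach(), dim=-1)).mean()
    return loss_pseudo + w_self * loss_self
```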

SSL is often studied in the closed-set scenario, where the unlabeled data are assumed to contain only classes present in the labeled data. More realistically, the unlabeled data may contain unseen classes, corrupted samples, or other forms of outliers. This setting is referred to as open-set semi-supervised learning (OSSL). Many existing OSSL methods select the unlabeled samples that likely belong to the known classes and include only those in a traditional SSL objective. This work proposes an alternative approach, SeFOSS, which uses all unlabeled data by incorporating the self-supervised component introduced with DoubleMatch. Additionally, SeFOSS uses an energy-based method for classifying data as in-distribution (ID) or out-of-distribution (OOD). Experimental evaluation shows that SeFOSS achieves strong results for both closed-set accuracy and OOD detection in many open-set scenarios. Our results also indicate that traditional (closed-set) SSL methods may perform better in the open-set scenario than previous works have suggested.
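
A common formulation of such an energy-based score is the free energy of the logits, E(x) = -T * logsumexp(f(x)/T), where lower energy suggests ID data. The sketch below shows this standard score; the temperature and the threshold tau are illustrative assumptions, not necessarily SeFOSS's exact settings.

```python
import torch

def energy_score(logits, temperature=1.0):
    """Free-energy OOD score: E(x) = -T * logsumexp(f(x) / T).

    Lower energy indicates in-distribution data. The temperature is an
    illustrative default, not necessarily the value used by SeFOSS.
    """
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)

# Usage sketch (tau is a hypothetical threshold):
# logits = model(x_unlabeled)
# is_id = energy_score(logits) < tau
```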

Furthermore, this work proposes another method for OSSL: the Beta-model. It introduces a novel score for ID/OOD classification and brings the expectation-maximization (EM) algorithm into OSSL for estimating the conditional distributions of scores given ID or OOD data. The Beta-model achieves state-of-the-art results on many OSSL benchmarks.
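
To illustrate how EM can estimate such conditional score distributions, the sketch below fits a two-component Beta mixture to per-sample scores in (0, 1), interpreting the components as ID and OOD. The moment-matching M-step and all names are simplifying assumptions, not the thesis's exact estimator.

```python
import numpy as np
from scipy.stats import beta

def em_beta_mixture(scores, n_iter=50):
    """EM sketch: two-component Beta mixture over scores in (0, 1).

    Models p(s) = pi * Beta(s; a1, b1) + (1 - pi) * Beta(s; a2, b2),
    interpreting one component as ID and the other as OOD.
    """
    # Initialize responsibilities by splitting at the median score.
    resp = (scores > np.median(scores)).astype(float)
    for _ in range(n_iter):
        # M-step: moment-matched Beta parameters per component.
        params = []
        for r in (resp, 1.0 - resp):
            w = r.sum()
            m = (r * scores).sum() / w              # weighted mean
            v = (r * (scores - m) ** 2).sum() / w   # weighted variance
            c = m * (1.0 - m) / v - 1.0             # common factor
            params.append((m * c, (1.0 - m) * c))   # (alpha, beta)
        pi = resp.mean()
        # E-step: posterior probability that each score is ID.
        p_id = pi * beta.pdf(scores, *params[0])
        p_ood = (1.0 - pi) * beta.pdf(scores, *params[1])
        resp = p_id / (p_id + p_ood)
    return pi, params, resp  # resp[i] approximates P(ID | score_i)
```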

Semi-supervised learning

classification

deep learning

open-set semi-supervised learning

Author

Erik Wallin

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

DoubleMatch: Improving Semi-Supervised Learning with Self-Supervision

Proceedings - International Conference on Pattern Recognition, Vol. 2022-August (2022), pp. 2871-2877

Paper in proceedings

Improving Open-Set Semi-Supervised Learning with Self-Supervision

Proceedings - 2024 IEEE Winter Conference on Applications of Computer Vision, WACV 2024 (2024), pp. 2345-2354

Paper in proceedings

Wallin, E., Svensson, L., Kahl, F., Hammarstrand, L. Beta-model: Open-Set Semi-Supervised Learning with In-Distribution Subspaces

Robust and precise semi-supervised learning

Wallenberg AI, Autonomous Systems and Software Program, 2020-08-25 -- 2024-08-23.

Infrastructure

C3SE (Chalmers Centre for Computational Science and Engineering)

Subject categories

Signal processing

Computer science

Computer vision and robotics (autonomous systems)

Publisher

Chalmers

SB-H3, Sven Hultins gata 6

Online

Opponent: Assoc. Prof. Juho Kannala, Aalto University, Finland

More information

Last updated

2024-07-02