A Cross-Season Correspondence Dataset for Robust Semantic Segmentation
Paper in proceedings, 2019

In this paper, we present a method that utilizes 2D-2D point matches between images taken under different conditions to train a convolutional neural network for semantic segmentation. Enforcing label consistency across the matches makes the final segmentation algorithm robust to seasonal changes. We describe how these 2D-2D matches can be generated with little human interaction by geometrically matching points from 3D models built from images. Two cross-season correspondence datasets are created, providing 2D-2D matches across seasonal changes as well as from day to night. The datasets are made publicly available to facilitate further research. We show that adding the correspondences as extra supervision during training improves the segmentation performance of the convolutional neural network, making it more robust to seasonal changes and weather conditions.
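The core idea is to add a consistency term to the training loss: the network's predictions at a pair of matched pixels, found in two images of the same scene taken under different conditions, should agree. As a rough illustration only (not the authors' implementation; the function name, tensor shapes, and the cross-entropy form of the penalty are assumptions for this sketch), such a loss could look as follows in PyTorch:

```python
# Illustrative sketch, not the authors' code.
# A segmentation network predicts per-pixel class logits for two
# images of the same scene under different conditions; matched
# pixels should receive consistent predictions.
import torch
import torch.nn.functional as F

def correspondence_consistency_loss(logits_a, logits_b, pts_a, pts_b):
    """Penalize disagreement between predictions at matched pixels.

    logits_a, logits_b: (C, H, W) per-pixel class logits for the two images.
    pts_a, pts_b: (N, 2) long tensors of matched (row, col) coordinates.
    """
    # Gather the C-dimensional score vectors at the N matched pixels.
    scores_a = logits_a[:, pts_a[:, 0], pts_a[:, 1]]  # (C, N)
    scores_b = logits_b[:, pts_b[:, 0], pts_b[:, 1]]  # (C, N)
    # Cross-entropy of one view's class distribution against the
    # other's; gradients flow into both predictions.
    p_a = F.softmax(scores_a, dim=0)
    log_p_b = F.log_softmax(scores_b, dim=0)
    return -(p_a * log_p_b).sum(dim=0).mean()

# Example call: 19 classes (e.g. Cityscapes), 100 matched points.
loss = correspondence_consistency_loss(
    torch.randn(19, 512, 512), torch.randn(19, 512, 512),
    torch.randint(0, 512, (100, 2)), torch.randint(0, 512, (100, 2)))
```

During training, a term of this kind would be added, with a weighting factor, to the ordinary supervised cross-entropy loss on the labeled images, so that the correspondences act as extra supervision.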

Deep Learning

Segmentation

Datasets and Evaluation

Grouping and Shape

Authors

Måns Larsson

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Erik Stenborg

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Zenuity AB

Lars Hammarstrand

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Marc Pollefeys

Microsoft

Eidgenössische Technische Hochschule Zürich (ETH)

Torsten Sattler

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Fredrik Kahl

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition

1063-6919 (ISSN)

Vol. 2019-June, pp. 9524-9534, Article no. 8953253

32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019), Long Beach, USA

COPPLAR CampusShuttle cooperative perception & planning platform

VINNOVA (2015-04849), 2016-01-01 -- 2018-12-31.

Subject categories

Bioinformatics (Computational Biology)

Computer Vision and Robotics (Autonomous Systems)

Medical Image Processing

DOI

10.1109/CVPR.2019.00976

More information

Last updated

2020-04-23