Evaluation of Out-of-Distribution Detection Performance on Autonomous Driving Datasets
Paper in proceedings, 2023

Safety measures need to be systematically investigated to establish to what extent they evaluate the intended performance of Deep Neural Networks (DNNs) for critical applications. Due to a lack of verification methods for high-dimensional DNNs, a trade-off is needed between accepted performance and handling of out-of-distribution (OOD) samples. This work evaluates rejecting outputs from semantic segmentation DNNs by applying a Mahalanobis distance (MD), based on the most probable class-conditional Gaussian distribution for the predicted class, as an OOD score. The evaluation covers three DNNs trained on the Cityscapes dataset and tested on four automotive datasets, and finds that classification risk can be drastically reduced at the cost of pixel coverage, even when applied to unseen datasets. The applicability of our findings will support legitimizing safety measures and motivate their usage when arguing for safe usage of DNNs in automotive perception.
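The MD-based OOD score described in the abstract can be sketched as follows: per-class feature means and a shared (tied) covariance are fitted on in-distribution data, and at test time the Mahalanobis distance of a feature vector to the Gaussian of its predicted class serves as the OOD score. This is a minimal illustrative sketch, not the paper's implementation; the function names and the choice of a tied covariance are assumptions for the example.

```python
import numpy as np

def fit_class_gaussians(features, labels):
    """Fit per-class means and a shared (tied) covariance over feature vectors.

    features: (N, D) array of in-distribution feature vectors
    labels:   (N,) array of class labels
    Returns per-class means and the inverse of the tied covariance.
    """
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    # Center each sample by its own class mean, then pool for a tied covariance.
    centered = np.vstack([features[labels == c] - means[c] for c in classes])
    cov = centered.T @ centered / len(features)
    return means, np.linalg.inv(cov)

def mahalanobis_ood_score(x, mean, cov_inv):
    """Squared Mahalanobis distance of x to one class-conditional Gaussian.

    Higher scores indicate samples farther from the training distribution,
    which can be rejected above a chosen threshold.
    """
    d = x - mean
    return float(d @ cov_inv @ d)
```

In a segmentation setting this score would be computed per pixel for the predicted class, and pixels whose score exceeds a threshold are rejected, trading pixel coverage for reduced classification risk.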

automotive safety

out-of-distribution detection

semantic segmentation

Authors

Jens Henriksson

Semcon

Christian Berger

Göteborgs universitet

Chalmers, Computer Science and Engineering, Interaction Design and Software Engineering

Stig Ursing

Semcon

Markus Borg

Lunds universitet

2023 IEEE International Conference On Artificial Intelligence Testing (AITest)

Vol. 2023, pp. 74-81

Athens, Greece

Subject categories

Computer Science

DOI

10.1109/AITest58265.2023.00021

More information

Last updated

2024-09-26