Evaluation of Out-of-Distribution Detection Performance on Autonomous Driving Datasets
Paper in proceedings, 2023

Safety measures need to be systematically investigated to determine to what extent they evaluate the intended performance of Deep Neural Networks (DNNs) for critical applications. Due to a lack of verification methods for high-dimensional DNNs, a trade-off is needed between accepted performance and handling of out-of-distribution (OOD) samples. This work evaluates rejecting outputs from semantic segmentation DNNs by applying a Mahalanobis distance (MD), based on the most probable class-conditional Gaussian distribution for the predicted class, as an OOD score. The evaluation covers three DNNs trained on the Cityscapes dataset and tested on four automotive datasets, and finds that classification risk can be drastically reduced at the cost of pixel coverage, even when applied on unseen datasets. The applicability of our findings will support legitimizing safety measures and motivate their usage when arguing for safe usage of DNNs in automotive perception.
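The MD-based OOD score described in the abstract can be sketched as follows. This is a minimal illustration with hypothetical function names and synthetic feature vectors; it assumes per-class Gaussians with a shared (tied) covariance fitted on in-distribution features, and does not reproduce the paper's actual per-pixel feature extraction from the segmentation DNNs:

```python
import numpy as np

def fit_class_gaussians(features, labels):
    """Fit a class-conditional Gaussian per class: per-class means
    plus one shared (tied) covariance estimated over all classes."""
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    centered = np.vstack([features[labels == c] - means[c] for c in classes])
    # Small ridge term keeps the covariance invertible
    cov = np.cov(centered, rowvar=False) + 1e-6 * np.eye(features.shape[1])
    return means, np.linalg.inv(cov)

def mahalanobis_ood_score(x, means, cov_inv, predicted_class):
    """OOD score: Mahalanobis distance from feature vector x to the
    Gaussian of the class the DNN predicted. Larger means more OOD."""
    d = x - means[predicted_class]
    return float(np.sqrt(d @ cov_inv @ d))
```

A rejection rule then thresholds this score: predictions whose distance exceeds a calibrated threshold are discarded, trading pixel coverage for reduced classification risk.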

automotive safety

semantic segmentation

out-of-distribution detection

Authors

Jens Henriksson

Chalmers, Computer Science and Engineering, Interaction Design and Software Engineering

Christian Berger

Chalmers, Computer Science and Engineering, Interaction Design and Software Engineering

Stig Ursing

Semcon

Markus Borg

Codescene

2023 IEEE International Conference On Artificial Intelligence Testing (AITest)

Vol. 2023

Athens, Greece

Subject categories

Computer science

More information

Created

2023-09-28