Flexible group fairness metrics for survival analysis
Other conference contribution, 2022

Algorithmic fairness is an increasingly important field concerned with detecting and mitigating biases in machine learning models. There is a wealth of literature on algorithmic fairness for regression and classification; however, the field has seen little exploration in survival analysis. Survival analysis is the prediction task in which one attempts to predict the probability of an event occurring over time. Survival predictions are particularly important in sensitive settings, such as when machine learning is used for the diagnosis and prognosis of patients. In this paper we explore how existing survival metrics can be used to measure bias with group fairness metrics. We investigate this in an empirical experiment with 29 survival datasets and 8 measures. We find that measures of discrimination capture bias well, whereas the picture is less clear for measures of calibration and scoring rules. We suggest further areas for research, including prediction-based fairness metrics for distribution predictions.
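As a rough illustration of the idea in the abstract (not the paper's exact method), a group fairness measure for a survival model can be built from an existing discrimination metric: compute Harrell's concordance index separately for each protected group and report the absolute gap. The data, risk scores, and group labels below are purely hypothetical.

```python
# Sketch: group fairness via per-group discrimination (Harrell's C-index).
# A gap of 0 means the model discriminates equally well for both groups.

def concordance_index(times, events, risks):
    """Harrell's C-index for right-censored data.
    times: observed event/censoring times; events: 1 if the event was
    observed, 0 if censored; risks: model scores (higher = higher risk)."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair (i, j) is comparable if subject i had an observed
            # event strictly before subject j's observed time.
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0
                elif risks[i] == risks[j]:
                    concordant += 0.5
    return concordant / comparable if comparable else float("nan")

# Toy cohort: (time, event indicator, risk score, protected group).
data = [
    (2, 1, 0.9, "A"), (5, 1, 0.7, "A"), (8, 0, 0.3, "A"), (10, 1, 0.2, "A"),
    (3, 1, 0.4, "B"), (4, 1, 0.8, "B"), (9, 0, 0.5, "B"), (12, 1, 0.1, "B"),
]

def group_cindex(group):
    rows = [(t, e, r) for t, e, r, g in data if g == group]
    times, events, risks = zip(*rows)
    return concordance_index(times, events, risks)

c_a, c_b = group_cindex("A"), group_cindex("B")
fairness_gap = abs(c_a - c_b)
print(f"C-index A={c_a:.3f}, B={c_b:.3f}, gap={fairness_gap:.3f}")
# → C-index A=1.000, B=0.600, gap=0.400
```

The same pattern extends to any of the survival measures the paper considers: swap the per-group metric (calibration measure, scoring rule) and compare across groups.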

Authors

Raphael Sonabend

Technische Universität Kaiserslautern

Imperial College London

University of Cambridge

Florian Pfisterer

Ludwig-Maximilians-Universität München (LMU)

Alan Mishler

J.P. Morgan Chase

Moritz Schauer

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

University of Gothenburg

Lukas Burk

Ludwig-Maximilians-Universität München (LMU)

Leibniz Institute

Sumantrak Mukherjee

Birla Institute of Technology and Science Pilani

Sebastian Vollmer

Technische Universität Kaiserslautern

DFKI

16th annual "Health Informatics meets Digital Health" conference, dHealth 2022
Vienna, Austria

Subject categories (SSIF 2025)

Probability theory and statistics

Computer science

Economics

More information

Last updated

2025-12-05