Flexible group fairness metrics for survival analysis
Other conference contribution, 2022

Algorithmic fairness is an increasingly important field concerned with detecting and mitigating biases in machine learning models. There is a wealth of literature on algorithmic fairness for regression and classification, but the field has seen little exploration in survival analysis. Survival analysis is the prediction task in which one attempts to predict the probability of an event occurring over time. Survival predictions are particularly important in sensitive settings, such as when machine learning is used for the diagnosis and prognosis of patients. In this paper we explore how existing survival metrics can be used to measure bias with group fairness metrics. We do so in an empirical experiment with 29 survival datasets and 8 measures. We find that measures of discrimination capture bias well, whereas the picture is less clear for measures of calibration and scoring rules. We suggest further areas for research, including prediction-based fairness metrics for distribution predictions.
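To illustrate the idea of repurposing an existing survival metric as a group fairness measure, the sketch below computes Harrell's C-index (a standard measure of discrimination) separately for each protected group and reports the absolute gap between groups. This is a minimal, self-contained illustration with a hand-rolled C-index and toy data, not the paper's actual experimental code; the function names and the two-group restriction are assumptions for this example.

```python
from itertools import combinations

def concordance_index(times, events, risks):
    """Harrell's C-index: fraction of comparable pairs in which the
    subject who failed earlier was assigned the higher risk score.
    `events` is 1 for an observed event, 0 for censoring."""
    concordant, comparable = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        # order the pair so that i has the earlier observed time
        if times[j] < times[i]:
            i, j = j, i
        # pair is not comparable if times tie or the earlier subject is censored
        if times[i] == times[j] or not events[i]:
            continue
        comparable += 1
        if risks[i] > risks[j]:
            concordant += 1
        elif risks[i] == risks[j]:
            concordant += 0.5  # tied risk scores count half
    return concordant / comparable

def group_cindex_gap(times, events, risks, groups):
    """Absolute C-index difference between two protected groups:
    one simple discrimination-based group fairness measure."""
    per_group = {}
    for g in set(groups):
        idx = [k for k, gg in enumerate(groups) if gg == g]
        per_group[g] = concordance_index([times[k] for k in idx],
                                         [events[k] for k in idx],
                                         [risks[k] for k in idx])
    a, b = per_group.values()  # assumes exactly two groups
    return abs(a - b)

# Toy example: group A is ranked perfectly, group B only partially.
times  = [2, 4, 6, 3, 5, 7]
events = [1, 1, 0, 1, 0, 1]
risks  = [0.9, 0.5, 0.2, 0.3, 0.6, 0.1]
groups = ["A", "A", "A", "B", "B", "B"]
gap = group_cindex_gap(times, events, risks, groups)  # -> 0.5
```

A gap near zero suggests the model discriminates (in the survival sense of ranking risks) equally well across groups; a large gap flags a potential disparity of the kind the paper's empirical study probes.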

Authors

Raphael Sonabend

Technische Universität Kaiserslautern

Imperial College London

University of Cambridge

Florian Pfisterer

Ludwig Maximilian University of Munich (LMU)

Alan Mishler

J.P. Morgan Chase

Moritz Schauer

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

University of Gothenburg

Lukas Burk

Ludwig Maximilian University of Munich (LMU)

Leibniz Institute

Sumantrak Mukherjee

Birla Institute of Technology and Science Pilani

Sebastian Vollmer

Technische Universität Kaiserslautern

DFKI

16th annual "Health Informatics meets Digital Health" conference, dHealth 2022
Vienna, Austria

Subject Categories (SSIF 2025)

Probability Theory and Statistics

Computer Sciences

Economics
