Sharing Pattern Submodels for Prediction with Missing Values
Paper in proceedings, 2023

Missing values are unavoidable in many applications of machine learning and present challenges both during training and at test time. When variables are missing in recurring patterns, fitting separate pattern submodels has been proposed as a solution. However, fitting models independently does not make efficient use of all available data. Conversely, fitting a single shared model to the full data set relies on imputation, which often leads to biased results when missingness depends on unobserved factors. We propose an alternative approach, called sharing pattern submodels, which i) makes predictions that are robust to missing values at test time, ii) maintains or improves the predictive power of pattern submodels, and iii) has a short description, enabling improved interpretability. Parameter sharing is enforced through sparsity-inducing regularization, which we prove leads to consistent estimation. Finally, we give conditions for when a sharing model is optimal, even when both missingness and the target outcome depend on unobserved variables. Classification and regression experiments on synthetic and real-world data sets demonstrate that our models achieve a favorable tradeoff between pattern specialization and information sharing.
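The parameter-sharing idea described in the abstract can be illustrated with a minimal sketch: each missingness pattern gets its own linear coefficients, written as a shared vector plus a pattern-specific deviation, and an L1 penalty on the deviations pushes the submodels toward a common model. The code below is not the authors' implementation; the synthetic data, the proximal-gradient updates, and all names and hyperparameters (fit_sharing_submodels, lam, lr) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two features, the second missing for half the rows,
# giving two recurring missingness patterns.
n = 400
X = rng.normal(size=(n, 2))
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=n)
mask = np.zeros((n, 2), dtype=bool)
mask[n // 2:, 1] = True            # True = value is missing
X_obs = np.where(mask, 0.0, X)     # missing entries are unavailable

patterns = np.unique(mask, axis=0)


def fit_sharing_submodels(X_obs, y, mask, lam=0.1, lr=0.01, iters=2000):
    """Hypothetical proximal-gradient sketch: squared loss with an L1
    penalty on the pattern-specific deviations delta_p, so that each
    pattern's coefficients are beta (shared) + delta_p (sparse)."""
    d = X_obs.shape[1]
    beta = np.zeros(d)                      # shared coefficients
    delta = np.zeros((len(patterns), d))    # per-pattern deviations
    groups = [np.all(mask == p, axis=1) for p in patterns]
    for _ in range(iters):
        grad_beta = np.zeros(d)
        for g, (idx, p) in enumerate(zip(groups, patterns)):
            Xg, yg = X_obs[idx], y[idx]
            coef = (beta + delta[g]) * ~p   # use only observed features
            resid = Xg @ coef - yg
            grad = (Xg.T @ resid / len(yg)) * ~p
            grad_beta += grad
            # gradient step, then soft-threshold the deviation toward 0
            delta[g] -= lr * grad
            delta[g] = np.sign(delta[g]) * np.maximum(np.abs(delta[g]) - lr * lam, 0.0)
        beta -= lr * grad_beta
    return beta, delta


beta, delta = fit_sharing_submodels(X_obs, y, mask)
print("shared coefficients:", beta)
print("per-pattern deviations:", delta)

In this toy setup, increasing lam drives the deviations to zero so that all patterns share a single model, while lam = 0 lets each pattern's effective coefficients fit its own rows, mimicking independently fitted pattern submodels.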

Authors

Lena Stempfle

Chalmers University of Technology, Computer Science and Engineering, Data Science and AI

Ashkan Panahi

Chalmers University of Technology, Computer Science and Engineering, Data Science and AI

Fredrik Johansson

Chalmers University of Technology, Computer Science and Engineering, Data Science and AI

Proceedings of the AAAI Conference on Artificial Intelligence

2159-5399 (ISSN), 2374-3468 (eISSN)

Vol. 37, No. 8, pp. 9882-9890
978-1-57735-880-0 (ISBN)

37th AAAI Conference on Artificial Intelligence (AAAI 2023), Washington, DC, USA

Subject Categories

Probability Theory and Statistics

DOI

10.1609/aaai.v37i8.26179
