Flexible, non-parametric modeling using regularized neural networks
Journal article, 2022

Non-parametric, additive models are able to capture complex data dependencies in a flexible, yet interpretable way. However, choosing the format of the additive components often requires non-trivial data exploration. Here, as an alternative, we propose PrAda-net, a one-hidden-layer neural network trained with proximal gradient descent and adaptive lasso. PrAda-net automatically adjusts the size and architecture of the neural network to reflect the complexity and structure of the data. The compact network obtained by PrAda-net can be translated to additive model components, making it suitable for non-parametric statistical modeling with automatic model selection. We demonstrate PrAda-net on simulated data, where we compare the test error performance, variable importance and variable subset identification properties of PrAda-net to other lasso-based regularization approaches for neural networks. We also apply PrAda-net to the massive U.K. black smoke data set, to demonstrate how PrAda-net can be used to model complex and heterogeneous data with spatial and temporal components. In contrast to classical, statistical non-parametric approaches, PrAda-net requires no preliminary modeling to select the functional forms of the additive components, yet still results in an interpretable model representation.
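The core training idea the abstract describes, a one-hidden-layer network fitted with proximal gradient descent under an adaptive lasso penalty, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `train_prada_like`, the choice of tanh activation, and in particular the adaptive-lasso weights (here taken from the initial weights as a stand-in for a first-stage unpenalized fit) are assumptions for the sake of a runnable example.

```python
import numpy as np

def soft_threshold(w, t):
    # Proximal operator of the weighted l1 norm: element-wise soft-thresholding.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def train_prada_like(X, y, n_hidden=20, lam=0.1, lr=0.01, n_iter=2000,
                     gamma=1.0, seed=0):
    """Sketch: one-hidden-layer network trained with proximal gradient
    descent and an adaptive-lasso penalty on the weights (names and
    defaults are illustrative, not from the paper)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    W1 = rng.normal(scale=0.5, size=(p, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=n_hidden)
    b2 = 0.0
    # Adaptive-lasso penalty weights: inversely proportional to a pilot
    # estimate of each weight's magnitude. As a stand-in, the initial
    # weights are used here (an assumption, not the paper's procedure).
    a1 = 1.0 / (np.abs(W1) + 1e-3) ** gamma
    a2 = 1.0 / (np.abs(W2) + 1e-3) ** gamma
    for _ in range(n_iter):
        H = np.tanh(X @ W1 + b1)           # hidden-unit activations
        r = H @ W2 + b2 - y                # residuals
        # Gradients of the mean squared loss
        gW2 = H.T @ r / n
        gb2 = r.mean()
        dH = np.outer(r, W2) * (1.0 - H ** 2)
        gW1 = X.T @ dH / n
        gb1 = dH.mean(axis=0)
        # Gradient step followed by the proximal (soft-thresholding) step;
        # biases are left unpenalized.
        W2 = soft_threshold(W2 - lr * gW2, lr * lam * a2)
        b2 -= lr * gb2
        W1 = soft_threshold(W1 - lr * gW1, lr * lam * a1)
        b1 -= lr * gb1
    return W1, b1, W2, b2
```

Because the proximal step sets small weights exactly to zero, each surviving hidden unit ends up connected to only a subset of the inputs; grouping units by that subset is what allows the trained network to be read off as additive model components.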

Additive models

Neural networks

Regularization

Adaptive lasso

Model selection

Non-parametric regression

Authors

Oskar Allerbo

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

University of Gothenburg

Rebecka Jörnsten

University of Gothenburg

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Computational Statistics

0943-4062 (ISSN) 1613-9658 (eISSN)

Vol. 37, no. 4, pp. 2029-2047

Subject categories

Mathematics

DOI

10.1007/s00180-021-01190-4

More information

Last updated

2024-03-07