Double Descent in Feature Selection: Revisiting LASSO and Basis Pursuit
Paper in proceedings, 2021

We present a novel analysis of feature selection in linear models using the convex framework of the least absolute shrinkage and selection operator (LASSO) and basis pursuit (BP). Our analysis applies to a general overparameterized scenario. When the number of features and the number of data samples grow proportionally, we obtain precise expressions for the asymptotic generalization error of LASSO and BP. By considering a mixture of strong and weak features, we provide insights into the regularization trade-offs underlying double descent for ℓ1-norm minimization. We validate these results with numerical experiments.

Overparameterization

LASSO

CGMT

Basis Pursuit

Authors

David Bosch

Data Science and AI

Ashkan Panahi

Data Science and AI

Ayca Ozcelikkale

Uppsala University

Thirty-eighth International Conference on Machine Learning, ICML 2021

ICML 2021 Workshop on Overparameterization: Pitfalls & Opportunities (OPPO)
Virtual

Areas of Advance

Information and Communication Technology

Subject Categories

Computational Mathematics

Signal Processing

Computer Science

More information

Latest update

10/23/2023