Double Descent in Feature Selection: Revisiting LASSO and Basis Pursuit
Paper in proceedings, 2021

We present a novel analysis of feature selection in linear models using the convex frameworks of the least absolute shrinkage and selection operator (LASSO) and basis pursuit (BP). Our analysis applies to a general overparameterized scenario. When the number of features and the number of data samples grow proportionally, we obtain precise expressions for the asymptotic generalization error of LASSO and BP. Considering a mixture of strong and weak features, we provide insights into regularization trade-offs and double descent for ℓ1-norm minimization. We validate these results with numerical experiments.
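To make the setting concrete, the following sketch fits LASSO on a synthetic overparameterized linear model with a mixture of strong and weak features and measures test error across regularization strengths. The solver (plain coordinate descent), the data dimensions, the feature-strength values, and the noise level are all illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/(2n))||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    r = y - X @ b                      # running residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]                 # remove coordinate j
            rho = X[:, j] @ r / n
            # soft-thresholding update
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * b[j]                 # add coordinate j back
    return b

rng = np.random.default_rng(0)
n, p = 100, 300                 # overparameterized: p > n
beta = np.zeros(p)
beta[:10] = 2.0                 # strong features (assumed values)
beta[10:40] = 0.1               # weak features (assumed values)
X = rng.standard_normal((n, p))
y = X @ beta + 0.5 * rng.standard_normal(n)

X_test = rng.standard_normal((2000, p))
y_test = X_test @ beta

for lam in [0.01, 0.1, 0.5]:
    b = lasso_cd(X, y, lam)
    err = np.mean((X_test @ b - y_test) ** 2)
    print(f"lam={lam}: test MSE = {err:.3f}")
```

Sweeping `lam` over a finer grid (and varying the ratio p/n) is one way to trace out the generalization-error curves whose asymptotic limits the paper characterizes; the basis pursuit solution corresponds to the `lam -> 0` interpolating limit.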

Overparameterization

LASSO

CGMT

Basis Pursuit

Authors

David Bosch

Data Science and AI

Ashkan Panahi

Data Science and AI

Ayca Ozcelikkale

Uppsala University

Thirty-eighth International Conference on Machine Learning, ICML 2021

ICML 2021 Workshop on Overparameterization: Pitfalls & Opportunities (OPPO)
Virtual

Areas of Advance

Information and Communication Technology

Subject Categories

Computational Mathematics

Signal Processing

Computer Science

More Information

Last Updated

2023-10-23