Analysis of Interpolating Regression Models and the Double Descent Phenomenon
Paper in proceedings, 2023

A regression model with more parameters than data points in the training set is overparametrized and has the capability to interpolate the training data. Based on the classical bias-variance tradeoff expressions, it is commonly assumed that models which interpolate noisy training data generalize poorly. In some cases, this is not true: the best models obtained are overparametrized, and the testing error exhibits the double descent behavior as the model order increases. In this contribution, we provide some analysis to explain the double descent phenomenon, first reported in the machine learning literature. We focus on interpolating models derived from the minimum norm solution to the classical least-squares problem and also briefly discuss model fitting using ridge regression. We derive a result based on the behavior of the smallest singular value of the regression matrix that explains the peak location and the double descent shape of the testing error as a function of model order.

least-squares

machine learning

non-linear regression

matrix analysis

Authors

Tomas McKelvey

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

IFAC-PapersOnLine

2405-8971 (ISSN) 2405-8963 (eISSN)

Vol. 56, No. 2, pp. 5869-5874
9781713872344 (ISBN)

22nd IFAC World Congress
Yokohama, Japan

Subject categories

Computational Mathematics

Control Engineering

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1016/j.ifacol.2023.10.084

More information

Last updated

2024-02-23