Regularised Weights in Statistical Models
Licentiate thesis, 2021

For flexible and overparameterised models like neural networks, overfitting is a notorious problem that makes it hard to produce accurate predictions in real-life usage. Overfitting is particularly likely in the presence of errors in the training data, such as mislabelled observations or outliers. It is therefore essential either to inspect the data carefully or, more realistically, to adapt the training algorithm so that variance and overfitting are reduced.

To reduce the risk of overfitting, a common approach is to manipulate the loss function: either by adding a penalty on the model's flexibility, which reduces variance at the cost of increased bias, or by weighting the loss contributions of individual data points in order to reduce the influence of harmful data.
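Schematically, both ideas fit into a single objective. As an illustrative sketch (the symbols below are generic notation, not taken from the thesis), a weighted and penalised training loss takes the form:

```latex
% w_i weights the loss of observation i, v_j weights the penalty on
% parameter theta_j, and lambda trades data fit against flexibility.
\[
  L(\theta) \;=\; \sum_{i=1}^{n} w_i\, \ell\bigl(y_i, f(x_i;\theta)\bigr)
  \;+\; \lambda \sum_{j} v_j\, P(\theta_j)
\]
```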

This thesis introduces a self-maintained method to reweight different components (observations and/or the parameter regularisation) of the loss function during training. With some care in the choice of model, these weights can be solved for analytically, so that the method ultimately amounts to nothing more than a modification of the loss function. Because of this, the resulting method can easily be combined with other regularisation techniques.
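As a minimal sketch of how observation weights can be "solved for", consider a hypothetical quadratic anchor penalty lam*(w - 1)**2 on each weight (an assumption for this sketch; the thesis may use a different penalty). Minimising over the weight in closed form and substituting back turns the raw loss into a bounded, robust loss in the model parameters alone:

```python
import numpy as np

def solved_weights(losses, lam):
    """Closed-form minimiser of w*l + lam*(w - 1)**2 over w >= 0.

    The quadratic anchor penalty is an assumption made for this
    sketch; the point is only that the weights admit a closed form.
    """
    return np.clip(1.0 - losses / (2.0 * lam), 0.0, None)

def effective_loss(losses, lam):
    """Loss after the solved weights are substituted back in.

    Equals l - l**2/(4*lam) for l <= 2*lam and the constant lam
    beyond, i.e. a bounded (outlier-robust) transform of the raw
    loss, so training reduces to minimising a modified loss only.
    """
    w = solved_weights(losses, lam)
    return w * losses + lam * (w - 1.0) ** 2
```

Because the weights disappear into a modified loss, any other regulariser can be added on top unchanged.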

When the weighting technique is applied to observations in a setting with mislabelled data, it produces more robust training than an unweighted model and detects mislabelled examples in the data.
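Continuing the sketch above (same hypothetical penalty and helper functions), observations whose weights are driven to zero are natural candidates for label inspection:

```python
losses = np.array([0.11, 0.25, 4.80, 0.18])  # toy per-example losses
w = solved_weights(losses, lam=1.0)          # from the sketch above
suspected = np.flatnonzero(w == 0.0)         # fully down-weighted points
print(suspected)                             # -> [2]
```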

When applied to the regularisation penalty, the weights reduce the bias introduced by the regularisation term while preserving crucial properties of the original penalty.
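For intuition on how weighting a penalty can reduce its bias, here is a sketch of a classical relative, the adaptive lasso (Zou, 2006): coefficients that are large in a pilot fit receive small penalty weights and are shrunk less. The thesis's entropy-weighted scheme learns its weights during training rather than from a pilot fit, so this is an analogy, not the thesis method:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
beta = np.zeros(10); beta[:3] = [3.0, -2.0, 1.5]  # sparse ground truth
y = X @ beta + rng.normal(scale=0.5, size=200)

pilot = Lasso(alpha=0.1).fit(X, y)               # stage 1: pilot fit
v = 1.0 / (np.abs(pilot.coef_) + 1e-3)           # per-coefficient weights
adapted = Lasso(alpha=0.1).fit(X * (1.0 / v), y) # stage 2: weighted penalty
coef = adapted.coef_ / v                         # back to original scale
```

Rescaling the columns of X by 1/v_j is equivalent to penalising beta_j by alpha*v_j, so coefficients that survive the pilot fit are shrunk less in the second stage, which counteracts the lasso's bias while keeping sparsity.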

Keywords

Lasso, Robust Statistics, Deep Learning, Weighted Loss, Noisy Labels, Neural Networks, Regularisation

Pascal, online (Zoom: https://chalmers.zoom.us/j/63940246794 Password: 199493)
Opponent: Jonas Wallin, Department of Statistics, Lund University

Author

Olof Zetterqvist

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Included papers

Zetterqvist, O., Jörnsten, R., Jonasson, J. Robust Neural Network Classification via Double Regularization.

Zetterqvist, O., Jonasson, J. Entropy weighted regularisation, a general way to debias regularisation penalties.

Infrastructure

C3SE (Chalmers Centre for Computational Science and Engineering)

Subject Categories

Probability Theory and Statistics

Publisher

Chalmers

