Numerical analysis of least squares and perceptron learning for classification problems
Journal article, 2020

This work presents a study of regularized and non-regularized versions of the perceptron learning and least squares algorithms for classification problems. The Fréchet derivatives for the least squares and perceptron algorithms are derived. Different techniques for choosing the Tikhonov regularization parameter are discussed. Numerical experiments demonstrate the performance of the perceptron and least squares algorithms in classifying simulated and experimental data sets.
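For orientation, the sketch below illustrates the two classifiers discussed in the abstract in their standard textbook form, not the article's own implementation: a Tikhonov-regularized least squares fit solved via the regularized normal equations with regularization parameter gamma, and the classical perceptron update rule, both applied to simulated two-class data. The data set, the function names, and the choice gamma = 0.1 are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): Tikhonov-regularized least squares
# and perceptron learning on a toy 2D two-class problem, assuming the
# standard formulations of both algorithms.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: two Gaussian clouds with labels t in {-1, +1}.
n = 100
X_pos = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(n, 2))
X_neg = rng.normal(loc=[-2.0, -2.0], scale=1.0, size=(n, 2))
X = np.vstack([X_pos, X_neg])
t = np.hstack([np.ones(n), -np.ones(n)])

# Augment with a bias column so the classifier is sign(A @ w).
A = np.hstack([X, np.ones((2 * n, 1))])

def least_squares_classifier(A, t, gamma=0.0):
    """Solve min_w ||A w - t||^2 + gamma ||w||^2 via the regularized
    normal equations (A^T A + gamma I) w = A^T t."""
    m = A.shape[1]
    return np.linalg.solve(A.T @ A + gamma * np.eye(m), A.T @ t)

def perceptron(A, t, eta=1.0, epochs=100):
    """Classical perceptron: update w on every misclassified sample."""
    w = np.zeros(A.shape[1])
    for _ in range(epochs):
        errors = 0
        for a_i, t_i in zip(A, t):
            if t_i * (a_i @ w) <= 0:   # misclassified (or on the boundary)
                w += eta * t_i * a_i
                errors += 1
        if errors == 0:                # converged: training data separated
            break
    return w

w_ls = least_squares_classifier(A, t, gamma=0.1)  # gamma is an assumed value
w_pc = perceptron(A, t)

for name, w in [("least squares", w_ls), ("perceptron", w_pc)]:
    acc = np.mean(np.sign(A @ w) == t)
    print(f"{name}: training accuracy = {acc:.3f}")
```

In this toy setting both linear classifiers separate the clouds; the regularization term mainly stabilizes the least squares solve when A^T A is ill-conditioned, which is the situation the article's parameter-choice rules address.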

Keywords

Tikhonov's regularization

Least squares algorithm

Classification problem

Perceptron learning algorithm

Linear classifiers

Author

Larisa Beilina

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Open Journal of Discrete Applied Mathematics

2617-9679 (ISSN) 2617-9687 (eISSN)

Vol. 3, No. 2, pp. 30-49

Subject Categories

Computational Mathematics

Control Engineering

Signal Processing

DOI

10.30538/psrp-odam2020.0035

Latest update

2/18/2021