Numerical analysis of least squares and perceptron learning for classification problems
Journal article, 2020

This work presents a study of regularized and non-regularized versions of the perceptron learning and least squares algorithms for classification problems. The Fréchet derivatives for the least squares and perceptron algorithms are derived. Different Tikhonov regularization techniques for choosing the regularization parameter are discussed. Numerical experiments demonstrate the performance of the perceptron and least squares algorithms in classifying simulated and experimental data sets.
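As an illustration of the two linear classifiers the abstract compares, the following is a minimal sketch, not the paper's implementation: Tikhonov-regularized least squares in closed form, and the classic perceptron update rule. All function names, the toy data, and the regularization value are assumptions for illustration only.

```python
import numpy as np

def least_squares_classifier(X, y, reg=0.0):
    # Tikhonov-regularized normal equations: w = (X^T X + reg*I)^{-1} X^T y.
    # With reg=0 this reduces to the non-regularized least squares solution.
    A = X.T @ X + reg * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)

def perceptron(X, y, epochs=100):
    # Classic perceptron learning for labels y in {-1, +1}:
    # update w only on misclassified points until no update is needed.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (xi @ w) <= 0:   # misclassified (or on the boundary)
                w = w + yi * xi
                updated = True
        if not updated:              # converged on separable data
            break
    return w

# Hypothetical linearly separable toy data; a bias column is appended to X.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([2, 2], 0.3, (20, 2)),
               rng.normal([-2, -2], 0.3, (20, 2))])
X = np.hstack([X, np.ones((40, 1))])
y = np.array([1.0] * 20 + [-1.0] * 20)

w_ls = least_squares_classifier(X, y, reg=1e-2)
w_pc = perceptron(X, y)
acc_ls = float(np.mean(np.sign(X @ w_ls) == y))
acc_pc = float(np.mean(np.sign(X @ w_pc) == y))
```

On well-separated data both classifiers recover a perfect decision boundary; the paper's point is how regularization affects the least squares solution on harder, noisier sets.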

perceptron learning algorithm

least squares algorithm

linear classifiers

classification problem

Tikhonov's regularization

Author

Larisa Beilina

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Open Journal of Discrete Applied Mathematics

2617-9679 (ISSN) 2617-9687 (eISSN)

Vol. 3 (2), pp. 30-49

Subject categories

Computational Mathematics

Control Engineering

Signal Processing

DOI

10.30538/psrp-odam2020.0035

More information

Created

2020-12-16