Kernel Subspace Learning for Pattern Classification
Book chapter, 2018

Kernel methods are nonparametric feature extraction techniques that aim to boost the learning capability of machine learning algorithms through nonlinear transformations. However, a major challenge of kernel methods in their basic form is that the computational complexity and memory requirements do not scale well with the size of the training set. Kernel approximation is commonly employed to resolve this issue. Essentially, kernel approximation is equivalent to learning an approximated subspace of the high-dimensional feature vector space induced and characterized by the kernel function. With streaming data acquisition, approximated subspaces can be constructed adaptively. Explicit feature vectors are then extracted by a transformation onto the approximated subspace, and linear learning techniques can subsequently be applied. From a computational point of view, operations in kernel methods are easily parallelized, so modern computing infrastructures can be exploited for efficient computation. Moreover, the extracted explicit feature vectors can readily be interfaced with other learning techniques.
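As a minimal sketch of the Nyström-style kernel approximation the abstract describes (the function names, the RBF kernel choice, and the landmark-selection strategy are illustrative assumptions, not taken from the chapter): an m x m kernel matrix over a small set of landmark points is eigendecomposed, and its leading eigenvectors define the approximated subspace onto which explicit, finite-dimensional feature vectors are projected.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances, then RBF kernel values.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def nystrom_features(X, landmarks, gamma=1.0):
    # K_mm: kernel among the m landmarks; K_nm: kernel between data and landmarks.
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    K_nm = rbf_kernel(X, landmarks, gamma)
    # Eigendecomposing the small m x m matrix characterizes the approximated subspace.
    w, V = np.linalg.eigh(K_mm)
    keep = w > 1e-10                  # drop numerically null directions
    # Explicit feature map: project onto the subspace and whiten;
    # Phi @ Phi.T then approximates the full n x n kernel matrix.
    return K_nm @ V[:, keep] / np.sqrt(w[keep])

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
landmarks = X[:10]                    # hypothetical choice: first m = 10 points
Phi = nystrom_features(X, landmarks, gamma=0.5)
```

Any linear classifier can then be trained directly on the rows of `Phi`, which is where the scalability benefit comes from: the cost is governed by the number of landmarks m rather than the full training-set size n.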

Nyström

Kernel approximation

CUDA

Spark

GPU

Subspace learning

Classification

Authors

Yinan Yu

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Konstantinos I. Diamantaras

T.E.I. of Thessaloniki

Tomas McKelvey

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

S. Y. Kung

Princeton University

Adaptive Learning Methods for Nonlinear System Modeling

127-147
9780128129760 (ISBN)

Areas of Advance

Information and Communication Technology

Subject Categories

Computational Mathematics

Signal Processing

Computer Science

Roots

Basic sciences

DOI

10.1016/B978-0-12-812976-0.00008-7

More information

Last updated

2023-03-21