Controlled Descent Training
Preprint, 2023

In this work, we develop a novel, model-based training method for artificial neural networks (ANNs), supported by optimal control theory. The method augments training labels in order to robustly guarantee convergence of the training loss and to improve the convergence rate. Dynamic label augmentation is proposed within the framework of gradient descent training, where the convergence of the training loss is controlled. First, we capture training behavior with the help of empirical Neural Tangent Kernels (NTKs) and borrow tools from systems and control theory to analyze both the local and global training dynamics (e.g., stability and reachability). Second, we propose to dynamically alter the gradient descent training mechanism via fictitious labels as control inputs and an optimal state-feedback policy, thereby enforcing locally H2-optimal and convergent training behavior. The resulting algorithm, Controlled Descent Training (CDT), guarantees local convergence and opens new possibilities for the analysis, interpretation, and design of ANN architectures. The applicability of the method is demonstrated on standard regression and classification problems.
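The empirical NTK mentioned in the abstract is the Gram matrix of per-sample parameter gradients, K[i, j] = ⟨∂f(x_i)/∂θ, ∂f(x_j)/∂θ⟩, whose spectrum governs the linearized gradient-descent training dynamics that the paper analyzes. The following is a minimal sketch of computing an empirical NTK for a tiny one-hidden-layer network; the network architecture, the finite-difference Jacobian, and all function names here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def empirical_ntk(f, theta, X, eps=1e-6):
    """Empirical NTK: K[i, j] = <df(x_i)/dtheta, df(x_j)/dtheta>.
    The Jacobian of the network outputs w.r.t. the parameters is
    estimated by central finite differences (illustrative choice)."""
    n, p = X.shape[0], theta.size
    J = np.zeros((n, p))
    for k in range(p):
        d = np.zeros(p)
        d[k] = eps
        J[:, k] = (f(theta + d, X) - f(theta - d, X)) / (2 * eps)
    return J @ J.T  # symmetric positive semi-definite Gram matrix

def net(theta, X, h=4):
    """Toy one-hidden-layer tanh network with scalar output."""
    d = X.shape[1]
    W1 = theta[:d * h].reshape(d, h)
    b1 = theta[d * h:d * h + h]
    w2 = theta[d * h + h:]
    return np.tanh(X @ W1 + b1) @ w2

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))             # 5 samples, 3 features
theta = rng.normal(size=3 * 4 + 4 + 4)  # W1, b1, w2 flattened
K = empirical_ntk(net, theta, X)        # 5 x 5 kernel matrix
```

Under gradient-descent training with a squared loss, the residuals evolve (to first order) as a linear system driven by K, which is what lets the paper treat the eigenvalues of K with standard stability and reachability tools.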

Keywords: label augmentation, gradient descent training, Neural Tangent Kernel, optimal control, locally convergent learning

Authors

Viktor Andersson

Chalmers, Electrical Engineering, Systems and Control

Balázs Varga

Chalmers, Electrical Engineering, Systems and Control

Vincent Szolnoky

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Andreas Syren

Rebecka Jörnsten

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Balázs Adam Kulcsár

Chalmers, Electrical Engineering, Systems and Control

Robustly and Optimally Controlled Training Of neural Networks I (OCTON I)

Centiro, 2019-10-15 -- 2023-10-15.

Robustly and Optimally Controlled Training Of neural Networks II (OCTON II)

Centiro, 2020-05-01 -- 2025-04-30.

Subject Categories

Computer Engineering

Control Engineering

Areas of Advance

Transport

Related datasets

arXiv: 2303.09216 (DOI: 10.48550/arXiv.2303.09216)


Latest update

10/6/2023