Improved Spectral Norm Regularization for Neural Networks
Paper in proceedings, 2023

We improve on a line of research that seeks to regularize the spectral norm of the Jacobian of the input-output mapping for deep neural networks. While previous work relies on upper-bounding techniques, we propose a scheme that targets the exact spectral norm. We evaluate this regularization method empirically with respect to its generalization performance and robustness.
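As a rough illustration of the idea (not the authors' code), the exact spectral norm of the input-output Jacobian can be estimated by power iteration using only Jacobian-vector and vector-Jacobian products, so the Jacobian never has to be formed explicitly. A minimal PyTorch-style sketch, where the function f, the input x, the number of iterations, and the way the penalty enters the loss are all illustrative assumptions:

import torch
from torch.autograd.functional import jvp, vjp

def jacobian_spectral_norm(f, x, n_iters=10):
    # Estimate the largest singular value of J = df/dx at x by power iteration,
    # alternating u <- J v (JVP) and v <- J^T u (VJP) without forming J.
    v = torch.randn_like(x)
    v = v / v.norm()
    sigma = x.new_tensor(0.0)
    for _ in range(n_iters):
        u = jvp(f, x, v, create_graph=True)[1]   # u = J v, has the shape of f(x)
        u = u / (u.norm() + 1e-12)
        v = vjp(f, x, u, create_graph=True)[1]   # v = J^T u, has the shape of x
        sigma = v.norm()                          # ||J^T u|| approximates sigma_max(J)
        v = v / (sigma + 1e-12)
    return sigma

# Hypothetical usage: add the (differentiable) estimate as a penalty term.
# penalty = jacobian_spectral_norm(lambda z: model(z), inputs) ** 2
# loss = criterion(model(inputs), targets) + reg_coeff * penalty

Because the estimate is built with create_graph=True, gradients flow back into the network parameters, so it can be used directly as a regularizer; the exact penalty form and number of power iterations used in the paper may differ.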

Our results demonstrate that this improved spectral regularization scheme outperforms L2-regularization as well as the previously used upper-bounding technique. Moreover, our results suggest that exact spectral norm regularization and exact Frobenius norm regularization have comparable performance. We analyze these empirical findings in light of the mathematical relations that hold between the spectral and Frobenius norms. Lastly, based on our evaluation, we revisit an argument concerning the strong adversarial protection that Jacobian regularization provides and show that it can be misleading.
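For context, the standard relation between the two norms of a Jacobian J with singular values sigma_i (basic linear algebra, not a result from the paper) is

\[
\|J\|_2 = \sigma_{\max}(J) \;\le\; \|J\|_F = \Big(\sum_i \sigma_i^2\Big)^{1/2} \;\le\; \sqrt{\operatorname{rank}(J)}\,\|J\|_2 ,
\]

so the two norms coincide when J has rank at most one and never differ by more than a factor of the square root of the rank, which gives one lens for interpreting why the two regularizers can behave similarly.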

In summary, we propose a new regularization method and contribute to the practical and theoretical understanding of when one regularization method should be preferred over another.

Jacobian regularization

robustness

Deep Learning

Authors

Anton Johansson

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Niklas Engsner

Chalmers, Computer Science and Engineering (Chalmers), Data Science

Claes Strannegård

Chalmers, Computer Science and Engineering (Chalmers), Data Science and AI

Petter Mostad

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

0302-9743 (ISSN) 1611-3349 (eISSN)

Vol. 13890 LNCS, pp. 181-201
978-3-031-33498-6 (ISBN)

20th International Conference on Modeling Decisions for Artificial Intelligence, MDAI 2023
Umeå, Sweden

Subject Categories

Computer Science

DOI

10.1007/978-3-031-33498-6_13

More information

Latest update

6/27/2023