Improved Spectral Norm Regularization for Neural Networks
Paper in proceedings, 2023
Our results demonstrate that this improved spectral regularization scheme outperforms L2-regularization as well as the previously used upper-bounding technique. Moreover, our results suggest that exact spectral norm regularization and exact Frobenius norm regularization have comparable performance. We analyze these empirical findings in light of the mathematical relations that hold between the spectral and Frobenius norms. Lastly, in light of our evaluation, we revisit an argument concerning the strong adversarial protection that Jacobian regularization provides and show that it can be misleading.
In summary, we propose a new regularization method and contribute to the practical and theoretical understanding of when one regularization method should be preferred over another.
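The abstract contrasts exact spectral norm regularization with exact Frobenius norm regularization of the network Jacobian and analyzes the mathematical relations between the two norms. As a minimal, hypothetical illustration (not the paper's implementation), the sketch below estimates the spectral norm of a fixed matrix — a stand-in for an input-output Jacobian — by power iteration, and checks the standard inequality ||J||_2 ≤ ||J||_F ≤ √rank(J)·||J||_2 that bounds how far the two penalties can differ.

```python
import numpy as np

def spectral_norm(J, n_iter=50, seed=0):
    """Estimate the largest singular value of J by power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=J.shape[1])
    v /= np.linalg.norm(v)
    u = J @ v
    for _ in range(n_iter):
        u = J @ v
        u /= np.linalg.norm(u)
        v = J.T @ u
        v /= np.linalg.norm(v)
    # Rayleigh-quotient estimate of the top singular value
    return float(u @ J @ v)

rng = np.random.default_rng(1)
J = rng.normal(size=(5, 8))  # stand-in for a network's input-output Jacobian
s = spectral_norm(J)                  # ||J||_2 (largest singular value)
f = float(np.linalg.norm(J, "fro"))   # ||J||_F (root sum of squared entries)
r = min(J.shape)                      # upper bound on rank(J)
# ||J||_2 <= ||J||_F <= sqrt(rank) * ||J||_2
assert s <= f + 1e-9 and f <= np.sqrt(r) * s + 1e-9
```

Because the two norms coincide for rank-one Jacobians and differ by at most a √rank factor in general, penalizing one also controls the other, which is consistent with the comparable performance the abstract reports.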
Keywords
Jacobian regularization
robustness
Deep Learning
Authors
Anton Johansson
Chalmers, Mathematical Sciences, Applied Mathematics and Statistics
Niklas Engsner
Chalmers, Computer Science and Engineering (Chalmers), Data Science
Claes Strannegård
Chalmers, Computer Science and Engineering (Chalmers), Data Science and AI
Petter Mostad
Chalmers, Mathematical Sciences, Applied Mathematics and Statistics
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
0302-9743 (ISSN), 1611-3349 (eISSN)
Vol. 13890 LNCS, pp. 181-201. 978-3-031-33498-6 (ISBN)
Umeå, Sweden
Subject Categories
Computer Science
DOI
10.1007/978-3-031-33498-6_13