Improved Spectral Norm Regularization for Neural Networks
Paper in proceedings, 2023
Our results demonstrate that this improved spectral regularization scheme outperforms L2 regularization as well as the previously used upper-bounding technique. Moreover, our results suggest that exact spectral norm regularization and exact Frobenius norm regularization perform comparably. We analyze these empirical findings in light of the mathematical relations that hold between the spectral and Frobenius norms. Lastly, based on our evaluation, we revisit an argument concerning the strong adversarial protection provided by Jacobian regularization and show that it can be misleading.
In summary, we propose a new regularization method and contribute to the practical and theoretical understanding of when one regularization method should be preferred over another.
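To illustrate the kind of regularizer discussed above, the following is a minimal sketch (not the authors' implementation) of exact spectral norm regularization of the input-output Jacobian, estimated with power iteration via PyTorch's autograd. The function name `spectral_penalty`, the number of power iterations, the penalty weight 0.01, and the toy model are all illustrative assumptions, not values taken from the paper.

```python
# Sketch: penalize the largest singular value of the Jacobian J_f(x),
# estimated by power iteration on J^T J using jvp/vjp products so the
# full Jacobian is never materialized. Illustrative only.
import torch
import torch.nn as nn
from torch.autograd.functional import jvp, vjp

def spectral_penalty(model, x, n_iter=5):
    """Approximate sigma_max(J_f(x)) by power iteration (assumed scheme)."""
    f = lambda inp: model(inp)
    v = torch.randn_like(x)
    v = v / v.norm()
    for _ in range(n_iter):
        # u = J v : forward-mode Jacobian-vector product
        _, u = jvp(f, x, v, create_graph=True)
        # v = J^T u : reverse-mode vector-Jacobian product
        _, v = vjp(f, x, u, create_graph=True)
        v = v / (v.norm() + 1e-12)
    # With v close to the top right singular vector, ||J v|| ~ sigma_max.
    _, u = jvp(f, x, v, create_graph=True)
    return u.norm()

# Usage: add the penalty to the task loss for a single sample.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
x = torch.randn(1, 10)
loss = nn.functional.cross_entropy(model(x), torch.tensor([0]))
loss = loss + 0.01 * spectral_penalty(model, x)
loss.backward()
```

In practice the penalty would be averaged over (a subset of) the training batch; replacing `u.norm()` with the squared Frobenius norm of the Jacobian would give the Frobenius-norm counterpart compared against in the abstract.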
Jacobian regularization
robustness
Deep Learning
Authors
Anton Johansson
Chalmers, Mathematical Sciences, Applied Mathematics and Statistics
Niklas Engsner
Chalmers, Computer Science and Engineering, Data Science
Claes Strannegård
Chalmers, Computer Science and Engineering, Data Science and AI
Petter Mostad
Chalmers, Mathematical Sciences, Applied Mathematics and Statistics
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
0302-9743 (ISSN), 1611-3349 (eISSN)
Vol. 13890 LNCS, p. 181-201, 978-3-031-33498-6 (ISBN)
Umeå, Sweden
Subject categories
Computer Science
DOI
10.1007/978-3-031-33498-6_13