SwitchPath: Enhancing Exploration in Neural Networks Learning Dynamics
Paper in proceedings, 2025

We introduce SwitchPath, a novel stochastic activation function that enhances neural network exploration, performance, and generalization by probabilistically toggling between the activation of a neuron and its negation. SwitchPath draws inspiration from analogies between neural networks and decision trees, as well as from the exploratory and regularizing properties of Dropout. Unlike Dropout, which intermittently reduces network capacity by deactivating neurons, SwitchPath maintains continuous activation, allowing networks to dynamically explore alternative information pathways while fully utilizing their capacity. Building on the concept of ϵ-greedy algorithms to balance exploration and exploitation, SwitchPath improves generalization over traditional activation functions, and the exploration of alternative paths happens during training without sacrificing computational efficiency. This paper presents the theoretical motivations, practical implementations, and empirical results, showcasing the advantages of SwitchPath over established stochastic activation mechanisms.
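A minimal PyTorch sketch of the mechanism the abstract describes, assuming the toggle amounts to flipping the sign of each neuron's activation with a small probability ε during training (in the ε-greedy spirit) while using the plain activation at inference; the class name, the per-neuron Bernoulli sampling, and ReLU as the base activation are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class SwitchPathSketch(nn.Module):
    """Illustrative sketch of a sign-toggling stochastic activation.

    Assumption (not the authors' reference code): "toggling between the
    activation of a neuron and its negation" is read as emitting -f(x)
    instead of f(x) with a small exploration probability eps during
    training; at evaluation time the deterministic activation f(x) is used.
    """

    def __init__(self, eps: float = 0.1, base_activation: nn.Module = None):
        super().__init__()
        self.eps = eps
        self.base_activation = base_activation if base_activation is not None else nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.base_activation(x)
        if not self.training or self.eps == 0.0:
            return y  # pure exploitation at inference time
        # Per-neuron Bernoulli switch: -1 with probability eps, +1 otherwise,
        # so every unit stays active but may route a negated signal forward.
        flip = (torch.rand_like(y) < self.eps).to(y.dtype)
        return y * (1.0 - 2.0 * flip)


# Usage example (hypothetical layer sizes):
# layer = nn.Sequential(nn.Linear(128, 64), SwitchPathSketch(eps=0.05))
```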

Deep Learning Theory

Deep Neural Network Algorithms

Authors

Antonio Di Cecco

G. d'Annunzio University of Chieti-Pescara

Andrea Papini

University of Gothenburg

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Carlo Metta

Istituto di Scienza e Tecnologie dell'Informazione A. Faedo

Marco Fantozzi

University of Parma

Silvia Giulia Galfré

University of Pisa

Francesco Morandin

University of Parma

Maurizio Parton

G. d'Annunzio University of Chieti-Pescara

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

0302-9743 (ISSN), 1611-3349 (eISSN)

Vol. 15243 LNAI, pp. 275-291
978-3-031-78976-2 (ISBN)

27th International Conference on Discovery Science, DS 2024
Pisa, Italy

Subject Categories (SSIF 2025)

Computer Sciences

Computer Engineering

DOI

10.1007/978-3-031-78977-9_18
