Generalization Bounds via Information Density and Conditional Information Density
Journal article, 2020

We present a general approach, based on an exponential inequality, to derive bounds on the generalization error of randomized learning algorithms. Using this approach, we provide bounds on the average generalization error as well as on its tail probability, for both the PAC-Bayesian and single-draw scenarios. Specifically, for the case of sub-Gaussian loss functions, we obtain novel bounds that depend on the information density between the training data and the output hypothesis. When suitably weakened, these bounds recover many of the information-theoretic bounds available in the literature. We also extend the proposed exponential-inequality approach to the setting recently introduced by Steinke and Zakynthinou (2020), in which the learning algorithm depends on a randomly selected subset of the available training data. For this setup, we present bounds for bounded loss functions in terms of the conditional information density between the output hypothesis and the random variable determining the subset choice, given all training data. Through our approach, we recover the average generalization bound presented by Steinke and Zakynthinou (2020) and extend it to the PAC-Bayesian and single-draw scenarios. For the single-draw scenario, we also obtain novel bounds in terms of the conditional α-mutual information and the conditional maximal leakage.
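To make the abstract's central objects concrete, the following is a minimal LaTeX sketch of the information density and of one representative result that the suitably weakened bounds recover (the average-generalization bound of Xu and Raginsky, 2017). The notation (S for the n training samples, W for the output hypothesis, gen for the population-minus-empirical loss gap) is assumed here for illustration and is not copied verbatim from the paper.

% Information density between training data S and output hypothesis W:
\[
  \imath(W,S) \;=\; \log \frac{\mathrm{d}P_{WS}}{\mathrm{d}\left(P_W \otimes P_S\right)}(W,S).
\]
% For a \sigma-sub-Gaussian loss, a change of measure combined with
% sub-Gaussianity yields an exponential inequality of the form
\[
  \mathbb{E}_{P_{WS}}\!\left[\exp\!\left(\lambda\,\mathrm{gen}(W,S)
    \;-\; \frac{\lambda^{2}\sigma^{2}}{2n} \;-\; \imath(W,S)\right)\right] \;\le\; 1,
  \qquad \lambda \in \mathbb{R}.
\]
% Applying Jensen's inequality and optimizing over \lambda recovers the
% average-case bound, with I(W;S) the mutual information:
\[
  \bigl|\mathbb{E}\!\left[\mathrm{gen}(W,S)\right]\bigr|
    \;\le\; \sqrt{\frac{2\sigma^{2}}{n}\, I(W;S)},
  \qquad I(W;S) = \mathbb{E}\!\left[\imath(W,S)\right].
\]

The PAC-Bayesian and single-draw tail bounds, as well as the conditional variants in the Steinke-Zakynthinou setting, follow the same route with the information density replaced by its conditional counterpart.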

Keywords

PAC-Bayes, Information Theory, Statistical Learning Theory

Authors

Fredrik Hellström

Chalmers, Electrical Engineering, Communication, Antennas and Optical Networks

Giuseppe Durisi

Chalmers, Electrical Engineering, Communication, Antennas and Optical Networks

IEEE Journal on Selected Areas in Information Theory

2641-8770 (eISSN)

Vol. 1, No. 3, pp. 824–839. Article no. 3040992

Project

INNER: information theory of deep neural networks

Chalmers AI Research Centre (CHAIR), 2019-01-01 -- 2021-12-31.

Areas of Advance

Information and Communication Technology

Subject Categories

Computational Mathematics

Probability Theory and Statistics

Discrete Mathematics

Mathematical Analysis

DOI

10.1109/JSAIT.2020.3040992

More information

Latest update

4/4/2024