Generalization Bounds via Information Density and Conditional Information Density
Journal article, 2020

We present a general approach, based on an exponential inequality, to derive bounds on the generalization error of randomized learning algorithms. Using this approach, we provide bounds on the average generalization error as well as bounds on its tail probability, for both the PAC-Bayesian and single-draw scenarios. Specifically, for the case of sub-Gaussian loss functions, we obtain novel bounds that depend on the information density between the training data and the output hypothesis. When suitably weakened, these bounds recover many of the information-theoretic bounds available in the literature. We also extend the proposed exponential-inequality approach to the setting recently introduced by Steinke and Zakynthinou (2020), where the learning algorithm depends on a randomly selected subset of the available training data. For this setup, we present bounds for bounded loss functions in terms of the conditional information density between the output hypothesis and the random variable determining the subset choice, given all training data. Through our approach, we recover the average generalization bound presented by Steinke and Zakynthinou (2020) and extend it to the PAC-Bayesian and single-draw scenarios. For the single-draw scenario, we also obtain novel bounds in terms of the conditional α-mutual information and the conditional maximal leakage.
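For concreteness, one classical average-generalization bound of the kind that the weakened sub-Gaussian bounds recover is the mutual-information bound of Xu and Raginsky (2017). A sketch of its statement, with W denoting the output hypothesis, S the training set of n samples, and a σ-sub-Gaussian loss:

```latex
% Xu-Raginsky mutual-information bound (illustrative of the
% bounds recovered by the information-density approach):
\left| \mathbb{E}\!\left[ \operatorname{gen}(W, S) \right] \right|
  \le \sqrt{ \frac{2\sigma^{2}}{n} \, I(W; S) },
% where gen(W,S) is the generalization error (population risk
% minus empirical risk) and I(W;S) is the mutual information
% between the hypothesis and the training data.
```

The information-density bounds described in the abstract refine such results by keeping the dependence on the random realization of (W, S) rather than only on its expectation I(W; S).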

PAC-Bayes

Information Theory

Statistical Learning Theory

Authors

Fredrik Hellström

Chalmers, Electrical Engineering, Communication, Antennas and Optical Networks

Giuseppe Durisi

Chalmers, Electrical Engineering, Communication, Antennas and Optical Networks

IEEE Journal on Selected Areas in Information Theory

2641-8770 (eISSN)

Vol. 1, Issue 3, pp. 824-839, Article no. 3040992

INNER: information theory of deep neural networks

Chalmers AI Research Centre (CHAIR), 2019-01-01 -- 2021-12-31.

Areas of Advance

Information and Communication Technology

Subject Categories

Computational Mathematics

Probability Theory and Statistics

Discrete Mathematics

Mathematical Analysis

DOI

10.1109/JSAIT.2020.3040992

More information

Last updated

2024-04-04