A Unified View on PAC-Bayes Bounds for Meta-Learning
Paper in proceedings, 2022

Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base-learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of error: the environment-level and task-level gaps, which result from observing only a finite number of tasks and a finite number of data samples per task, respectively. By upper bounding arbitrary convex functions that link the expected and empirical losses at the environment and per-task levels, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.
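
To make the two-level structure concrete, a representative PAC-Bayes meta-learning bound from the prior literature (in the spirit of Pentina and Lampert, 2014, and Amit and Meir, 2018) is sketched below. The notation is assumed for illustration only ($n$ observed tasks, $m$ samples per task, a hyper-prior $\mathcal{P}$ and hyper-posterior $\mathcal{Q}$ over base-learner priors $P$, and per-task posteriors $Q_i$ with empirical losses $\hat{\mathcal{L}}(Q_i, S_i)$), and this is the generic form rather than the exact bound derived in the paper. With probability at least $1 - \delta$,

$$
\mathcal{L}(\mathcal{Q}) \;\le\; \frac{1}{n}\sum_{i=1}^{n} \hat{\mathcal{L}}(Q_i, S_i)
\;+\; \underbrace{\sqrt{\frac{\mathrm{KL}(\mathcal{Q}\,\|\,\mathcal{P}) + \ln(2n/\delta)}{2(n-1)}}}_{\text{environment-level gap}}
\;+\; \underbrace{\frac{1}{n}\sum_{i=1}^{n} \sqrt{\frac{\mathrm{KL}(\mathcal{Q}\,\|\,\mathcal{P}) + \mathbb{E}_{P\sim\mathcal{Q}}\left[\mathrm{KL}(Q_i\,\|\,P)\right] + \ln(2nm/\delta)}{2(m-1)}}}_{\text{task-level gap}} .
$$

The two complexity terms shrink as the number of tasks $n$ and the per-task sample size $m$ grow, matching the two sources of the meta-generalization gap described above; the paper generalizes this fixed square-root form by deriving bounds from arbitrary convex functions that link the expected and empirical losses.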

Author

Arezou Rezazadeh

Chalmers, Electrical Engineering, Communication, Antennas and Optical Networks

Proceedings of Machine Learning Research

2640-3498 (eISSN)

Vol. 162, pp. 18576-18595

International Conference on Machine Learning
Baltimore, Maryland, USA

Subject Categories

Control Engineering

Signal Processing

Computer Science

Latest update

10/27/2023