A Unified View on PAC-Bayes Bounds for Meta-Learning
Paper in proceedings, 2022

Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base-learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of generalization error: the environment-level and task-level gaps, which result from observing a finite number of tasks and a finite number of data samples per task, respectively. In this paper, by upper-bounding arbitrary convex functions that link the expected and empirical losses at the environment level and at the per-task level, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.
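As an illustrative sketch of the two-gap structure the abstract describes (not the paper's actual bounds), one can combine a classical McAllester/Maurer-style complexity term at each level: an environment-level term that shrinks with the number of observed tasks, and a task-level term that shrinks with the number of samples per task. The function names and the even split of the confidence parameter below are assumptions for illustration.

```python
import math

def _complexity(kl, count, delta):
    # Maurer-style complexity term: sqrt((KL + ln(2*sqrt(count)/delta)) / (2*count))
    return math.sqrt((kl + math.log(2.0 * math.sqrt(count) / delta)) / (2.0 * count))

def meta_pac_bayes_bound(emp_loss, kl_env, kl_task_avg, m, n, delta):
    """Schematic two-level PAC-Bayes bound (illustration only).

    emp_loss    : average empirical loss over the m observed tasks
    kl_env      : KL divergence between hyper-posterior and hyper-prior
    kl_task_avg : average per-task KL between posterior and prior
    m, n        : number of tasks / number of samples per task
    delta       : overall confidence level, split evenly across the two levels
    """
    env_gap = _complexity(kl_env, m, delta / 2.0)        # shrinks as more tasks are observed
    task_gap = _complexity(kl_task_avg, n, delta / 2.0)  # shrinks with samples per task
    return emp_loss + env_gap + task_gap

bound = meta_pac_bayes_bound(emp_loss=0.1, kl_env=2.0, kl_task_avg=5.0,
                             m=50, n=1000, delta=0.05)
```

The key qualitative point this sketch captures is that both gap terms must vanish for the bound to be tight: seeing many tasks (large `m`) does not compensate for few samples per task (small `n`), and vice versa.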

Authors

Arezou Rezazadeh

Chalmers, Electrical Engineering, Communication, Antennas and Optical Networks

Proceedings of Machine Learning Research

2640-3498 (eISSN)

Vol. 162, pp. 18576-18595

International Conference on Machine Learning
Baltimore, Maryland, USA

Subject categories

Control engineering

Signal processing

Computer science

More information

Last updated

2023-10-27