Precise Asymptotic Analysis of Deep Random Feature Models
Paper in proceedings, 2023

We provide exact asymptotic expressions for the performance of regression by an L-layer deep random feature (RF) model, in which the input is mapped through multiple random embeddings and non-linear activation functions. For this purpose, we establish two key steps: first, we prove a novel universality result for RF models with deterministic data, showing that a deep random feature model is equivalent to a deep linear Gaussian model that matches it in the first and second moments at each layer. Second, we apply the Convex Gaussian Min-Max Theorem multiple times to obtain the exact behavior of deep RF models. We further characterize how the eigenvalue distribution varies across the layers of the equivalent Gaussian model, demonstrating that depth has a tangible effect on model performance even though only the last layer of the model is trained.
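The model studied above can be sketched as follows: frozen random weight matrices and non-linearities map the input through L layers, and only a linear readout on the final features is fit (here by ridge regression). This is a minimal illustrative sketch, not the paper's implementation; all dimensions, the tanh activation, the ridge penalty, and the synthetic data are assumptions chosen for the example.

```python
import numpy as np

def deep_rf_features(X, widths, rng, activation=np.tanh):
    """Map inputs through L frozen random layers: H <- act(H @ W_l / sqrt(d_l)).

    The random weights W_l are drawn once and never trained; they only
    define the feature map of the deep random feature (RF) model.
    """
    H = X
    for d_out in widths:
        d_in = H.shape[1]
        W = rng.standard_normal((d_in, d_out))
        H = activation(H @ W / np.sqrt(d_in))
    return H

# Illustrative setup (dimensions and targets are hypothetical, not from the paper)
rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# L = 3 random layers, each of width 100
Phi = deep_rf_features(X, widths=[100, 100, 100], rng=rng)

# Only the last (linear readout) layer is trained, via ridge regression
lam = 1e-2
theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
train_mse = np.mean((Phi @ theta - y) ** 2)
```

The universality result says that, asymptotically, replacing `Phi` by features from a linear Gaussian model with matching first and second moments at each layer leaves the regression performance unchanged.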

Learning Curves

Convex Gaussian Min-Max Theorem

Asymptotic Analysis

Random Features Model

Universality

Author

David Bosch

Chalmers, Computer Science and Engineering (Chalmers), Data Science and AI

Ashkan Panahi

Chalmers, Computer Science and Engineering (Chalmers), Data Science and AI

Babak Hassibi

California Institute of Technology (Caltech)

Proceedings of Machine Learning Research

2640-3498 (eISSN)

Vol. 195, pp. 4132-4179

36th Annual Conference on Learning Theory (COLT 2023)
Bangalore, India

Subject Categories

Probability Theory and Statistics

Computer Science

More information

Latest update

9/29/2023