Dear all,
On Monday, October 6, at 14h00 in Aula Dal Passo of Tor Vergata Math Department, RoMaDS (https://www.mat.uniroma2.it/~rds/events.php) will host
Federico Bassetti (Politecnico di Milano) with the seminar
"Scaling Limits of Bayesian Neural Networks: Gaussian Processes and Mixtures"
Abstract: In large neural networks, key theoretical insights emerge in the infinite-width limit, where the number of neurons per layer grows while depth stays fixed. In this regime, networks with Gaussian-initialized weights define a mixture of Gaussian processes with random covariance, which converges in the infinite-width limit to a pure Gaussian process with deterministic covariance. However, this Gaussian limit sacrifices descriptive power, as it lacks the ability to learn dependent features and produce output correlations that reflect observed labels. Motivated by these limitations, we explore deep linear networks in the proportional limit, where both depth and width diverge at a fixed ratio. In this setting, the network converges to a nontrivial Gaussian mixture, both at the prior and posterior level. This structure allows the network to capture dependencies in the outputs—an ability lost in the infinite-width limit but retained in finite networks. Our contribution extends previous works by explicitly characterizing, for linear activation functions, the limiting distribution as a nontrivial mixture of Gaussians. The talk is based on
F. Bassetti, L. Ladelli, P. Rotondo. Proportional infinite-width infinite-depth limit for deep linear neural networks (2024+) https://arxiv.org/abs/2411.15267
F. Bassetti, M. Gherardi, A. Ingrosso, M. Pastore, P. Rotondo. Feature learning in finite-width Bayesian deep linear networks with multiple outputs and convolutional layers. Journal of Machine Learning Research 26 (2025) 1-35.
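For those curious about the "mixture of Gaussian processes with random covariance" mentioned in the abstract, here is a minimal numerical sketch (not taken from the speaker's papers, and with all names and parameters chosen for illustration only). For a one-hidden-layer linear network with i.i.d. Gaussian weights, the output conditioned on the first layer is Gaussian with a random variance; the sketch shows this variance concentrating around its deterministic limit as the width grows, which is the collapse of the mixture into a pure Gaussian in the infinite-width limit:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.ones(3) / np.sqrt(3)  # unit-norm input, so the limiting variance is 1

def conditional_variance(width, n_draws=2000):
    # One-hidden-layer linear net: f(x) = w2 @ (W1 @ x) / sqrt(width).
    # Conditionally on W1, the output is N(0, ||W1 x||^2 / width):
    # a Gaussian with *random* variance, i.e. a Gaussian mixture overall.
    vs = []
    for _ in range(n_draws):
        W1 = rng.standard_normal((width, x.size))
        vs.append(np.sum((W1 @ x) ** 2) / width)
    return np.array(vs)

for width in (10, 100, 1000):
    v = conditional_variance(width)
    print(f"width={width:5d}  mean variance={v.mean():.3f}  spread={v.std():.3f}")
```

The spread of the random variance shrinks like 1/sqrt(width), so at large width every draw of the weights gives essentially the same covariance: the mixture degenerates and the dependence between outputs that the mixture structure could encode is lost, as the abstract describes.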
We encourage in-person participation. Should you be unable to come, here is the link to the Teams streaming:
https://teams.microsoft.com/l/message/19:rfsL73KX-fw86y1YnXq2nk5VnZFwPU-iIPE...
The seminar is part of the Excellence Project MatMod@TOV.