Dear all, 

On Wednesday, May 22nd, at 14h30 in Aula Dal Passo of Tor Vergata Math Department, RoMaDS (https://www.mat.uniroma2.it/~rds/events.php) will host 

Dario Trevisan (Università di Pisa) with the seminar

"Gaussian Approximation and Bayesian Posterior Distribution in Random Deep Neural Networks

Abstract: We establish novel rates for the Gaussian approximation of randomly initialized deep neural networks with Gaussian parameters and Lipschitz activation functions, in the so-called wide limit, i.e., where the sizes of all hidden layers become large. Using the Wasserstein metric and related functional analytic tools, we demonstrate that the distribution of the output of a network and the corresponding Gaussian approximation are at a distance that scales inversely with the width of the network, surpassing previously established rates.
Furthermore, we extend our findings to approximate the exact Bayesian posterior distribution of the network when the likelihood is a bounded Lipschitz function of the network output, on a finite training set. This includes common cases, such as the Gaussian likelihood, which is defined as the exponential of the negative mean squared error. Our inequalities thus shed light on the network's Gaussian behavior by quantitatively capturing the distributional convergence results in the wide limit.
The exposition will aim to be self-contained, by introducing all the basic concepts related to artificial neural networks and Bayesian statistics to a mathematical audience. Based on arXiv:2203.07379 (joint with A. Basteri) and arXiv:2312.11737.
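(Illustration not part of the announcement.) For readers who would like a concrete feel for the wide-limit Gaussian behavior described in the abstract, the short Python sketch below samples many independently initialized deep networks with i.i.d. Gaussian weights and biases and a Lipschitz activation (tanh), and checks how close the empirical distribution of a scalar output is to a Gaussian as the hidden-layer width grows. The depth, widths, fan-in scaling of the weights, and the use of skewness/kurtosis as a rough normality check are illustrative assumptions, not taken from the papers.

import numpy as np

# Toy check (assumptions: 1/sqrt(fan_in) weight scale, unit bias scale, tanh activation):
# a single output coordinate of a random deep network at a fixed input should look
# increasingly Gaussian as the hidden-layer width grows.

def random_network_output(width, depth, n_samples, rng, d_in=4):
    """Scalar outputs of independent random networks evaluated at one fixed input."""
    x = np.ones(d_in) / np.sqrt(d_in)          # fixed input point
    outputs = np.empty(n_samples)
    for s in range(n_samples):
        h = x
        for _ in range(depth):                  # hidden layers of size `width`
            W = rng.normal(0.0, 1.0 / np.sqrt(h.size), size=(width, h.size))
            b = rng.normal(0.0, 1.0, size=width)
            h = np.tanh(W @ h + b)              # Lipschitz activation
        w_out = rng.normal(0.0, 1.0 / np.sqrt(width), size=width)
        outputs[s] = w_out @ h                  # scalar readout
    return outputs

rng = np.random.default_rng(0)
for width in (8, 64, 512):
    out = random_network_output(width, depth=3, n_samples=2000, rng=rng)
    # Gaussian reference values: skewness 0, excess kurtosis 0.
    skew = np.mean((out - out.mean())**3) / out.std()**3
    kurt = np.mean((out - out.mean())**4) / out.std()**4
    print(f"width={width:4d}  skewness={skew:+.3f}  excess kurtosis={kurt - 3:+.3f}")

This is only a heuristic Monte Carlo illustration of the distributional convergence; the papers cited above quantify it with explicit Wasserstein-distance bounds.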

We encourage in-person participation. Should you be unable to come, here is the link to the Teams streaming:



The seminar is part of the Excellence Project MatMod@TOV.