Dear Colleagues,
We would like to invite you to the following SPASS seminar, jointly organized by UniPi, SNS, UniFi and UniSi: *A two-scale complexity measure for stochastic neural networks* by Massimiliano Datres (Università di Trento)
The seminar will take place on TUE, 12.12.2023 at 14:00 CET in Aula Seminari, Dipartimento di Matematica, UNIPI and streamed online at the link below.
The organizers,
A. Agazzi, G. Bet, A. Caraceni, F. Grotto, G. Zanco
https://sites.google.com/unipi.it/spass
--------------------------------------------
*Abstract:* Over-parametrized deep learning models achieve outstanding performance on complex tasks such as image classification, object detection, and natural language processing. Despite the risk of overfitting, these parametric models show impressive generalization after training. Defining appropriate complexity measures is therefore crucial for understanding and quantifying the generalization capabilities of deep learning models. In this talk, I will introduce a new complexity measure, the two-scale effective dimension (2sED), a box-covering dimension with respect to the metric induced by the Fisher information matrix of the parametric model. I will then show how the 2sED can be used to derive a generalization bound. Furthermore, I will present an approximation of the 2sED for Markovian models, called the lower 2sED, which can be computed sequentially layer by layer at a lower computational cost. Finally, I will present experimental evidence that the post-training performance of a given parametric model correlates with both the 2sED and the lower 2sED.
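For readers unfamiliar with the central ingredient of the abstract, the following is a minimal, purely illustrative sketch (not the speaker's construction): it computes the Fisher information matrix of a toy logistic model and counts its dominant eigenvalues as a crude eigenvalue-based "effective dimension". The model, sample size, and eigenvalue cutoff are all assumptions chosen for illustration; the actual 2sED is a box-covering dimension and is defined in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parametric model (assumption, for illustration only):
# logistic regression with p(y=1|x) = sigmoid(w . x).
d = 5
w = rng.normal(size=d)
X = rng.normal(size=(200, d))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

p = sigmoid(X @ w)

# Fisher information matrix of the Bernoulli model,
# F = E_x[ p(1-p) x x^T ], with the expectation over y taken analytically
# and the expectation over x replaced by the empirical average.
F = (X * (p * (1 - p))[:, None]).T @ X / X.shape[0]

# F induces a metric on parameter space; counting eigenvalues above a
# cutoff (hypothetical threshold 1e-3 * largest eigenvalue) gives a
# simple spectral notion of effective dimension.
eigvals = np.linalg.eigvalsh(F)
eff_dim = int((eigvals > 1e-3 * eigvals.max()).sum())
print(eff_dim)
```

This spectral count is only a rough proxy: it measures how many parameter directions the Fisher metric does not flatten out, which is the intuition behind dimension-type complexity measures such as the 2sED.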