Dear colleagues,

I would like to draw your attention to the summer school on Mathematics for Machine Learning, which will be held at ICTP in Trieste from 15 to 26 June 2026.

Website: https://indico.ictp.it/event/11148

Some details below:

--------------------------------------------------------------------------------------------

Lecturers:
Francesca Mignacco (Princeton University)
Andrea Montanari (Stanford University)

Course by Andrea Montanari

This summer school provides an introduction to theoretical ideas that have been developed with the objective of understanding machine learning methods and their domain of applicability. The focus will be on proof techniques and general mathematical tools.

1. Empirical risk minimization and empirical process theory.
Objective: Understand how learning is formalized in terms of generalization guarantees, and how the latter emerge from uniform control over suitably restricted function classes.
a. Uniform convergence guarantees for learning
b. Complexity bounds for neural networks
c. Tools: strong laws of large numbers, concentration of measure, Rademacher complexity.

2. Generalization in the linear regime.
Objective: Understand how and when good generalization can be achieved while violating the (naive) assumptions of uniform convergence. In particular, characterize the generalization error of kernel and random features methods.
a. Interpolation and benign overfitting
b. Kernel methods
c. Random features and neural tangent models
d. Tools: random matrix theory

3. Feature learning in large networks.
Objective: Analysis of simple neural network models (two-layer networks) outside the linear/neural tangent regime.
a. Mean field theory
b. Multi-index models
c. Tools: propagation of chaos, Wasserstein gradient flows.

4. Sampling and generative methods.
Objective: Describe and mathematically justify architectures for generative models. Present correctness guarantees, as well as limitations in some simple settings.
a. Autoregressive models, diffusion models, probability flows.
b. Sampling guarantees under assumptions on the score estimation.
c. Impossibility results.
d. Tools: stochastic localization, functional inequalities, computation-information gaps.

Course by Francesca Mignacco

The lectures will present an overview of the statistical physics approach to high-dimensional learning problems. The main concepts and methods from the mean-field theory of disordered systems will be introduced through prototypical examples that are amenable to analytic characterization. More precisely, the course syllabus will be as follows:

1. Motivation and background.
Objective: Introduce the statistical physics perspective on high-dimensional learning problems and the main concepts and settings covered throughout the course.
(a) The jargon of statistical physics: thermodynamic limit, order parameters, typical-case scenario.
(b) The perceptron.
(c) The teacher-student paradigm.

2. Statics of learning.
Objective: Understand the Bayesian learning framework and how to apply it to study the impact of data structure and architectural bias on performance.
(a) Generalization error in the perceptron: Bayes-optimal performance vs empirical risk minimization. Introduction to the replica method.
(b) Memory capacity: from random points to neural manifolds. Applications to deep learning and neuroscience.
(c) Models of data structure.
(d) Deep linear networks and back-propagating kernel renormalization.

3. Dynamics of learning.
Objective: Understand how to derive effective low-dimensional descriptions of the learning dynamics and apply them to study commonly used training algorithms such as stochastic gradient descent.
(a) Online learning in two-layer neural networks.
(b) Batch learning: dynamical mean-field theory, cavity method and path integral formulation.

Participants:
The school should be of interest to a very broad range of students from essentially all areas of Pure and Applied Mathematics. On the one hand, the topics of the school are particularly addressed to those working in disciplines directly involved with Machine Learning (such as mathematical statistics, probability, information theory, numerical analysis, statistical mechanics); on the other hand, the lectures are developed starting from a limited required background in these specific disciplines, so that students with good mathematical foundations in other areas can attend as well. Hence this school can provide an opportunity for students with a different background to enter an exciting new area of research, either to find new connections at the theoretical level or to gain further skills that are in extremely high demand at every level. The school may also be suitable for students who are not enrolled in a mathematics PhD but are working in related areas, such as statistics, computer science, or engineering.

Organizing committee:
Claudio Arezzo (ICTP), Jean Barbier (ICTP), Filippo Bracci (Roma Tor Vergata and INdAM), Domenico Marinucci (Roma Tor Vergata and INdAM), Cristina Trombetti (Napoli Federico II and INdAM)

----------------------------------------------------------------------------------------------------------------

Thank you for your attention,

Domenico Marinucci

--
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
Domenico Marinucci
Dipartimento di Matematica
Università di Roma Tor Vergata
https://sites.google.com/view/domenicomarinucci
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@