Dear all,


This winter, RoMaDS ( https://www.mat.uniroma2.it/~rds/events.php ) will once again offer a mini-course on the mathematics of machine learning at Tor Vergata University in Rome. This year we are very happy to host Giovanni Conforti (Università di Padova & École Polytechnique, Paris) and Alain Durmus (École Polytechnique, Paris) from the 19th to the 22nd of February. They will present a 9-hour mini-course titled


An introduction to Score-based Generative Models


The course is also addressed to non-experts in the field. It will be divided into 3 slots of 3 hours each (see the full abstract at the end of this email for a tentative program).



You can find the full syllabus, with more details on each lecture, on the RoMaDS webpage: https://www.mat.uniroma2.it/~rds/Files/SyllabusSGM.pdf


All the lectures will be held in Aula Dal Passo, Department of Mathematics, Tor Vergata University. We encourage in-person participation. Should you be unable to come, here is the link to the event on Teams:


https://teams.microsoft.com/l/meetup-join/19%3arfsL73KX-fw86y1YnXq2nk5VnZFwPU-iIPEmqet8NCg1%40thread.tacv2/1705929161715?context=%7b%22Tid%22%3a%2224c5be2a-d764-40c5-9975-82d08ae47d0e%22%2c%22Oid%22%3a%22650fc4a8-4cec-4bd2-87bc-90d134074fe6%22%7d


The seminar is part of the Excellence Project MatMod@TOV.


Best,


Michele Salvi



Abstract of the course “An introduction to Score-based Generative Models”


In simple terms, generative modeling consists in learning a map capable of generating new data instances that resemble a given set of observations, starting from a simple prior distribution, most often a standard Gaussian. This course aims to provide a mathematical introduction to generative models, and in particular to Score-based Generative Models (SGMs). SGMs have gained prominence for their ability to generate realistic data across diverse domains, making them a popular tool for researchers and practitioners in machine learning. Participants will learn about the methodological and theoretical foundations of these models, as well as some of their practical applications.
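For readers less familiar with this formalism, a minimal mathematical sketch (the notation below is illustrative and not taken from the course material): writing p_data for the distribution of the observations, one seeks a map G_\theta such that

\[
  Z \sim \mathcal{N}(0, I_d), \qquad \operatorname{Law}\bigl(G_\theta(Z)\bigr) \approx p_{\mathrm{data}},
\]

that is, the push-forward of the Gaussian prior through G_\theta should be close to the data distribution.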

The first two lectures motivate the use of generative models, introduce their formalism and present two simple yet relevant examples: energy-based models and Generative Adversarial Networks. In the third and fourth lectures we present score-based diffusion models and explain how they provide an algorithmic framework for the basic idea that sampling from the time-reversal of a diffusion process converts noise into new data instances. We shall do so following two different approaches: a first, elementary one that relies only on discrete transition probabilities, and a second one based on stochastic calculus. After this introduction, we derive sharp theoretical guarantees of convergence for score-based diffusion models, bringing together ideas from stochastic control, functional inequalities and regularity theory for Hamilton-Jacobi-Bellman equations. The course ends with an overview of some of the most recent and sophisticated algorithms, such as flow matching and diffusion Schrödinger bridges (DSB), which bring an (entropic) optimal transport insight into generative modeling.
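To make the time-reversal idea concrete, here is one standard formulation in the Ornstein-Uhlenbeck (variance-preserving) setting; the exact conventions used in the lectures may differ. Data is progressively noised by the forward SDE

\[
  \mathrm{d}X_t = -X_t\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}W_t, \qquad X_0 \sim p_{\mathrm{data}}, \quad 0 \le t \le T,
\]

whose marginals p_t approach a standard Gaussian for large T. Denoting by \nabla \log p_t the score of p_t, the time-reversed process

\[
  \mathrm{d}Y_t = \bigl[\,Y_t + 2\,\nabla \log p_{T-t}(Y_t)\,\bigr]\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t, \qquad Y_0 \sim p_T \approx \mathcal{N}(0, I_d),
\]

satisfies Y_T \sim p_{\mathrm{data}}, so running it from Gaussian noise produces new data instances; in practice the unknown score is replaced by a neural network trained by score matching.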