Good evening,
Following the vote, the topic chosen for this new edition of the Reading Course will be /Riemannian Optimization/; we have drafted a programme and contacted some of you to get started (so as to allow time to prepare the material). The first meeting will be on 17/11 at 16:00, in Aula Riunioni.
You can find all the details here: https://numpi.dm.unipi.it/reading-course-in-numerical-analysis/
There is still room to volunteer to give the remaining talks.
See you soon!
Fabio, Leonardo and Stefano.
Dear all,
We are writing because we intend to start a new edition of the Reading Course in Numerical Analysis. It is a series of meetings in which recent results close to research topics of interest are presented, usually by master's and/or PhD students. Each iteration of the course is devoted to a specific topic and can be followed independently of the others. The proposal for this semester will be one of:
- Riemannian Optimization,
- Koopman Operators.
Help us decide between the two by filling in the following form (where you will also find some bibliographic information and details on the content of the two proposals):
- https://forms.gle/QfHpfE1WdjJH9Z6Q8
If you have other possible topics you would like to suggest, you can enter them in the form, together with rough preferences for the day of the week.
You can find further information on past editions on the website:
- https://numpi.dm.unipi.it/reading-course-in-numerical-analysis/
And you can stay up to date on developments by subscribing to the mailing list:
- https://lists.dm.unipi.it/postorius/lists/reading-num.lists.dm.unipi.it/
Apart from these announcement emails, information about the organization will circulate /only/ on the dedicated mailing list, so if you are interested in following the course, please subscribe there.
You have until *Wednesday 18 October* (inclusive) to express your preferences.
Best regards,
Fabio Durastante, Stefano Massei & Leonardo Robol
Title: On the influence of stochastic rounding bias in implementing gradient descent with applications in low-precision training,
Speaker(s): Mrs Lu Xia, Eindhoven University of Technology,
Date and time: 18 Jul 2023, 14:00 (Europe/Rome),
Lecture series: Seminar on Numerical Analysis,
Venue: Dipartimento di Matematica (Aula Magna).
You can access the full event here: https://events.dm.unipi.it/e/201
Abstract
--------
In the context of low-precision computation for the training of neural networks with the gradient descent method (GD), the occurrence of deterministic rounding errors often leads to stagnation or adversely affects the convergence of the optimizers. The employment of unbiased stochastic rounding (SR) may partially capture gradient updates that are lower than the minimum rounding precision, with a certain probability. We provide a theoretical elucidation for the stagnation observed in GD when training neural networks with low-precision computation. We analyze the impact of floating-point round-off errors on the convergence behavior of GD with a particular focus on convex problems. Two biased stochastic rounding methods, signed-SR$_\varepsilon$ and SR$_\varepsilon$, are proposed, which have been demonstrated to eliminate the stagnation of GD and to result in significantly faster convergence than SR in low-precision floating-point computation. We validate our theoretical analysis by training a binary logistic regression model on the Cifar10 database and a 4-layer fully-connected neural network model on the MNIST database, utilizing a 16-bit floating-point representation and various rounding techniques. The experiments demonstrate that signed-SR$_\varepsilon$ and SR$_\varepsilon$ may achieve higher classification accuracy than rounding to the nearest (RN) and SR, with the same number of training epochs. It is shown that a faster convergence may be obtained by the new rounding methods with 16-bit floating-point representation than by RN with 32-bit floating-point representation.
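As a companion to the abstract, here is a minimal sketch of *unbiased* stochastic rounding on an illustrative fixed-point grid (not an IEEE low-precision format, and not the authors' code): rounding up with probability proportional to the fractional distance makes the expected rounded value equal to the input, so updates smaller than the grid spacing survive on average instead of being flushed to zero as with round-to-nearest.

```python
import numpy as np

def stochastic_round(x, eps=2.0**-8, rng=None):
    """Round each entry of x to the grid {k * eps} at random.

    Rounding up with probability equal to the fractional distance to
    the upper grid point makes the rounding unbiased: E[SR(x)] = x.
    Illustrative fixed-point grid, not an IEEE low-precision format.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x, dtype=float)
    lo = np.floor(x / eps) * eps       # nearest grid point below
    frac = (x - lo) / eps              # fractional part in [0, 1)
    up = rng.random(x.shape) < frac    # round up with probability frac
    return lo + up * eps

# A gradient update smaller than eps is lost by round-to-nearest,
# but is preserved on average under stochastic rounding:
eps = 2.0**-8
update = eps / 4
rounded = stochastic_round(np.full(100000, update), eps)
print(abs(rounded.mean() - update) < eps / 10)
```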
--
Indico :: Email Notifier
https://events.dm.unipi.it/e/201
I am also forwarding this to this list, since the topic may be of interest.
Best regards to all!
Federico
-------- Forwarded Message --------
Subject: [Random] (PMS)^2 talk - Cazzaniga - July 3 at 17.00
Date: Tue, 27 Jun 2023 15:48:24 +0200
From: Carlo Orrieri <carlo.orrieri(a)unipv.it>
To: random(a)fields.dm.unipi.it
Dear colleagues,
We are happy to announce the following *hybrid - that is, in person with
online streaming -* talk:
Speaker: *Alberto Cazzaniga* (Area Science Park)
Title: What is the probability that a random symmetric tensor is close
to rank-one?
*Abstract*: We address the problem of estimating the probability that a
real symmetric tensor is close to rank-one tensors, motivated by the
many applications of low-rank approximation. We discuss how the question
can be addressed by studying metric invariants of the real Veronese
variety thanks to Weyl's tube formula. We describe the role of the
reach and the curvature coefficients with respect to the Bombieri-Weyl
metric in obtaining an explicit estimate, and outline the main ideas
employed to tackle their calculation. We conclude by discussing some
asymptotic results for the case of rational normal curves. Based on
joint work with A. Lerario and A. Rosana.
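To make the question in the title concrete, here is a small Monte Carlo sketch: sample symmetrized Gaussian tensors (a simplification; the talk works with the Bombieri-Weyl metric) and measure each one's relative distance to the set of symmetric rank-one tensors via the symmetric higher-order power method. All parameters (dimension, sample count, threshold) are illustrative.

```python
import numpy as np

def sym3(n, rng):
    """Random order-3 symmetric tensor: i.i.d. Gaussian entries,
    symmetrized over all 6 index permutations (a simplification
    of the Bombieri-Weyl ensemble used in the talk)."""
    a = rng.standard_normal((n, n, n))
    return (a + a.transpose(0, 2, 1) + a.transpose(1, 0, 2)
            + a.transpose(1, 2, 0) + a.transpose(2, 0, 1)
            + a.transpose(2, 1, 0)) / 6.0

def rank_one_distance(t, iters=200):
    """Relative distance of t to the symmetric rank-one tensors.

    Uses the symmetric higher-order power method; since this may
    find only a local maximizer of T(v,v,v), the result is an
    upper bound on the true distance.
    """
    n = t.shape[0]
    v = np.ones(n) / np.sqrt(n)
    for _ in range(iters):
        w = np.einsum('ijk,j,k->i', t, v, v)
        v = w / np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', t, v, v, v)
    fro = np.linalg.norm(t)
    # Best approximation lam * v⊗v⊗v has residual ||T||^2 - lam^2.
    return np.sqrt(max(fro**2 - lam**2, 0.0)) / fro

rng = np.random.default_rng(0)
dists = [rank_one_distance(sym3(4, rng)) for _ in range(200)]
prob = float(np.mean([d < 0.9 for d in dists]))  # empirical probability
print(round(float(np.mean(dists)), 3), prob)
```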
Date and time: *Monday July 3, 17:00-18:00 (Rome time zone)*
Place: Laboratorio didattico, Dipartimento di Matematica, Università di Pavia, via Ferrata 5, Pavia.
Join the Zoom meeting:
https://us02web.zoom.us/j/81154155086?pwd=Rit2Ymd3eE1lUWIrc0ErQlRMcGNVdz09
Meeting ID: 811 5415 5086
Passcode: 317594
This talk is part of the
*(PMS)^2: Pavia-Milano Seminar series on Probability and Mathematical
Statistics*
organized jointly by the University of Milano-Bicocca, the University of
Pavia, and the Politecnico di Milano.
Participation is free and welcome!
Best regards
The organizers (Carlo Orrieri, Maurizia Rossi, Margherita Zanella)
Title: A Tensor Gradient Cross for Hamilton-Jacobi-Bellman equations,
Speaker(s): Luca Saluzzi, Scuola Normale Superiore, Pisa,
Date and time: 5 Jun 2023, 16:00 (Europe/Rome),
Lecture series: Seminar on Numerical Analysis,
Venue: Dipartimento di Matematica (Aula Seminari).
You can access the full event here: https://events.dm.unipi.it/e/199
Abstract
--------
The Hamilton-Jacobi-Bellman (HJB) equation plays a central role in optimal control and differential games, enabling the computation of robust controls in feedback form. The main disadvantage of this approach is the so-called curse of dimensionality, since the HJB equation and the dynamical system live in the same, possibly high-dimensional, space. In this talk, I will present a data-driven method for approximating high-dimensional HJB equations based on tensor decompositions. The approach presented in this talk is based on the knowledge of the value function and its gradient on sample points and on a tensor train decomposition of the value function. The data will be collected by two possible techniques: the Pontryagin Maximum Principle and State-Dependent Riccati Equations. The numerical experiments will demonstrate an at most linear complexity in the dimension and better stability in the presence of noise. Moreover, I will present an application to an agent-based model and a comparison with Deep Learning techniques. Finally, time permitting, I will consider the coupling of the proposed method with Model Order Reduction techniques and their application to boundary feedback control for the Navier-Stokes equations.
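The data-collection step mentioned in the abstract can be illustrated in the linear-quadratic special case, where the algebraic Riccati equation gives the value function and its gradient in closed form; these (state, value, gradient) samples are what a tensor-train approximation would then be fitted to. This is only a sketch under that simplifying assumption (the matrices below are illustrative), not the method of the talk, which handles nonlinear dynamics via state-dependent Riccati equations.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Linear-quadratic sketch: for dynamics x' = A x + B u and cost
# ∫ (x'Qx + u'Ru) dt, the algebraic Riccati equation yields the value
# function V(x) = x' P x and the feedback law u = -R^{-1} B' P x.
# Illustrative matrices; the talk's method targets nonlinear problems.
A = np.array([[0.0, 1.0], [-1.0, -0.5]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.eye(1)

P = solve_continuous_are(A, B, Q, R)

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2))             # sampled states
values = np.einsum('si,ij,sj->s', X, P, X)   # V(x)  = x' P x
grads = 2.0 * X @ P                          # ∇V(x) = 2 P x
K = np.linalg.solve(R, B.T @ P)              # feedback gain
controls = -X @ K.T                          # u = -K x
```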
Title: Stochastic probing methods for estimating the trace of functions of sparse symmetric matrices,
Speaker(s): Michele Rinelli, Scuola Normale Superiore,
Date and time: 30 May 2023, 16:00 (Europe/Rome),
Lecture series: Seminar on Numerical Analysis,
Venue: Dipartimento di Matematica (Aula Riunioni).
You can access the full event here: https://events.dm.unipi.it/e/198
Abstract
--------
We consider the combination of two approaches for the trace estimation of a symmetric matrix function f(A) when the only feasible operations are matrix-vector products and quadratic forms with f(A): stochastic estimators, such as the Hutchinson estimator and its refined variants Hutch++ and the recent XTrace, and probing methods based on graph colorings. Particularly effective is the case where we replace the indicator vectors for the coloring used in probing by random vectors whose non-zero entries have Rademacher distribution. A theoretical analysis exposes conditions under which using just one Rademacher probing vector per color is provably better than the classical probing approach. Numerical experiments show that existing methods are also outperformed under suitable conditions on the sparsity pattern of A and on the spectrum of f(A). This talk is based on a joint work with Andreas Frommer and Marcel Schweitzer.
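The stochastic half of the combination above is the classical Hutchinson estimator, sketched below under the same black-box access model (only matrix-vector products with the matrix, here standing in for f(A)). The coloring/probing half and the Hutch++ and XTrace refinements are not shown; matrix and sample sizes are illustrative.

```python
import numpy as np

def hutchinson_trace(matvec, n, num_samples=1000, rng=None):
    """Hutchinson estimator: tr(M) ≈ mean of z' M z over Rademacher z.

    Unbiased because E[z z'] = I for z with i.i.d. +-1 entries, so
    E[z' M z] = tr(M). Only matrix-vector products with M are needed,
    matching the black-box access model for f(A) in the abstract.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    est = 0.0
    for _ in range(num_samples):
        z = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
        est += z @ matvec(z)
    return est / num_samples

# Small symmetric test matrix standing in for f(A):
n = 50
rng = np.random.default_rng(1)
S = rng.standard_normal((n, n))
M = (S + S.T) / 2
estimate = hutchinson_trace(lambda v: M @ v, n)
print(estimate, np.trace(M))
```

Note that for a diagonal matrix the estimator is exact for every probe (z' D z = tr(D) since z_i^2 = 1), which is the same mechanism probing methods exploit after a graph coloring has made the relevant submatrices effectively diagonal.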
Dear all,
I am forwarding this announcement in case anyone is interested.
All the best,
Paola
-------- Forwarded Message --------
Subject: [gdrmoa] POSTDOC at AROMATH, Inria of Université Côte d'Azur.
Date: Tue, 23 May 2023 10:16:57 +0200
From: Bernard Mourrain <Bernard.Mourrain(a)inria.fr>
Reply-To: Bernard Mourrain <Bernard.Mourrain(a)inria.fr>
To: gdrmoa(a)listes.math.cnrs.fr
Dear Colleagues,
A POSTDOC position is available in the Aromath team at Inria of Université Côte d'Azur.
The duration of the contract is _1+1 years_ (one year, renewable for one additional year).
The starting date is flexible, ideally in fall 2023.
Here is the link to the official announcement and to apply to the position:
<https://es.sonicurlprotection-fra.com/click?PV=2&MSGID=20230523083043027614…>
Please feel free to contact me for any questions you may have.
Please also forward this message to any potential interested candidates
you may know.
Best regards,
Bernard Mourrain
Dear all,
I am forwarding the following seminar announcement, within the framework of the expressions of interest of the Excellence Project.
Regards, Beatrice
-------- Forwarded Message --------
Subject: [Personale.docente.dm] Upcoming Department of Excellence
seminars
Date: Mon, 22 May 2023 07:54:42 +0000
From: Maria Stella Gelli <maria.stella.gelli(a)unipi.it>
To: personale.docente(a)dm.unipi.it <personale.docente(a)dm.unipi.it>
Dear all,
please note that on Friday at 11:00 in Aula Magna the fourth of the
seminars connected to the expression of interest of the Excellence
Project will take place.
The speaker is Luca Heltai (Sissa)
https://www.dm.unipi.it/eventi/an-overview-on-non-matching-approximation-me…
The seminars will continue over the coming weeks; you can follow the
(continuously evolving) schedule at the following link:
https://www.dm.unipi.it/seminari-di-dipartimento/
Please note that on the Colloquia channel of the Dipartimento di
Matematica Team you can find the recordings of past seminars; moreover,
barring last-minute problems, the seminars are streamed live on the
same channel.
Best regards
Maria Stella Gelli & Francesco Sala on behalf of the Director